Learning Analytics for Tracking Student Progress in
LMS
Aimad QAZDAR (aimad.qazdar@uca.ac.ma)
Cadi Ayyad University
Sara QASSIMI
Cadi Ayyad University
Oussama HASSIDI
Cadi Ayyad University
Meriem HAFIDI
Cadi Ayyad University
El Hassan ABDELWAHED
Cadi Ayyad University
Youssef MELK
Grand Canyon University
Case Report
Keywords: Data driven, Enhanced Learning Technologies, ABC Learning Design, Learning Analytics,
Academic performance
Posted Date: April 4th, 2022
DOI: https://doi.org/10.21203/rs.3.rs-1505417/v1
License: This work is licensed under a Creative Commons Attribution 4.0 International License.
CASE STUDY
Learning Analytics for Tracking Student Progress
in LMS
Aimad QAZDAR1*†, Sara QASSIMI1, Oussama HASSIDI1, Meriem HAFIDI1, El Hassan ABDELWAHED1 and Youssef MELK3
*Correspondence:
aimad.qazdar@uca.ac.ma
1Computer Systems Engineering
Laboratory (ISI), Computer
Science Department, Faculty of
Sciences Semlalia, Cadi Ayyad
University (UCA), Bd. Prince My
Abdellah, 40000, 2390 Marrakech,
Morocco
Full list of author information is
available at the end of the article
†Equal contributor
Abstract
The COVID-19 pandemic has accelerated the digital transformation of learning
and training. Many industries, especially higher education institutions, have
been using e-Learning platforms. The Learning Management System (LMS), in
particular, has gained popularity as a tool capable of managing learning and
training processes as well as supporting the learning process and its
administration. It presents an important package of services and features
capable of facilitating and enhancing the e-Learning experience. Nevertheless,
the LMS offers few, if any, functionalities when it comes to tracking the
instructional progress of students. Progress in a learning experience is
indicative of how students interact with courses and learning materials. Based
on such interaction, the teacher can detect low performers, project students'
progress, and anticipate those at risk as well as those who need intervention.
The use of the LMS generates an important amount of data. This data can become
extremely useful when analysed and transformed into indicators reflecting the
learner's progress. To this end, our research project used learning analytics
to determine and define indicators capable of monitoring students' progress.
With such analytics, we were able to identify struggling students, low
performers, and ultimately those who need reinforcement and tutoring. The main
goal was to greatly improve students' success rates and the overall
effectiveness of the learning process. We present a case study in which
students' progress was tracked and monitored using a set of student Key
Performance Indicators (KPIs). The data used in this study was collected from
the Web Technologies course on the e-campus platform at Cadi Ayyad University.
The defined indicators allowed for tracking students' progress during the
course, but they were not able to determine the performance of students.
Keywords: Data driven; Enhanced Learning Technologies; ABC Learning Design;
Learning Analytics; Academic performance
1 Introduction
The novel coronavirus pandemic has drastically changed the world system. In the
educational field, the COVID-19 pandemic has transformed the teaching/learning
process and student/teacher interactions from the traditional face-to-face
format to a new, virtual, online one. Due to the massive and unforeseen
closures of institutions, affected countries and communities have been forced
to seek quick solutions based on innovative technologies such as online
learning platforms (LMS[1] and MOOC[2]), and to switch from in-person classes
to fully online or hybrid classes with a smaller number of students [1]. An
LMS or a MOOC is an integrated system that supports the teaching/learning
process and its administration [2]. An LMS/MOOC presents a package of services
and features capable of enhancing e-Learning. It supports teachers in
developing e-learning courses and virtual classrooms where students can enroll
and study. In addition, it allows the inclusion of external learning materials
(resources/activities) and the reuse of already developed learning materials.
An LMS/MOOC can support designing, administering and grading assessments, as
well as planning, creating and publishing courses. Such systems also allow for
synchronous and/or asynchronous communication and interaction between all
users (teachers, students, tutors and administrators) through chat rooms,
discussion forums, blog posts and so on.
[1]Learning Management System
[2]Massive Open Online Course
As the use of new information technology spreads rapidly, almost all data
today are generated and exchanged digitally. New data stored in digital form
was already estimated at more than 92% of all newly produced data in 2002,
when its size exceeded five exabytes [3]. In the education and training field,
the use of the LMS/MOOC generates a huge amount of raw data. This data is
critical for any given educational entity, and particularly for higher
education institutions. This data, provided it has been critically analyzed
and synthesized into useful knowledge, can guide decision-making and advise
all stakeholders (teachers, tutors, students, administrators and
parents/guardians). Additionally, such data can improve the quality of both
teaching and learning, and improve students' overall performance and academic
success [4].
The data-driven approach in the education field is closely linked to
Educational Data Mining (EDM) and Learning Analytics (LA). These are by no
means two new fields of study. It is, in fact, the emergence of "disruptive"
technologies in education and the massive amount of data generated that turned
LA and EDM into two promising fields of research capable of improving
educational experiences and decision making based on aggregated data, for all
stakeholders (teachers, tutors, students, administrators and researchers) [5].
Based on this perspective, our study used learning analytics to analyse and
predict the individual performance of students enrolled in a Web Technologies
course on the UCA digital campus of the Faculty of Sciences Semlalia, Cadi
Ayyad University (Morocco). The main objective was to monitor the progress of
students enrolled in the course on the platform, to predict any challenges to
their learning and to prevent them from failing at the end. The idea was to
produce a system able to generate predictive indicators using a dataset (log
files) collected from the UCA digital campus; these indicators could then be
used to monitor students' progress and to determine those who are not
adequately progressing, are performing poorly, and are in need of
reinforcement and tutoring. The outline of this paper goes as follows:
Section 2 presents a literature review of learning analytics and related work.
Section 3 describes the study context. Section 4 describes the data and the
proposed indicators. Section 5 provides a discussion of the obtained results.
The conclusion and the planned future work are presented in the last section
(Section 6).
2 Background and related work
Learning analytics is a fast-growing field of Technology-Enhanced Learning
(TEL) research. Its origin dates back more than ten years. In fact, since
2008, the concept of analytics in education has been gaining the attention of
researchers, with a focus on understanding, enhancing, and optimizing the
learning and teaching process. From 2010, the concept of Learning Analytics
(LA) separated from the broader area of analytics and became an independent
field [6]. As a new research field, LA became an interdisciplinary area
between data mining, learning technology, pedagogy, machine learning, business
intelligence, big data, artificial intelligence, and statistics [6, 7]. During
this period, learning analytics was defined as: "Using intelligent data,
learner-centered data generation and analysis models to explore information
and social interactions, predict learning, and provide recommendations for
learning" [8]. In 2011, the First International Conference on Learning
Analytics and Knowledge (LAK) was established in Banff, Alberta (Canada) [9].
The organizing committee of the first LAK conference defined learning
analytics as "the measurement, collection, analysis and reporting of data
about learners and their contexts, for purposes of understanding and
optimising learning and the environments in which it occurs" [9]. Implementing
LA allows higher education institutions to better understand their students
and the barriers to their learning, resulting in academic success and the
retention of a larger and more diverse student population, which is important
for operational facilities, fundraising, and admissions [10].
Many research studies have been conducted to understand students' learning
behaviors and optimize the LMS learning process using LA. Most of the
developed LA models and indicators serve educators and educational
institutions to identify students' attitudes and to detect those who are low
performing or at risk. Student performance indicators depend deeply on the
learning activities and resources used in the LMS. The indicators can be
classified into Productive, Assimilative, Evaluation, Interactive and
Communicative [11]. According to a systematic review conducted by [12], the
most used indicators are Evaluation and Productive activities. Evaluation
activities have been used to determine the results of activities and
assessments (formative and summative). Productive indicators can capture
students' production actions, such as create, complete and do, among others.
The Interactive category also showed good representativeness. Assimilative and
Communicative are the least used indicators. The analysis demonstrated that a
combined indicator, or several indicators together, can best represent the
evaluation of student engagement and participation [12]. Learning analytics
at Dublin City University (Ireland) focused on three main axes: (1) improving
test performance using analytics from a general-purpose LMS like Moodle, (2)
identifying study groups and the peer effect on performance using on-campus
geo-location data, and (3) detecting lower-performing or at-risk students on
programming modules [13]. LA can also be used in a computational environment
to both analyze and visualize student discussion groups working collaboratively
to accomplish a task [14]. Using Course Signals, Purdue University allows
professors to provide real-time feedback to students. Different measures such
as grades, demographic data, interaction, and students' effort are combined,
adopting the traffic-light metaphor. In the same context, a personalized email
is delivered to students to inform them of their current status. The system
helped with retention, and performance outcomes were evaluated [15]. Learning
Analytics Dashboards (LADs) have supported prior findings that visualizing
learning behavior helps students reflect on their learning. The LA framework
LAViEW, discussed in [16], presents an overview of students' engagement
indicators. Teachers can thereby directly send personalized feedback to
selected cohorts of students, clustered by their engagement indicator scores.
3 Study context
The study was conducted at the department of computer sciences within the Faculty
of Sciences Semlalia Cadi Ayyad University in Marrakech, Morocco. The data set
consisted of 154 instances of students registered in the third trimester of Mathemat-
ical and Computer Sciences Bachelor program. The data used collected in autumn
2020 from the Web technologies course hosted on the e-campus [3] at the depart-
ment. The Web technologies course was designed using the ABC Learning Design
framework [17]. Since its launch in 2015, ABC [4] learning design method has been
widely adopted across universities in Europe and beyond. It was developed by a
team from the University College London (UCL). ABC Learning Design focuses,
not on the technology, but on the types of learning that will take place, and what
students have to do in order to understand concepts. ABC LD was built on the
six learning types concept proposed by Laurillard [18]. The six learning types are :
Acquisition, Inquiry, Collaboration, Discussion, Practice, Production [19]. A good
learning design contains a mixture of these types of learning. The figure 1 below
represent a part of the course with the different used learning types.
Figure 1: Course content
4 Data
4.1 Data Collection
The dataset used in this study consisted of teacher log files, student log
files and the student gradebook. The course log files allow the faculty member
to see which resources or activities have been accessed and when. They outline
the date and time a resource was accessed, the name of the student, the
completed action (i.e., view, add, update, or delete), the activity performed
in the different modules (e.g., the course, forum, or quiz), the Internet
Protocol (IP) address from which it was accessed, and additional information
about the action [20]. Figure 2 shows a part of the course log file.
Figure 2: Log file
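To make the processing of such a log export concrete, the sketch below shows
one plausible way to load it and split each student's clickstream into
sessions. It is a minimal sketch, not the exact pipeline used in the study:
the file name, the column names (taken from a standard Moodle log export) and
the 30-minute session gap are illustrative assumptions.

import pandas as pd

# Minimal sketch, assuming a standard Moodle log export; the file name,
# the column names and the 30-minute threshold are illustrative.
logs = pd.read_csv("course_logs.csv")
logs["Time"] = pd.to_datetime(logs["Time"], dayfirst=True)
logs = logs.sort_values(["User full name", "Time"])

# A gap of more than 30 minutes between two consecutive events of the
# same student starts a new session.
new_session = (logs.groupby("User full name")["Time"].diff()
               > pd.Timedelta(minutes=30))
logs["session_id"] = new_session.astype(int).groupby(
    logs["User full name"]).cumsum()

# Weekly consultation frequency per student (ISO week of each event).
logs["week"] = logs["Time"].dt.isocalendar().week
freq_cons = logs.groupby(["User full name", "week"]).size()

Session boundaries and weekly counts of this kind are the raw ingredients of
the indicators defined in Section 4.2.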
The data set was completed using the course gradebook. All of the grades for
each student in the course were recorded in the gradebook, or "grader report".
The gradebook collects the items that were graded in the different parts of
Moodle and allows teachers to view and edit them, as well as sort them into
categories and calculate totals in different ways. Grades are initially
displayed as the raw scores of the assessments themselves, so they depend on
how the teacher has set them up [21]. The score can be presented as a raw
value or as a percentage, as shown in Figure 3.
Figure 3: Gradebook file
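Because graded items can be stored on different scales, one plausible
preprocessing step is to normalize them to percentages before they feed into
the indicators. The sketch below assumes a grader-report export with one row
per student; the file name, item columns and maximum points are hypothetical
and depend on how the course was set up.

import pandas as pd

# Minimal sketch; "gradebook.csv" and the item columns are hypothetical.
gradebook = pd.read_csv("gradebook.csv")
# Maximum points per graded item, as configured in Moodle (assumed here).
max_points = {"Quiz 1": 20, "Workshop 1": 100, "Assignment 1": 40}
for item, max_pts in max_points.items():
    # Express each raw score as a percentage of the item's maximum.
    gradebook[item + " (%)"] = 100 * gradebook[item] / max_pts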
4.2 Data Analysis
After data collection, an important question was what data and indicators to dis-
play to teachers and/or students on LA reports about student performance. In this
context, Key Performance Indicators (KPIs), also known as Key Success Indicators
(KSIs), are quantifiable measures that reflect, as faithfully as possible, the various
critical success factors at different levels [22]. Key Performance Indicators (KPIs)
serves educators and educational institutions to identify the students’ attitude and
to detect students who are low performance or at-risk. student’s key performance
indicators depends deeply on the learning design activities using in the LMS course
and on the data available in learning environments [23]. According to [24] ”Learn-
ing analytics are only likely to effectively and reliably enhance earning outcomes
QAZDAR et al. Page 6 of 13
if they are designed to measure and track signals that are genuine indicators of or
proxies for learning”. To this end, we defined five different pedagogical KPIs sus-
ceptible to give a global vision on the students activities through the LMS. The
indicators in question are adequate for the activities provided using the ABC learn-
ing design. The pedagogical indicators are : Connectivity,Acquisition,Productivity,
Interactivity, and Reactivity.
1. Connectivity Indicator
The connectivity indicator aims to measure the state and the degree to which
students are connected to the course. It is calculated based on the course
consultation frequency (freqCons), the average session duration
(sessionDuration) and the number of visited units (nbrOfUnits).

Connectivity = 0.1 × freqCons + 0.3 × sessionDuration + 0.6 × nbrOfUnits   (1)

2. Acquisition Indicator
The acquisition indicator specifies the student's progress in acquiring
concepts. It is based on the number of achieved units (nbrOfAchvUnits) and the
average score of quiz attempts (quizScore). We consider that any visited
course unit is achieved.

Acquisition = 0.7 × quizScore + 0.3 × nbrOfAchvUnits   (2)

3. Productivity Indicator
The productivity indicator evaluates the degree of the learner's productivity
in the course. It is obtained based on the workshop grade (workShGrad), the
workshop peer evaluation grade (wrkShPeerEvlG), and the assignment grade
(AssigGrad).

Productivity = 0.4 × workShGrad + 0.2 × wrkShPeerEvlG + 0.4 × AssigGrad   (3)

4. Interactivity Indicator
The interactivity indicator evaluates the learner's degree of interaction and
communication on the platform. It is determined using the number of the
student's posts (nbrOfPost) and the number of forum consultations
(nbrOfForumCons).

Interactivity = 0.8 × nbrOfPost + 0.2 × nbrOfForumCons   (4)

5. Reactivity Indicator
The reactivity indicator evaluates the degree of the learner's responsiveness
to the course. It is calculated using the time of resource publication
(timeOfResPub) and the time of the learner's first consultation
(timeOfFirstCons).

Reactivity = timeOfFirstCons − timeOfResPub   (5)
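As a concrete illustration, the sketch below transcribes equations (1)-(5)
into code. It is a minimal sketch under the assumption that the per-student
aggregates (frequencies, durations, grades, timestamps) have already been
extracted from the logs and gradebook; the function and argument names are
our own and simply mirror the variables above.

from datetime import datetime

def connectivity(freq_cons, session_duration, nbr_of_units):
    # Equation (1): consultation frequency, average session duration
    # and number of visited units, with the weights defined above.
    return 0.1 * freq_cons + 0.3 * session_duration + 0.6 * nbr_of_units

def acquisition(quiz_score, nbr_of_achv_units):
    # Equation (2): average quiz score and achieved (i.e., visited) units.
    return 0.7 * quiz_score + 0.3 * nbr_of_achv_units

def productivity(worksh_grad, wrksh_peer_evl_g, assig_grad):
    # Equation (3): workshop, peer-evaluation and assignment grades.
    return 0.4 * worksh_grad + 0.2 * wrksh_peer_evl_g + 0.4 * assig_grad

def interactivity(nbr_of_post, nbr_of_forum_cons):
    # Equation (4): posting weighs more than merely reading the forum.
    return 0.8 * nbr_of_post + 0.2 * nbr_of_forum_cons

def reactivity(time_of_first_cons, time_of_res_pub):
    # Equation (5): delay between publication and first consultation;
    # a smaller delay means a more reactive student.
    return time_of_first_cons - time_of_res_pub

# Example: a resource published on 2 November and first opened on 4 November.
delay = reactivity(datetime(2020, 11, 4, 9, 0), datetime(2020, 11, 2, 9, 0))

Computed week by week for each student, these values produce the time series
analysed in the next section.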
5 Overall results
5.1 Connectivity
From the connectivity graph shown in Figure 4, we notice that students'
connectivity improves as they progress through the course. We also notice that
the session duration and the number of units visited are correlated. Students
were most connected to the course between week 45 (2020) and week 1 (2021),
except for weeks 51 and 2, where the indicator decreased. The session duration
ranged between 30 minutes and 1h30. The number of units varied between 80 and
128. The consultation frequency varied between 3 and 50 consultations per
week. Since the publication of new chapters had ended, weeks 3 to 12 saw a
decrease in this indicator, except for weeks 5 and 10, when students started
to prepare for the first exam session. During the semester, connectivity
showed six peaks, in weeks 46, 47, 50, 51, 53, and 2, which matches the number
of course chapters. Week 50 was the week in which students consulted the
maximum number of units: 128 units in 1 hour 30 minutes.
Figure 4: Connectivity indicator
5.2 Acquisition
According to the graph in Figure 5, students' acquisition can be subdivided
into two periods. The first one, between week 45 (2020) and week 2 (2021),
represents the growth of acquisition. In this period, students read the course
and did the quizzes. The peak of acquisition was in week 50. The second
period, between weeks 3 and 12 (2021), represents the decrease in acquisition.
In this period, students did quizzes without revising or consulting the course
units. All students had three attempts for each quiz, and almost all of them
kept one attempt for the end of the semester, to prepare for the exam. From
the same graph, we note that the quiz score is correlated with the number of
achieved units.
Figure 5: Acquisition indicator
5.3 Productivity
Productivity reflects students' ability to produce work in the course. In the
analysis phase, we mentioned that productivity is obtained using the workshop
grade, the workshop peer evaluation and the assignment grade. In this case
study, we used only the workshop peer evaluation and the workshop grade.
According to the graph in Figure 6, we observe that students were not
productive until week 47. The period between week 47 and week 3 represents the
growth of productivity, which varied between 30 and 51 percent. The peak of
productivity was in week 52. The workshop peer evaluation rate was higher than
the workshop grade, which can be explained by the fact that the group was
composed of "good" evaluators. Week 6 shows a small change in student
productivity: this week was a second session/chance given to students who had
missed a workshop in the first weeks.
5.4 Interactivity
As mentioned before, the interactivity indicator gives the teacher an idea of
the degree of students' interaction and communication in the course,
especially through the forum as a means of communication. Based on the graph
in Figure 7, the students were not very interactive on the platform. They
consulted the forum more than they participated in the discussions. At the end
of the semester, week 5 saw a rise in student interactivity: students
contributed more than five posts to the forum and consulted it more than 30
times. This situation could be explained by students' need to get more
information about the exam, or to ask for clarification of certain concepts in
the course. However, to confirm or reject this hypothesis, we would need a
semantic analysis of the students' posts.
Figure 6: Productivity indicator
Figure 7: Interactivity indicator
5.5 Reactivity
Reactivity is an important indicator that informs the teacher about the degree
of students' reactivity to the publication of course resources and activities
on the platform. We notice that, in almost all cases, the students were not
very reactive during the first weeks, but after week 48 (2020) they started to
be more reactive. Week 53 shows a decrease in reactivity, which may be due to
the end-of-year vacation (see Figure 8).
Figure 8: Reactivity indicator
5.6 Analysing indicators for some students
Figure 9 below gives an overview of the five student indicators (Connectivity,
Acquisition, Productivity, Interactivity, and Reactivity) for six students
from the sample (Std 3, Std 4, Std 26, Std 32, Std 143, and Std 145). The
students were selected randomly according to their grades in the final exam,
as shown in Table 1 below; a code sketch of this per-student comparison
follows the table. The exam was taken in person in week 11 (2021). To pass the
exam, a student must obtain at least 10/20 points.
Table 1: Students' grades in the exam

Std       Grade   Result
Std 3     14.25   Validated
Std 4     06.00   Not validated
Std 26    16.50   Validated
Std 32    09.00   Not validated
Std 143   07.25   Not validated
Std 145   11.00   Validated
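As one plausible way to reproduce this comparison, the sketch below joins
cumulative indicator totals with the exam outcomes of Table 1. It assumes a
tidy per-student, per-week indicator table has already been computed; the file
and column names are illustrative, not those of the actual study pipeline.

import pandas as pd

# Minimal sketch; "weekly_indicators.csv" and its columns are assumed.
indicators = pd.read_csv("weekly_indicators.csv")  # student, week, KPIs
grades = pd.DataFrame({
    "student": ["Std 3", "Std 4", "Std 26", "Std 32", "Std 143", "Std 145"],
    "grade":   [14.25, 6.00, 16.50, 9.00, 7.25, 11.00],
})
grades["passed"] = grades["grade"] >= 10  # pass mark is 10/20

# Cumulative indicator totals per student, joined with exam outcomes.
kpi_cols = ["connectivity", "acquisition", "productivity", "interactivity"]
totals = indicators.groupby("student")[kpi_cols].sum()
summary = totals.join(grades.set_index("student"))
print(summary.sort_values("grade", ascending=False))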
According to the graph of student indicators (Figure 9), Std 4, Std 26 and
Std 143 did not exceed 110 in connectivity, unlike the other three students,
who obtained 237, 190 and 160, respectively, for the same indicator. The
acquisition for students who passed the exam was between 50 and 80, while for
those who did not pass it was more than 120. All six students obtained a
productivity indicator between 40 and 95. However, we notice that the students
who passed the exam did not miss any workshop, whereas those who failed missed
at least one. We also notice, in the same Figure 9, no variation in the
indicators of Std 4 from week 8 onward. This can be explained by this learner
dropping out from week 8 (2021), unlike the other students, who kept logging
into the platform up to the last three weeks.
Figure 9: Evolution of indicators
6 Conclusions and perspectives
This study was conducted to investigate the use of indicators to monitor
learners' progression and performance on the learning platform. To this end,
Connectivity, Acquisition, Productivity, Interactivity, and Reactivity were
defined as indicators. The indicators relate to course activities designed on
the basis of the ABC learning design framework. The dataset used consists of
154 instances of students registered in the third semester of the Mathematical
and Computer Sciences Bachelor program. It was generated from the student log
files, the gradebook file and the teacher log files. After generating the
dataset, the next step consisted of processing and analysing the data to
determine the different indicators for each week, for all students or for a
specific one.
At this stage of the work, the indicators showed students' progress in the
course, but they are not able to determine their performance. For that, we
have defined three objectives for future work. The first is to use these
indicators to predict student performance and identify students at risk. The
second is to define optimal values for each indicator. The third is to use
these optimal values to develop a recommendation system for at-risk students,
to prevent failure in the exam.
Declarations
Competing interests
The authors declare that they have no competing interests.
Ethics approval and consent to participate
Not applicable
Consent for publication
Not applicable
Availability of data and material
The data and the material are available from the corresponding author on reasonable request.
Funding
The authors declare that this research was not funded and did not receive any
specific grant from any funding agency in the public, commercial, or
not-for-profit sectors.
Authors’ contributions
All authors contributed to the design and implementation of the case study, to
the analysis of the results and to the writing of the manuscript.
Acknowledgements
Not applicable
Authors’ information (optional)
Not applicable
Author details
1Computer Systems Engineering Laboratory (ISI), Computer Science Department,
Faculty of Sciences Semlalia, Cadi Ayyad University (UCA), Bd. Prince My
Abdellah, 40000, 2390 Marrakech, Morocco. 2Department of Computer Science,
Faculty of Sciences and Techniques Gueliz (FSTG), Cadi Ayyad University (UCA),
Marrakech, Morocco. 3Master of Education, Curriculum and Instruction
Technology, Grand Canyon University (GCU), Arizona, United States.
References
1. Teräs, M., Suoranta, J., Teräs, H., Curcher, M.: Post-Covid-19 Education and Education Technology
‘Solutionism’: a Seller’s Market. Postdigital Science and Education, 1–16 (2020).
doi:10.1007/s42438-020-00164-x
2. Qazdar, A., Cherkaoui, C., Er-Raha, B., Mammass, D.: AeLF: Mixing Adaptive Learning System with Learning
Management System. International Journal of Computer Applications 119(15), 1–8 (2015).
doi:10.5120/21140-4171
3. Tsai, C.W., Lai, C.F., Chao, H.C., Vasilakos, A.V.: Big data analytics: a survey. Journal of Big Data 2(1), 1–32
(2015). doi:10.1186/S40537-015-0030-3/TABLES/3
4. Qazdar, A., Er-Raha, B., Cherkaoui, C., Mammass, D.: A machine learning algorithm framework for predicting
students performance: A case study of baccalaureate students in Morocco. Education and Information
Technologies 24(6) (2019). doi:10.1007/s10639-019-09946-8
5. Rotolo, D., Hicks, D., Martin, B.R.: What is an emerging technology? Research Policy 44(10), 1827–1843
(2015). doi:10.1016/j.respol.2015.06.006. arXiv:1503.00673
6. Banihashem, S.K., Aliabadi, K., Pourroostaei Ardakani, S., Delaver, A., Nili Ahmadabadi, M.: Learning
analytics: A systematic literature review. Interdisciplinary Journal of Virtual Learning in Medical Sciences 9(2)
(2018)
7. Chui, K.T., Fung, D.C.L., Lytras, M.D., Lam, T.M.: Predicting at-risk university students in a virtual learning
environment via a machine learning algorithm. Computers in Human Behavior 107, 105584 (2020).
doi:10.1016/j.chb.2018.06.032
8. Siemens, G.: What are learning analytics? elearnspace (2010).
http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/ Accessed July 2012
9. Long, P.: LAK'11: Proceedings of the 1st International Conference on Learning Analytics and Knowledge,
February 27-March 1, 2011, Banff, Alberta, Canada. ACM (2011)
10. Hasan, R., Palaniappan, S., Mahmood, S., Abbas, A., Sarker, K.U., Sattar, M.U.: Predicting Student
Performance in Higher Educational Institutions Using Video Learning Analytics and Data Mining Techniques.
Applied Sciences 10(11), 3894 (2020). doi:10.3390/APP10113894
11. Rienties, B., Toetenel, L.: The impact of 151 learning designs on student satisfaction and performance: Social
learning (analytics) matters. ACM International Conference Proceeding Series 25-29-April-2016, 339–343
(2016). doi:10.1145/2883851.2883875
12. Costa, L.A., Salvador, L.N., Amorim, R.R.: Evaluation of Academic Performance Based on Learning Analytics
and Ontology: A Systematic Mapping Study. Proceedings - Frontiers in Education Conference, FIE 2018-Octob
(2019). doi:10.1109/FIE.2018.8658936
13. Azcona, D., Corrigan, O., Scanlon, P., Smeaton, A.F.: Innovative Learning Analytics Research at a data-driven
HEI. In: Proceedings of the 3rd International Conference on Higher Education Advances. Universitat Politècnica
de València, Valencia (2017). doi:10.4995/HEAD17.2017.5245.
http://ocs.editorial.upv.es/index.php/HEAD/HEAD17/paper/view/5245
14. Riquelme, F., Munoz, R., Mac Lean, R., Villarroel, R., Barcelos, T.S., de Albuquerque, V.H.C.: Using
multimodal learning analytics to study collaboration on discussion groups. Universal Access in the Information
Society 18(3), 633–643 (2019)
15. Arnold, K.E., Pistilli, M.D.: Course signals at purdue: Using learning analytics to increase student success. In:
Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, pp. 267–270 (2012)
16. Majumdar, R., Akçapınar, A., Akçapınar, G., Ogata, H., Flanagan, B.: LAViEW: Learning analytics dashboard
towards evidence-based education. In: Companion Proceedings of the 9th International Conference on Learning
Analytics and Knowledge (2019). Society for Learning Analytics Research (SoLAR)
17. Young, C.P., Perović, N.: ABC LD – a new toolkit for rapid learning design. In: European Distance Education
Network (EDEN) Conference 2020 (2020)
18. Laurillard, D.: Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology.
Routledge (2013)
19. ABC LD: The 6 Learning types – ABC Learning Design (2020). https://abc-ld.org/6-learning-types/ Accessed
2021-12-07
20. Moodle: Logs - MoodleDocs (2020). https://docs.moodle.org/311/en/Logs Accessed 2021-11-09
21. Moodle: Grader report - MoodleDocs (2021). https://docs.moodle.org/311/en/Grader_report Accessed 2021-11-09
22. Jia, H., Wang, M., Ran, W., Yang, S.J.H., Liao, J., Chiu, D.K.W.: Design of a performance-oriented workplace
e-learning system using ontology. Expert Systems with Applications 38(4), 3372–3382 (2011).
doi:10.1016/j.eswa.2010.08.122
23. Bussu, A., Detotto, C., Serra, L.: Indicators to prevent university drop-out and delayed graduation: an italian
case. Journal of Applied Research in Higher Education (2019)
24. Wilson, A., Watson, C., Thompson, T.L., Drew, V., Doyle, S.: Learning analytics: Challenges and limitations.
Teaching in Higher Education 22(8), 991–1007 (2017)