LearningViz: a dashboard for visualizing, analyzing and closing learning performance gaps—a case study approach
Bo Pei1* , Ying Cheng2, Alex Ambrose3, Eva Dziadula4, Wanli Xing5 and Jie Lu6
Abstract
The availability of large-scale learning data presents unprecedented opportunities for investigating student learning processes. However, it is challenging for instructors to fully make sense of these data and effectively support their teaching practices. This study introduces LearningViz, an interactive learning analytics dashboard that helps instructors identify, analyze, and close performance gaps among students in their classes. In this dashboard, we incorporated three modules to enhance human and computer interactions in support of teaching practices: the Student Overall Performance Analysis Module, which provides a comprehensive understanding of students' learning in the course; the Student Group Performance Analysis Module, which examines performance gaps across different groups and identifies factors contributing to these gaps; and the Final Exam Item Analysis Module, which evaluates the quality of exam questions and identifies strategies for closing performance gaps. The overall design of the platform follows a user-centered approach, integrating data analysis with various visualization strategies in a unified platform. A case study is then conducted to highlight the effectiveness of LearningViz in supporting instructors in analyzing students' learning patterns and the associated factors impacting learning performance. We further conduct a usability test with several domain experts to evaluate the usefulness and effectiveness of this platform in supporting teaching practices. Our findings underscore the platform's ability to support instructors in detecting performance gaps among students, investigating influential factors, evaluating assessment quality, and implementing targeted instructional strategies for closing performance gaps.
Keywords: Data visualization, Learning dashboard, Visual learning analytics,
Performance gaps analysis
*Correspondence: bpei@usf.edu
1 Department of Educational and Psychological Studies, College of Education, University of South Florida, 4110 USF Apple Dr, Tampa, FL 33620, USA
2 Department of Psychology, University of Notre Dame, Notre Dame, IN 46556, USA
3 Notre Dame Learning, University of Notre Dame, Notre Dame, IN 46556, USA
4 Department of Economics, University of Notre Dame, Notre Dame, IN 46556, USA
5 School of Teaching & Learning, College of Education, University of Florida, 1221 SW 5th Ave, Gainesville, FL 32601, USA
6 Mary Frances Early College of Education, University of Georgia, 217 River's Crossing, Athens, GA 30603, USA
Introduction
Interactive learning analytics dashboards have been increasingly incorporated into educational settings to support instructors in tracking and analyzing students' performance, enabling them to make informed decisions to enhance teaching outcomes (Gutiérrez-Braojos et al., 2023; Lee-Cultura et al., 2024). These platforms combine data analysis approaches with information visualization techniques in an organic way, allowing instructors to understand students' learning performance more effectively and accurately.
rately. For example, some platforms focus on presenting students’ learning behaviors
such as posting activities in online forums (Wong & Zhang, 2018), video watching activ-
ities (Mohammadhassan & Mitrovic, 2022; Shi etal., 2015), to help instructors optimize
their instructional materials. ere are also other platforms focusing on students’ perfor-
mance on specific tasks including course assignments (Deng etal., 2019), exams (Xiaoya
et al., 2009), and practical problem-solving (Xia et al., 2020) in teaching practices to
reveal and highlight the most salient performance patterns. Furthermore, various visual
learning analytics have also been employed to interpret data analytics results about stu-
dents’ performance contextualized in the specific teaching and learning environment to
help instructors have a more comprehensive understanding of students’ learning status
(Lopez etal., 2017; Mendez etal., 2021; Paiva etal., 2018).
Recently, researchers have increasingly suggested that examining the performance
gaps across student groups and identifying the potential causes offer another approach
to improve students’ overall learning performance in classes (Al-Tameemi etal., 2023;
Johar etal., 2023; Meyer etal., 2024). By focusing on investigating where and how the
performance gaps occurred in the course among students, these approaches allow
instructors to provide more direct and targeted interventions for students to close the
performance gaps. Furthermore, these approaches can be particularly beneficial in large courses, where instructors often face challenges in identifying the factors causing low performance and implementing effective intervention strategies (Hew & Cheung,
2014). However, current research and practices on visual learning analytics for closing
students’ learning gaps are limited. As such, a unified framework targeting the analysis
of performance gaps is needed to provide instructors with comprehensive guidance on
detecting performance groups, examining contributing factors of performance gaps, and
identifying targeted interventions to close the performance gaps.
In this study, we implement such an interactive visual learning analytics platform –
LearningViz – to help instructors identify, analyze and reduce students’ learning gaps in
their classes. We further conduct an associated case study to demonstrate the practical
use of LearningViz in supporting data-driven decision-making processes within a course
that employed the mastery-based instructional strategy (Spertus & Kurmas, 2021; Wang
& Qi, 2018). This instructional setting is chosen because the mastery-based instructional strategy has been a widely used pedagogical strategy in education, allowing instructors to revise their teaching strategies in real time based on student performance on a series of low-stakes assessments (Burns et al., 2023). However, despite the advantages, instructors often face challenges in effectively implementing this strategy in real teaching and learning settings. The first challenge is that the performance scores may not fully reflect students' actual learning status. This is particularly true for the low-stakes assignments, where multiple attempts can lead to inflated scores, making it difficult for instructors to accurately gauge students' true mastery of the topic. The second challenge
is that these scores often obscure specific topics where students struggle, making it diffi-
cult for instructors to pinpoint areas that need additional instructional support. As such,
conducting such a case study in this setting can showcase how LearningViz addresses
these challenges and further demonstrate its effectiveness in supporting instructors in
making informed decisions and enhancing overall teaching outcomes.
Particularly, this platform encompasses 3 different modules: (1) Student Overall
Performance Analysis Module for analyzing student overall performance patterns in
the course. Within this module, instructors can examine how students’ weekly per-
formance leads to their corresponding final performances as well as the differences
in performances on each assessment. (2) Student Group Performance Analysis Mod-
ule for identifying the achievement gaps between performance groups on assignments.
In this module, the differences in the average performance of each group on each
assessment were examined to identify the assessments and topics that contribute the
most to performance gaps across these groups. (3) Final Exam Item Analysis Mod-
ule for identifying and closing the achievement gaps in final exams. Focusing on the
questions in the final exam, this module not only evaluates the abilities of each ques-
tion in distinguishing performance groups but also identifies the difficult topics of
each group for generating targeted interventions to close the performance gaps.
With the successful implementation of these modules, LearningViz provides comprehensive guidance for analyzing student learning, from identifying perfor-
mance gaps to providing contextualized and targeted insights for closing the gaps
within mastery-based instructional settings. This design and the whole analytical
framework can be easily adopted in other courses, and particularly, we summarize
our contributions as follows:
Contribution 1:
This study provides a detailed description regarding the development of a plat-
form through a case study that integrates technical development into the real-world
teaching and learning settings.
Contribution 2:
This study provides a human-centered, interactive visual learning analytics platform for the educational technology field, which supports instructors in identifying, analyzing, and closing performance gaps along various dimensions:
(1) Identification of performance gaps on each weekly assignment, as well as midterm
and final exam assessment questions.
(2) Evaluation on the quality of each assessment item, with targeted insights extracted
to support the reduction of performance gaps.
To ensure the LearningViz platform can be easily adopted by instructors with lim-
ited time and computational expertise, the design process involved collaborations
with experts and professors from various fields, including Teaching Excellence,
Computer Science, Psychology, Educational Technology, and the course instructors
themselves. This interdisciplinary approach allows us to align the practical needs in
the actual teaching and learning settings with the various constraints in the algorith-
mic spaces, translating the analytical results into meaningful, context-specific indi-
cators within the learning settings. Additionally, the whole design process follows
the user-centered design principle, guided by the nine-stage visual design methodol-
ogy outlined by Sedlmair et al. (2012), ensuring ease of use and responsiveness to the
various needs of instructors.
Literature review
Mastery-based learning, first conceptualized by Benjamin S. Bloom in the 1960s, advo-
cates for students to demonstrate a comprehensive understanding or mastery of specific
skills or knowledge before progressing to subsequent topics or levels (Block & Burns,
1976). This pedagogical approach has undergone significant refinement over the dec-
ades, integrating emerging insights from cognitive science and educational psychology.
It has also adapted to technological advancements, facilitating increasingly personal-
ized and adaptive learning experiences (Weinstein et al., 2018). Mastery-based learning
provides a more individualized educational experience, allowing students to progress
at their own pace and spend additional time on topics they find particularly challeng-
ing. This method also helps to mitigate the pressures of competition among students
by emphasizing individual achievement over peer comparison (Bloom, 1984; Guskey,
2010). However, despite its benefits, the implementation of mastery-based learning pre-
sents significant challenges from an instructional perspective. Educators face difficulties
aligning sufficient resources with the demands of this approach, as it requires extensive
time commitments, comprehensive teacher training, and substantial educational tech-
nology resources (Caena & Redecker, 2019). Additionally, creating and managing precise
assessments that accurately measure mastery is complex and labor-intensive. The necessity for frequent and varied assessments can place a substantial burden on educators (Szulewski et al., 2023).
In the realm of education, visual analytics represents a significant interdisciplinary
approach that merges data analysis, information visualization, and human–computer
interaction (Ramanujan et al., 2017). This approach is particularly valuable for helping
educators, administrators, and students decipher complex data related to educational
processes and outcomes. One of the most prominent applications of visual analytics in
education is the development of Learning Analytics Dashboards (LADs), which are cus-
tomizable interfaces that reflect real-time updates on learning processes (Masiello et al.,
2024). For example, ECoach, initially developed for STEM courses, has proven effective
in enhancing student outcomes, notably in improving GPAs across various disciplines
(Matz et al., 2021).
The integration of visual learning analytics technologies in educational settings reduces educators' cognitive load when interpreting students' complex learning patterns (Paas et al., 2004; Sweller, 2020). Cognitive load theory posits that visual presenta-
tion can help educators understand and digest information more easily and effectively,
allowing them to spend more time on decision-making processes rather than on extract-
ing information from the data. Visual representations, such as interactive dashboards,
provide educators with the unique opportunity to directly interact with students’ learn-
ing data and examine the associations between learning patterns and learning performance. This process helps educators understand how their teaching strategies
influence students’ learning activities and are further associated with learning perfor-
mance (Sedig & Parsons, 2013).
These benefits have also been extensively demonstrated in various studies, which show
that visual learning analytics not only facilitate decision-making in the teaching process
but also enhance students' learning outcomes. For example, Chen et al. (2019) imple-
mented an interactive platform aiming to support students’ self-regulated learning in
online environments by analyzing learning logs. This system includes a knowledge monitoring dashboard and a strategy-use dashboard, both designed to facilitate student metacognition. Additionally, He et al. (2019) introduced the video utilization calendar
(VUC), a visualization tool that promotes engagement by allowing students and instruc-
tors to track viewing progress and daily viewing history across multiple online courses.
Furthermore, Bañeres et al. (2020) developed a dashboard with an early warning sys-
tem capable of predicting and intervening in cases of students at risk, demonstrating
high accuracy in identifying students who may require additional support in a first-year
undergraduate computer science course.
Considering these technological advancements, the potential synergy between mas-
tery-based learning and visual analytics deserves more in-depth examination. By integrating visual analytics approaches into mastery-based instructional settings, instructors can track the changes in students' learning performance on each topic each week, analyze the performance gaps across student groups, and investigate the factors causing these gaps. On the one hand, this addresses a drawback of applying data visualization approaches in complex situations, namely presenting superficial performance data that provides limited insight for supporting students. On the other hand, such a platform situates students' learning performance in the assessment context while evaluating the quality of each question, which allows instructors to detect potential flaws in questions while investigating students' performance. Implement-
ing such a platform facilitates the process of identifying students’ performance gaps and
developing targeted interventions for instructors, which further contribute to strategies
for supporting personalized teaching and learning practices.
Problem description
In this section, we first introduce the specific context of the study and the associated
data, providing a foundation for analysis. After that, we outline several tasks that our
platform will accomplish to better support the teaching and learning practices. Finally,
we list the design requirements that guide the development of our platform, ensuring
the functionalities in the platform align with the identified tasks.
Research context and data description
This study analyzes student performance in a gateway course for economics offered at
a top 20 private university in the U.S. In the course, mastery-based learning strategies
were employed, providing students with multiple chances to revise and submit their
homework and one chance on quizzes and exams. There were 200 students in total registered for the course. In each week, multiple low-stakes assignments and quizzes were employed to provide students with enough practice on the learning topic. The data col-
lected for this study includes students’ scores on each assignment, quiz, and exam. In
addition, we also include the text-based description data about the topics relevant to
each assignment and item data for each question in the final exam. All the learning per-
formance data were obtained by collaborating with the course instructors and the Uni-
versity’s Office of Information Technology’s Teaching & Learning Technologies team. All
information that can be used to identify an individual student was anonymized to ensure
privacy and confidentiality. This study was approved under the Umbrella IRB Protocol at the university involved.
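To make the data description above concrete, the sketch below shows one possible, simplified layout for the anonymized tables that a LearningViz-style analysis would consume; the column names, identifiers, and values are illustrative assumptions rather than the platform's actual schema.

```python
import pandas as pd

# Hypothetical, simplified schema for the anonymized course data: one score table
# (one row per student per assessment), one topic-description table, and one
# item-level table for the final exam. All names and values are illustrative.
scores = pd.DataFrame({
    "student_id": ["S001", "S001", "S002", "S002"],          # anonymized IDs
    "week": [3, 3, 4, 4],
    "assessment": ["W3_HW1", "W3_Q1", "W4_HW4", "W4_Q7"],
    "type": ["homework", "quiz", "homework", "quiz"],
    "score": [95.0, 80.0, 88.0, 60.0],                        # percentage scores
})

topics = pd.DataFrame({
    "assessment": ["W3_HW1", "W3_Q1"],
    "topic_description": ["supply and demand basics", "market equilibrium"],
})

final_items = pd.DataFrame({
    "student_id": ["S001", "S002"],
    "question": ["Q1", "Q1"],
    "correct": [1, 0],                                        # 1 = correct answer
})
```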
Task description
The tasks in this study were defined through a combination of challenges identified
during instructors’ teaching practices in real environments and insights from research
on learning analytics and instructional design. In particular, we work closely with the
course instructor through several rounds of discussions, pinpointing the specific challenge that most instructors face: "How can we identify students' actual learning needs
and improve the overall performance in a situation where most of them have a high per-
formance on weekly homework?”. Based on this question, we extracted the following tasks
that the platform should accomplish:
T.1. What is students’ overall learningstatus of the course? At the end of each semester,
instructors typically have only a general understanding of the overall performance of the
course. ey often lack a detailed understanding of students’ performance throughout
the course, including how the design of course structure related to students’ learning
performance, how the learning topics in each week related to student learning perfor-
mance and so on. An analysis on these aspects can provide actionable and targeted
insights for instructors to revise the corresponding topics or assignments to better sup-
port students’ learning (See Fig.3). is task is also supported by literature on learning
platforms, which emphasizes the importance of providing instructors with actionable
insights into overall course performance (Gutiérrez-Braojos etal., 2023; Masiello etal.,
2024).
T.2. What is the learning progression of students in the course? Unlike in face-to-face classes, instructors in online settings have few interactions with students, which limits their understanding of students' learning status. This limitation significantly hinders instructors' ability to initiate effective strategies to track and analyze students' learning status in a timely manner. An approach that supports the analysis of the distribution of students' performance and visualizes students' learning performance pathways would allow instructors to gain deeper insight into how students progress over time and adjust teaching strategies in a timely manner (see Fig. 6). Moreover, this analysis can also help instructors identify different performance groups and analyze the performance gaps. This aligns with research on mastery learning, which emphasizes continuous
assessment and feedback to guide student learning (Bloom, 1984; Guskey, 2009).
T.3. How can students be characterized into meaningful performance groups? Grouping
students into meaningful groups for analyzing performance gaps is essential for instruc-
tors to implement more appropriate instructional strategies to enhance the teach-
ing outcomes (Al-Tameemi et al., 2023; Meyer et al., 2024). This mechanism can help instructors analyze how students from different performance groups learn differently, how learning patterns relate to learning performance, and so on. For instance, instructors might be interested in how the learning behaviors students exhibited in specific weeks relate to their learning performance, so as to adjust their instructional strategies for enhancing the learning performance (see Figs. 9 and 10).
T.4. How to eliminate performance gaps and what are the implications for future
practices? With the identified performance gaps across groups, instructors may want
to implement effective strategies for providing targeted interventions to close these
gaps. In particular, these interventions include two aspects: (1) identifying the topics that each group has difficulty with and offering the corresponding interventions, and (2) assessing the quality of assessment questions and revising the ill-designed ones to ensure they correctly reflect students' learning performance. This analysis aligns with prior research, such as Weinstein et al. (2018) on formative assessment and feedback loops, which highlights the importance of timely interventions to improve student learning outcomes (see Figs. 12 and 13).
Design requirements
To ensure that LearningViz is practically effective in the real teaching and learn-
ing environments while closely aligning with the requirements outlined in the task
descriptions, we developed a set of design requirements. These requirements focus
on providing instructors with tools that offer accurate, intuitive insights about stu-
dents’ real learning status in the class.
R1. Simplified and intuitive visual representations. Understanding and interpreting data visualizations, especially in large-scale educational data environments, often imposes a significant cognitive load on instructors without a computational background (Gutiérrez-Braojos et al., 2023). Instructors need to quickly grasp essential insights from these data to make timely decisions that directly impact student learning outcomes. As such, the first design requirement is to offer intuitive and straightforward visualizations that reduce the cognitive load of interpreting complex learning data. This has also been highlighted in prior literature investigating how overly complex displays can impede effective decision-making (Paas et al., 2004). As such, all the visualization strategies employed within the paper have been determined after
extensive discussions with the course instructor.
R2. Multidimensional visualizations of learning performance. Student learning per-
formance can be influenced by multiple factors (Zesch et al., 2023), including their
own learning patterns, topic difficulty levels, assessment qualities and so on. Present-
ing all this information to instructors requires multiple-layered data visualization
strategies. To this end, LearningViz employs multidimensional visualizations to pro-
vide a comprehensive view of student learning status. These visualizations include:
Performance pathway visualization: visualizing the changes of student learning
performance on each assessment in each week across the semester.
Performance distribution on each assessment: displaying the distributions of stu-
dent learning performance on each assessment to obtain insights regarding the
difficulty level of topics in the corresponding week.
Item-level visualization: visualizing student learning performance on each ques-
tion to identify the questions and topics leading to large performance gaps.
These multiple visualization views allow instructors to examine student learning
performance from various perspectives and help them discover effective instructional
strategies to better support students’ learning.
R3. Finer granularity visualizations. Student learning is a dynamic, ongoing process, so it is essential for instructors to have access to granular data that describes student performance in the course (Gutiérrez-Braojos et al., 2023). Apart from visualizing students'
learning performance from multiple views, LearningViz is also designed to assess the
performance gaps across predefined performance groups at multiple levels: exams,
weekly assignments and questions in the final exam. This design aims to help instruc-
tors detect the subtle changes in student performance and facilitate targeted and timely
interventions.
R4. Visualizing assessment qualities. The quality of assessment questions is signifi-
cantly associated with students’ learning outcomes. Poorly designed questions can have
a negative impact on students’ learning performance, leading to inaccurate estimation
about student learning abilities (Yang et al., 2022). Therefore, it is necessary to include
the functionality of assessing the quality of assessments in LearningViz based on stu-
dent performance. Particularly, LearningViz illustrates the capability of each question
in distinguishing between high- and low-performance students. With this feature,
instructors can pinpoint specific questions to investigate the reasons leading to the high
or poor performance of students on that question.
LearningViz implementation
Aligning with the above discussions about task analysis and design requirements, this
section will introduce the implementation of LearningViz ensuring that each feature is
directly linked to the instructional needs. Figure 1 describes the overall architecture of
LearningViz uncovering how the functions within each module are connected to the
corresponding tasks. As shown in the figure, LearningViz encompasses three modules:
(1) The overall performance analysis module mainly analyzes and presents the over-
all student performance in the course, addressing the tasks of understanding the
Fig. 1 Overall architecture of LearningViz
overall learning outcomes in the course and student general learning performance
within the course.
(2) The group performance analysis module mainly supports the identification of different performance groups and the examination of performance gaps across groups.
(3) The final exam item analysis module analyzes student performance and performance
gaps on each question in the final exam.
Overall performance analysis module
In the Overall Performance Analysis Module, instructors are allowed to examine stu-
dents’ learning from the overall course level. e purpose of this module is to help
instructors have a general understanding of the overall learning outcomes in the course
and provide them with intuitive insights about the potential associations between course
structure and student performance. As described in Fig. 1, the visualizations in this part
mainly address T.1 and T.2. Particularly, in this module, LearningViz presents student
learning performance from the following three aspects: (1) In the Letter Grade Distribu-
tion, the bar plot is adopted to offer instructors a general understanding of whether the
students’ learning outcomes in the course aligned with their initial expectations; (2) In
the Course Structure Analysis, the tree plot is used to present the structure of the course.
The second layer indicates students' average performance within the corresponding
week, while the third layer further disaggregates students’ average performance on each
individual assessment. is visualization strategy allows instructors to clearly identify
potential factors affecting student learning performance, such as whether it was influ-
enced by the course load or the specific topic covered during the corresponding week.
Moreover, we further visualized the associations between students’ average performance
on each assignment and their final performance, to highlight the topics and weeks that
significantly influence students’ final learning performance. (3) In the Performance
Pathway Analysis, we use the parallel categories diagram to present the changes of each
student’s learning performance across the consecutive weeks in the course on different
assessments. These weekly performances can also be clustered based on the assessment
categories. For example, we can display how the changes of performance on homework,
quizzes and exams led to the final performance, respectively. Finally, we also employed
boxplots to explicitly visualize the student performance distribution on each homework,
quiz, and exam in each week to further highlight the difficult topics for students.
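As a rough illustration of how such a pathway view can be produced, the sketch below uses Plotly's parallel-categories chart on a small, made-up wide table of letter-graded weekly results; the column names, grade labels, and the choice of Plotly are assumptions, not a description of LearningViz's actual implementation.

```python
import pandas as pd
import plotly.express as px

# Minimal sketch of a performance-pathway view: each line traces one student's
# graded results from weekly homework through to the final exam. Data are made up.
pathway = pd.DataFrame({
    "W3_HW1": ["A", "B", "A", "C"],
    "W4_HW4": ["A", "B", "B", "F"],
    "W8_HW9": ["A", "C", "B", "F"],
    "Final":  ["A", "B", "A", "D"],
})

fig = px.parallel_categories(
    pathway,
    dimensions=["W3_HW1", "W4_HW4", "W8_HW9", "Final"],
)
fig.show()
```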
Group performance analysis module
In the Group Performance Analysis Module, instructors can explore approaches to group
students into meaningful groups and investigate the differences in the average group
performance on each assignment. As indicated in Fig. 1, the visualizations in this mod-
ule mainly address T.2 and T.3. Particularly, in this module, LearningViz provides infor-
mation about students’ learning performance from two dimensions: (1) Performance
Group Analysis provides fine-grained student performance information for instructors to determine learning performance groups: the Thriving Group, nonThriving Group, and DFW Group. We employed the density estimation approach – a method commonly used in data analysis to estimate the probability distribution of a dataset (Dehnad, 1987). This
approach is particularly useful for identifying students’ performance patterns in a course,
by estimating the performance trends across various assessments. (2) Assignment Per-
formance Gap Analysis presents the gaps of average performance of students from dif-
ferent groups. In this section, we employed the heatmap plot to visualize the average
performance of each group on each assessment. Employing this visualization strategy
allows instructors to clearly distinguish differences in each group’s mastery levels on
each topic with the color saturation. On the other hand, the similarities of the color
saturation of a specific assessment across different groups can also indicate the capa-
bility of the assessment in distinguishing students from different performance groups.
For example, similar color saturation indicates highly similar group performance, which
also means that the assessment is either too easy or too difficult for students. As such,
instructors might need to revise the corresponding assessment in the further offerings.
In addition, we applied the line chart to explicitly present the performance gaps of stu-
dents from the Thriving and nonThriving groups on multiple assessment categories (e.g.,
homework, quizzes and exam), to help instructors understand students’ performance
from multiple views. is visualization strategy highlights the performance gaps across
groups on each assessment, which can support instructors to generate specific interven-
tions to close those gaps. To a certain extent, the consistency of the performance gaps
across assessments also indicates the consistency of the assessment quality. For instance,
the similarity of performance gaps on several consecutive exams partially suggests that
the difficulty level of the exams is consistent.
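A minimal sketch of the density-estimation step is given below, assuming SciPy's Gaussian kernel density estimator and synthetic final scores; the paper does not specify the exact estimator or library used in LearningViz, so this is one plausible realization.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Estimate the probability density of final scores so that instructors can see
# where the distribution naturally separates into performance groups.
rng = np.random.default_rng(0)
final_scores = rng.normal(85, 8, size=200).clip(0, 100)   # synthetic placeholder data

kde = gaussian_kde(final_scores)
grid = np.linspace(0, 100, 500)
density = kde(grid)   # the (score, density) curve is what the dashboard would plot
```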
Final exam item analysis module
In the Final Exam Item Analysis Module, instructors are able to assess students’ perfor-
mance on each question in the final exam, analyze group performance gaps and identify
difficult topics for each group to provide targeted interventions. As shown in Fig. 1, the
visualizations in this module aim to address T.3 and T.4. Particularly, we present stu-
dents’ performance in the final exam as well as the insights for eliminating performance
gaps with the following aspects: (1) Question Quality Analysis presents the quality of
each question. To assess the quality of each question, we employed two commonly used
strategies: difficulty level analysis (Kaur & Kaur, 2015) and point-biserial correlation
(Bonett, 2020). Difficulty level analysis involves calculating the proportion of students
who answered a question correctly, with a lower value indicating a higher difficulty level
of the question. The calculation formula of difficulty (p) is as follows:

p = (the number of students who answered the question correctly) / (total number of students)    (1)

The point-biserial correlation between item and total score, on the other hand, measures the capability of a question in distinguishing high versus low performers. For ease of understanding, we visualize the correlations between correct and incor-
rect responses for each question among students with the overall final performance.
With this visualization, instructors can have a clear understanding of whether answer-
ing the specific question correctly or incorrectly has an association with their over-
all performance in the final exam. (2) Question Performance Gap Analysis presents
the group performance gaps on each question. Specifically, we employed line charts
to present the performance gaps, providing finer-grained information for instructors
about topics that cause those performance disparities. (3) Difficult Keyword Analysis
presents the keywords from the top 10 most difficult questions for each performance
group to provide targeted insights for instructors to close the performance gaps. With
this visualization, instructors can examine the topics that involve difficult keywords
and design targeted instructional strategies to enhance students’ performance on the
relevant topics.
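The two item-quality indices described above can be computed directly from a binary response matrix, as in the sketch below; the data are placeholders, and the use of SciPy's pointbiserialr is an assumption about one reasonable way to obtain the item-total correlation.

```python
import numpy as np
from scipy.stats import pointbiserialr

# Placeholder response matrix: 200 students x 50 questions, 1 = answered correctly.
responses = np.random.default_rng(1).integers(0, 2, size=(200, 50))
final_total = responses.sum(axis=1)            # each student's final-exam total

# Difficulty (Eq. 1): proportion of students answering each question correctly;
# lower values indicate more difficult questions.
difficulty = responses.mean(axis=0)

# Point-biserial correlation between each item and the total score, indicating
# how well the item separates high from low performers.
discrimination = np.array([
    pointbiserialr(responses[:, j], final_total)[0]
    for j in range(responses.shape[1])
])
```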
Case studies
Based on the above discussions, we implemented LearningViz incorporating all the
relevant modules to support the course instructors’ teaching practices. e overall
system interface is shown in Fig.2. As displayed in the figure, the interface mainly
includes 3 panels, corresponding to the modules listed in Fig.1. Particularly, Fig.2A
addresses T1 and T2, Fig.2B addresses T2 and T3, and Fig.2C addresses T3 and T4.
After LearningViz was fully implemented, we invited the course instructor, along
with six other instructors with backgrounds in Educational Technology, to use the
platform. These additional instructors were selected based on their expertise in technology-integrated instructional and research practices. All of them have demonstrated the ability to identify optimal practices, with at least three years' teaching
experience and published research articles in the relevant fields.
We reviewed each of the visualizations in LearningViz with them, discussing their
functionality, ease of understanding, and insights for applications in the teaching
practices. Through this process, we obtained detailed feedback about the use of Learn-
ingViz in the real teaching practices, identify the areas for improvement, and initiate
effective strategies for revisions.
Fig. 2 System interface of LearningViz. A provides an overview of the overall performance analysis module. B
presents the approaches for identifying and analyzing different performance groups. C offers approaches for
final exam item analysis and targeted interventions for closing performance gaps
Overall course visual analysis
Overall course analysis aims to provide an overview of students' overall performance
in the course, including letter grade distribution, course structure, associations among
weekly performance, student performance pathway on assessments (e.g., homework,
quiz, exam), and student performance distribution on those assessments. Figure 3 presents the distribution of letter grades in the course. From this figure, instructors can clearly see that most students in this course got a score above B-. Notably, a significant portion of these students received an A. Furthermore, fewer than 10 students' performance fell within the range between C+ and F.
To support instructors in having a fine-grained understanding of the factors that
influence student learning performance, we further visualized the course structure as
shown in Fig.4. In this figure, students’ learning performance in the course is pre-
sented from three different levels: the first level shows the course title; the second
level indicates student average performance within each week; and the third level
provides a breakdown of average performance on each assessment, with “Qi” indicat-
ing quiz i (i = 1, 2, … 30), “HWj” indicating homework j (j = 1, 2, …, 30), and “QSk”
for sectional exam k (k = 1, 2, 3, 4). Overall, this figure offers instructors two differ-
ent aspects of insights: (1) at the second level, the node sizes indicate how strongly
Fig. 3 Student overall letter grade distribution in the course. Most students got an A, and fewer than 10
students fell below C+
Fig. 4 Students’ average performance in each week and on each assessment. Note: some nodes are folded
for the sake of readability. For example, in some weeks, such as Week1, Week2, Week6, and Week13, there was no homework
the weekly workload impacts student learning performance, enabling instructors to
adjust the course schedule as needed, and (2) at the third level, the variations in node
sizes can help instructors better understand how students' learning performance differs across different types of assessments (i.e., quizzes, homework, and exams).
With an understanding of the distribution of students’ letter grades over the course
as well as the average performance on each assessment in each week, Fig. 5 further
analyzes how students’ performance on the assignments in each week correlated with
their overall performance in the final exam. As shown in the figure, we explicitly iden-
tified the number of students who had a performance over the average performance
on both the specific assessment and final exam. For example, the data on the selected
edge indicates 103 students scored above average on Q_29 in Week 15 and obtained a
score higher than average in their final exam. At the same time, the differences in the color proportions also indicate how learning in each week overall is related to that
in the final exam. With this analysis, instructors can gain a fine-grained understand-
ing of associations of students’ performance on different topics across weeks.
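The counting behind this association view reduces to a simple above-average comparison per assessment and for the final exam, as in the sketch below; the column names and scores are illustrative placeholders, not the course's actual data.

```python
import pandas as pd

# Count students who scored above the class average on both a given assessment
# and the final exam; such counts label the edges of the association view.
scores = pd.DataFrame({
    "W15_Q29": [92, 70, 88, 60],
    "Final":   [90, 72, 85, 55],
})

above_avg = scores > scores.mean()   # per-column comparison against the class mean
n_both = int((above_avg["W15_Q29"] & above_avg["Final"]).sum())
print(f"{n_both} students scored above average on both W15_Q29 and the final exam")
```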
Instructors might also want to examine students’ general performance patterns over
the course, especially how performance on each assignment influences subsequent
assignments and the final exam. Figure 6 visualizes students' performance pathways throughout the course, focusing on their homework performance. In this visualization, each line represents the performance pathway of a specific student. This visu-
alization mechanism enables instructors to examine each student’s or all students’
performance patterns across each homework in the class, helping them identify the
specific topics and areas that students struggle with and need more instruction on.
For instance, the red line indicates a student who scored 0 on homework W8_HW9, W9_HW11, W12_HW13, W14_HW14, W14_HW15, and W15_HW16, and achieved a score of 40 on the final exam. This indicates that this student failed to submit those assignments and had low engagement over the second half of the class. Moreo-
ver, similar visualization strategies have also been employed to show the students’
Fig. 5 The performance associations between each assessment and final exam
performance on quizzes and exams, offering instructors a comprehensive under-
standing of students’ learning.
Furthermore, we also visualized the distribution of students’ learning performance
on each assessment type. Figure 7 shows the distribution of students' performance on each quiz over the course. With this figure, instructors can clearly identify the quizzes that most students struggled with, as well as the ones students did well on, which supports instructors in implementing effective strategies for better supporting students. For example, from this figure, instructors can tell that students had a poor performance on some quizzes, such as W4_Q7, W5_Q12, and W5_Q13, while they had a good performance on some other quizzes, such as W3_Q1, W3_Q2, and W3_Q3. Additionally, we also visual-
ized students’ performance distribution on homework and exams.
Group performance analysis
The group performance analysis module aims to evaluate student performance across
different groups. In this module, the density estimation is conducted to estimate the
distribution of students’ final scores, supporting instructors in grouping students into
Thriving Group, nonThriving Group, and DFW Group. Then, the heatmap is used to pre-
sent the average performance of each group on each assessment to further examine the
differences in mastery levels of the topics in each week. Finally, the performance gap
analysis visualization is implemented to reveal performance gaps on each homework,
quiz, and midterm exam. In this section, the performance gap visualization will particularly focus on students from the Thriving and nonThriving groups for demonstration.

In Fig. 8, we aim to support instructors in defining meaningful student performance
groups according to students’ learning performance as well as their domain knowl-
edge about the course. With this figure, the instructors in this course defined students
Fig. 6 Students’ performance pathway on each homework to the final exam. Each line in this figure
represents a student’s performance pathway in the course
Fig. 7 Students’ performance distribution on each quiz
in the Thriving group as those with a score greater than 84, students in the nonThriving group as those with a score between 70 and 84, and students in the DFW group as those with a score less than 70. Further, this visualization can also support instructors in grouping students.
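Once the cut-offs are chosen from the density plot, assigning group labels becomes a simple rule, as sketched below with placeholder scores; the thresholds mirror the instructor's choices in this case study (Thriving > 84, nonThriving 70-84, DFW < 70).

```python
import numpy as np
import pandas as pd

# Placeholder final scores; in practice these come from the anonymized score table.
final = pd.Series([95, 82, 66, 88, 73], name="final_score")

# Apply the instructor-chosen cut-offs to label each student's performance group.
group = pd.Series(
    np.select([final > 84, final >= 70], ["Thriving", "nonThriving"], default="DFW"),
    index=final.index,
    name="group",
)
```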
After categorizing students into different performance groups, Fig. 9 displays the average performance of students within each performance group each week. The colors in the figure indicate students' knowledge mastery level on the topic within the week. It is evident from the figure that most students had a high performance on assessments such as W3_HW1, W3_HW2, and W4_HW4, while most students had a low performance on assessments such as W4_Q7, W4_Q8, W5_Q11, and W5_Q12.
Beyond showcasing student learning outcomes on individual assessments, this heat-
map visualization can also help instructors evaluate the capability of each assessment
in differentiating students’ performance groups. It further offers insights into refining
teaching strategies and adjusting the difficulty of assessments. For instance, assess-
ments that are too easy or too difficult for students are less effective at distinguish-
ing among student performance levels, which is demonstrated by the average group
performance on the above questions. With this visualization combined with contex-
tual knowledge, instructors can adopt appropriate strategies, such as reducing the
lecture time and increasing question difficulty levels on the easy topics or increasing
lecture time and decreasing question difficulty levels on challenging ones.
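A heatmap like the one in Fig. 9 can be reproduced from a long-format score table with a group label, as in the sketch below; the library choice (Plotly), colour scale, and data are assumptions made for illustration only.

```python
import pandas as pd
import plotly.express as px

# Average each group's score per assessment and render it as a heatmap; rows with
# near-identical colour across groups flag assessments that do not separate groups.
long_scores = pd.DataFrame({
    "group": ["Thriving", "Thriving", "nonThriving", "nonThriving", "DFW", "DFW"],
    "assessment": ["W3_HW1", "W4_Q7", "W3_HW1", "W4_Q7", "W3_HW1", "W4_Q7"],
    "score": [98, 80, 92, 65, 85, 48],
})

avg = long_scores.pivot_table(index="group", columns="assessment",
                              values="score", aggfunc="mean")

fig = px.imshow(avg, color_continuous_scale="Blues", aspect="auto",
                labels={"color": "average score"})
fig.show()
```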
Instructors also found it useful to examine the performance differences across
groups on each assessment. In Fig. 10, we particularly examined the performance
differences on each homework, quiz, and midterm exam between the Thriving and nonThriving groups, respectively. Figure 10a reveals that the performance of students in
these two groups is similar across different homework, aligning with the applied mas-
tery-based instructional strategy with multiple submission attempts for homework.
Fig. 8 Density estimation of students’ final scores
Fig. 9 Heatmap of group average performance on each assignment
However, in Fig. 10b, there are various levels of differences in quiz performance
across these two groups, with relatively larger performance gaps on quizzes such as
W5_Q12 and W5_Q13. Moreover, this visualization also highlights abnormal per-
formance on certain assignments for instructors. For example, it can be found that
there is a drop in average performance on the quiz W9_Q21 for students from both
the Thriving and nonThriving groups, reaching the same performance level. As such,
instructors can particularly investigate the topics covered in these quizzes and their
corresponding design qualities to enhance students’ performance and close the per-
formance gaps. Figure 10c compares the average performance differences on exams of students from the Thriving and nonThriving groups. From this figure, apart from the consistently lower performance of students from the nonThriving group, the performance gaps across these two groups were consistent across the three exams. This consistency indicates that the effectiveness of these exams is comparable, further suggest-
ing a high level of design quality for each exam overall.
Final exam item analysis
With the understanding of the differences in the learning performance of students from
different groups, this part presents a fine-grained analysis of student groups’ perfor-
mance on each question in the final exam. Particularly, we examined the difficulty level
Fig. 10 Differences in average group performance between Thriving and nonThriving Groups on homework,
quizzes, and midterm exam
of each question to help instructors identify the questions that were either too easy or
too difficult for students. e distributions of the final performance scores of students
who answered each question correctly and incorrectly were presented to highlight the
influence on the overall score regarding answering the question correctly or not. To offer
instructors targeted insights regarding closing the group performance gaps, we analyzed
the performance gaps on each question as well as the keywords involving the top 10
most difficult questions for each group of students.
Figure 11a allows instructors to have a fine-grained understanding of students' per-
formance on each question. For example, there are 31 out of 50 questions on which stu-
dents had a performance over 0.8 and 4 questions on which they scored lower than 0.4.
From this figure, instructors can identify the specific questions that need to be revised.
In this case, it can be found that nearly all students answered questions Q23 and Q32
correctly, while most of them answered Q35 incorrectly. As such, instructors should closely investigate the design of these questions to make sure they can better evaluate students' learning. Figure 11b further examines the relationship between the performance on a specific question and the overall performance on the final exam. With this figure, instructors can identify the questions that have little impact on the overall performance in the final exam, whether students answered them correctly or not. Those questions include Q8,
Q27, and Q35, on which there is a large overlap in the distributions of final exam perfor-
mance between the students who answered questions correctly and those who answered
incorrectly.
After examining all students’ performance on each question, the following section
explores the performance gaps of the performance groups on each question as well as
the process of generating targeted insights to close these gaps. Figure 12 illustrates the
performance gaps on each question across the three performance groups. From this fig-
ure, instructors can identify which questions contribute to significant performance gaps
Fig. 11 Students’ performance on each question
among different groups. For instance, Q30 led to a larger performance gap between students from the Thriving Group and the nonThriving Group. Moreover, instructors can also
analyze how the performance gaps vary with the order of the questions, which could
inform the design of future exams.
Figure 13 extracts the keywords involved in the top 10 most difficult questions for each group to help instructors better design the instructional materials for closing the performance gaps. It shows that there were distinct differences in the topics that posed difficulties for these groups. For example, students in the Thriving Group had more difficulty with topics related to "total revenue", "elasticity", "demand", etc., and these topics were also among the most difficult ones for students in the nonThriving Group, with an additional difficult topic related to "cost". However, for students in the DFW Group, Fig. 13c indicates that they were more likely to be struggling with topics related to "firm", "cost", "equilibrium point", etc., in addition to the challenging topics faced by both the Thriving and nonThriving Groups. These analyses provide instructors with specific insights
into the topics that require additional instruction for students from different perfor-
mance groups.
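The paper does not specify how the keywords in Fig. 13 were extracted, so the sketch below shows one plausible approach: a simple unigram/bigram frequency count over the texts of each group's ten most difficult questions, with the question texts here being placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Placeholder texts of a group's ten most difficult final-exam questions.
hardest_questions = [
    "Compute total revenue when demand is elastic and price falls by ten percent.",
    "Identify the equilibrium point for a firm facing increasing marginal cost.",
]

# Count unigrams and bigrams (minus English stop words) across the hard questions.
vectorizer = CountVectorizer(stop_words="english", ngram_range=(1, 2))
counts = vectorizer.fit_transform(hardest_questions)

term_freq = dict(zip(vectorizer.get_feature_names_out(), counts.sum(axis=0).A1))
top_keywords = sorted(term_freq.items(), key=lambda kv: kv[1], reverse=True)[:10]
```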
Fig. 12 The performance gaps on each question across the three performance groups
Fig. 13 The differences in keywords involved the top 10 most difficult questions for each performance
group. a The keywords involved in the top 10 most difficult questions for Thriving Group; b The keywords
involved in the top 10 most difficult questions for nonThriving Group; c The keywords involved in the top 10
most difficult questions for DFW Group
Usability testing
The generalizability and usability of LearningViz were further evaluated through a struc-
tured usability test. During the case study, we particularly employed the cognitive walk-
through method (Tinmaz & Singh Dhillon, 2024), a user-centered evaluation technique
that focuses on understanding how users interact with each module of the platform.
This method involves guiding participants through all the modules within the plat-
form, observing their interactions, and identifying the challenges and concerns that they
encountered. The relevant feedback was obtained for the purpose of refining and improv-
ing LearningViz, ensuring it meets various teaching and learning needs. Data collection
involved taking field notes to capture specific feedback from participants and admin-
istering a self-developed questionnaire to assess the usability, functionality and overall
user experience of the platform. During the case study, we discussed the functions in LearningViz with participants in depth, encouraging them to provide detailed written feedback on each module. Participants were encouraged to reflect on how each feature could better support their teaching practices, identify areas that needed improvement, and suggest potential enhancements to make the platform more effective in their instructional contexts. The questionnaire included 11 selection questions with 5-point Likert scale items focused on system usability and usefulness, and 7 open-ended questions. The
aim was to gauge participants’ experiences on using the platform and to gather sugges-
tions for future improvement. Descriptive statistics of the Likert items are presented in
Table 1. Participants rated overall usability with a mean score of 4.12 (SD = 0.65), suggesting favorable perceived usability. The average score on items assessing the system's usefulness was 4.33 (SD = 0.27), indicating above-average perceived utility.

Table 1 Descriptive statistics of Likert items in the self-developed questionnaire

System usability (mean rating in parentheses)
If possible, could you please give a score indicating the overall usefulness of the platform? (4.5)
Were you able to easily find the feature or function you were looking for? (4.17)
Would you feel confident using this platform? (4.33)
Were you able to understand each of the visualizations in this part? Could you provide a score to indicate your understanding level? (4.33)
Were there any moments you felt lost or unsure about the next function (the higher the score, the more confusion)? (2.67)
How would you rate the performance pathway visualization? (4.5)
How likely are you to recommend this platform to a colleague or friend? (4.33)

System usefulness (mean rating in parentheses)
If possible, could you please give a score indicating the overall usefulness of the Performance Overview Interface? (4.33)
To what extent do you think the interfaces in this part help you examine students' group performance gaps? (4.67)
To what extent do you think visualizing the performance gaps and struggling topics are useful for you, if you were an instructor? (4.33)
How do you think the Biserial Correlation Plot helps you understand the associations of student performance on a specific question and their overall performance on the final exam (could you provide a score indicating its usefulness)? (4)
Qualitative analysis revealed positive feedback on features and suggestions for future
improvements. Since the usability test was conducted with a specific, focused group of
instructors and researchers, in the analysis stage, we extensively discussed and coded
each piece of feedback provided by our participants. The goal was to derive the most actionable
and targeted insights that would inform further revisions, ensuring the revised Learn-
ingViz can be directly used by instructors in the following semester. Overall, participants
liked the user-friendly interface ("I love its clean UI", "This platform has a clear structure
and a friendly user interface”) and found its visualizations beneficial for understanding
students’ past performance (“I think these figures are very helpful in visually demonstrat-
ing the difference in the performance of students”, “It provides valuable insights into which
areas students found challenging, enabling me to make necessary improvements and
adjustments to the course for the next semester”). Furthermore, participants identified
the group performance heatmap visualization as particularly valuable within this inter-
active dashboard for informing targeted course adjustments (“I will probably adjust the
difficulty levels of some too easy assignments or too difficult quizzes”, “ If I notice significant
areas where a majority of students are struggling or excelling, I will use this information
to make adjustments”). However, some participants noted down “support” and “guid-
ance” to help them make sense of the visualizations, as they contain massive information
and could get complex (“e figure is a little bit complex”, “ e figure of the performance
association panel is very attractive, but it seems to contain a lot of information, requir-
ing more time to understand the relationships between these variables”). Specifically, sug-
gestions included adding an introduction to the system and providing explanations and
scaffolding to assist users.
Discussions and conclusions
This study highlights the value of visual learning analytics for enhancing teaching
and learning practices in large-scale courses. The data analysis and visualization methods
were exemplified with a case study analyzing students' actual learning and performance
data collected from Canvas, with implications that can be generalized to other
learning management systems (LMS). The visualizations in our LearningViz platform
enable instructors to investigate students' performance, identify contributing factors,
and evaluate the quality of assessments. After the platform was implemented, we further
conducted a case study and a usability test with six researchers who have teaching
experience in the educational technology field. The evaluation feedback from the
usability test confirmed the usefulness and effectiveness of the platform in identifying
performance gaps as well as improving the quality of the instructional process.
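As a concrete illustration of the kind of LMS data involved, the sketch below loads an exported gradebook into a data frame for the sort of performance analysis described above. It is only a hedged example: the file name and the column conventions (a "Student" column, assignment columns whose names contain "Week") are assumptions about a typical Canvas gradebook CSV export, not a description of LearningViz's actual data pipeline.

```python
# A hedged sketch, not the platform's pipeline: load an exported gradebook
# CSV and summarize assignment scores with pandas. Column names below are
# assumptions about a typical export, not guaranteed by any LMS.
import pandas as pd

gradebook = pd.read_csv("canvas_gradebook_export.csv")

# Drop non-student metadata rows (e.g., a "Points Possible" row), if present.
gradebook = gradebook[gradebook["Student"].notna()]
gradebook = gradebook[~gradebook["Student"].str.contains("Points Possible", na=False)]

# Keep only the assignment score columns (assumed here to contain "Week").
score_cols = [c for c in gradebook.columns if "Week" in c]
scores = gradebook[score_cols].apply(pd.to_numeric, errors="coerce")

# Per-assignment summary: count, mean, standard deviation, quartiles.
print(scores.describe().round(2))
```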
LearningViz provides instructors with fine-grained insights to better understand students'
learning in their classes. LearningViz analyzes and presents students' learning performance
in a detailed and explainable way, highlighting nuances and patterns that
might otherwise go unnoticed. As shown in Fig. 6, without such a visualization, the poor
performance of a few students during certain weeks, along with the corresponding
associations across those weeks, would be easily overlooked due to the high performance of
the majority. This issue is particularly serious in courses that adopt mastery-based learning
strategies, where tracking students' performance can be more challenging (Searfoss,
2019). These strategies allow students to use instructors' feedback to revise their assignments
and improve their scores, which makes it difficult to capture the nuanced changes
in students' learning performance with traditional analytical methods. By highlighting
the abnormal and easily overlooked learning patterns, LearningViz enables instructors
to make timely and targeted adjustments in their teaching practices. As discussed in
Czerkawski and Lyman (2016), these dynamics can further drive ongoing improvement
and adaptability in instructional techniques, fostering a more supportive and effective
learning environment.
LearningViz provides instructors with more flexible ways to investigate student performance
within their classes. By offering visualizations of various aspects of students'
learning performance, LearningViz makes it easier for instructors to investigate student
performance gaps by categorizing students into different learning groups. These tools
can further help instructors identify the specific topics that cause performance gaps, as
shown in Figs. 9 and 10. The employed color scheme highlights the gaps between
students' average scores, making it easier for instructors to quickly assess areas of
concern and performance disparities. Furthermore, LearningViz can also help instructors
examine the consistency of difficulty levels across exams. As shown in Fig. 10c,
the consistency of the gaps across the three exams indicates that these exams were uniformly
challenging for students from the two performance groups. With such information,
instructors can investigate the fairness of the instructional approaches for students
from different backgrounds (AlQuraan, 2024).
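A group-by-topic heatmap of the kind described above can be sketched in a few lines. The group labels, topic names, color scale, and average scores below are invented for illustration and only approximate the idea behind the LearningViz visualization.

```python
# A hypothetical sketch of a group-by-topic heatmap; labels, scores, and the
# color scale are invented and do not come from the course data in the paper.
import matplotlib.pyplot as plt
import numpy as np

topics = ["Topic A", "Topic B", "Topic C", "Topic D"]
groups = ["Higher-performing group", "Lower-performing group"]
avg_scores = np.array([
    [92, 88, 90, 85],   # higher-performing group averages (0-100)
    [81, 70, 74, 62],   # lower-performing group averages (0-100)
])

fig, ax = plt.subplots(figsize=(7, 2.5))
im = ax.imshow(avg_scores, cmap="RdYlGn", vmin=0, vmax=100, aspect="auto")

ax.set_xticks(range(len(topics)))
ax.set_xticklabels(topics)
ax.set_yticks(range(len(groups)))
ax.set_yticklabels(groups)

# Annotate each cell so the size of each gap is readable at a glance.
for i in range(len(groups)):
    for j in range(len(topics)):
        ax.text(j, i, avg_scores[i, j], ha="center", va="center")

fig.colorbar(im, ax=ax, label="Average score")
fig.tight_layout()
plt.show()
```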
LearningViz allows instructors to evaluate the quality of assessments and provides targeted
insights for improvement. By visualizing students' performance on each assessment,
including the questions in the final exam, LearningViz helps instructors easily identify
assessments and questions that are either too difficult or too easy and therefore fail to
serve their assessment purpose. For instance, Fig. 11a clearly indicates that Q8 and
Q35 in the final exam were highly difficult for all students, prompting instructors
to investigate those questions more closely. Highlighting such outlier questions, rather than
students' performance, can drive instructors to refine assessment strategies and adjust
question difficulty to ensure that assessments effectively measure students' learning performance.
Moreover, this visualization strategy can also help instructors analyze
the differences in performance of students from different demographic backgrounds
(Steinke & Fitch, 2017), enabling them to investigate the cultural relevance and inclusivity
of the instructional materials.
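The item-level quantities behind this kind of screening are the classical difficulty index (the proportion of students answering an item correctly) and the point-biserial correlation between an item score and the rest of the exam (Bonett, 2020). The sketch below computes both for simulated dichotomous responses and flags items that are very hard, very easy, or weakly discriminating; the simulated data and the cut-off values are illustrative and are not those used in LearningViz.

```python
# A sketch of classical item statistics: difficulty (proportion correct) and
# the corrected point-biserial correlation with the rest of the exam.
# Responses are simulated from a simple Rasch-style model for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_students, n_items = 200, 40

ability = rng.normal(size=(n_students, 1))
item_location = rng.normal(size=(1, n_items))
prob_correct = 1.0 / (1.0 + np.exp(-(ability - item_location)))
responses = (rng.random((n_students, n_items)) < prob_correct).astype(int)

totals = responses.sum(axis=1)

for item in range(n_items):
    item_scores = responses[:, item]
    difficulty = item_scores.mean()      # proportion correct; low = hard item
    rest_score = totals - item_scores    # exclude the item from the total
    r_pb = np.corrcoef(item_scores, rest_score)[0, 1]
    # Illustrative screening thresholds, not the platform's actual rules.
    if difficulty < 0.3 or difficulty > 0.9 or r_pb < 0.2:
        print(f"Q{item + 1:02d}: difficulty = {difficulty:.2f}, "
              f"point-biserial = {r_pb:.2f}")
```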
In conclusion, with increasing discussion about the effectiveness of educational
dashboards in supporting teaching and learning practices (Albó et al., 2019; Ferreira
et al., 2019; Giannakos, 2022), LearningViz is proposed to explicitly address
performance gaps in large classes, especially when certain instructional strategies,
such as mastery-based learning, are used. Through a case study, we demonstrated
the features and functionalities of the platform in great detail, showing how it can
help instructors generate informed and timely interventions for eliminating existing
performance gaps and improving overall class performance. Additionally, we
employed the cognitive walkthrough method as our usability evaluation approach,
discussing the features of the platform extensively with instructors and collecting
their feedback and experiences while they engaged with LearningViz. The results indicate
that LearningViz can effectively help instructors examine students' performance
patterns in classes, identify gaps across performance groups, examine the contributing
factors, and implement efficient strategies to close those gaps. The positive feedback
from the usability test further highlights the potential of generalizing LearningViz to
teaching and learning practices in other educational settings.
Limitations and future directions
The implementation of LearningViz contributes to promoting transparent and
accountable educational decision-making by offering a unique perspective on examining
and addressing performance disparities. However, this study has limitations.
LearningViz was designed with learning performance data from a particular fundamental
course, which may limit how broadly the insights generated from this study can be
applied by instructors in other fields. Additionally, the participants in the usability test
were mainly from the educational technology field, chosen for their ability to provide
specialized feedback on the effectiveness of LearningViz in light of various educational
pedagogies. This might be another limitation, as the feedback may not fully consider the
contextual nuances of different instructional environments. Given these limitations,
future work should consider the following aspects: (1) expanding the usability testing
participants to include a more diverse group of instructors from various subject areas,
ensuring that the feedback reflects a broader range of teaching needs in various contexts;
and (2) incorporating more courses from other areas with different instructional
strategies to enhance the platform's adaptability to multiple disciplines. In addition, the
implementation of LearningViz offers valuable insights into integrating data visualization
into educational assessment, providing a model for promoting transparency and
informed decision-making about students' learning performance in the educational
testing field. With the collaborative efforts of researchers across related disciplines,
LearningViz can be further refined to support more equitable and effective educational
practices.
Acknowledgements
N/A
Author contributions
Bo Pei and Ying Cheng conceptualized the study. Bo Pei and Alex Ambrose developed the methodology. Bo Pei was
responsible for formal analysis, visualizations, and software development. Alex Ambrose, Ying Cheng, and Wanli Xing
carried out the validation and investigation. Resources were provided by Eva Dziadula. The usability test was conducted
by Jie Lu. The original draft was written by Bo Pei and Jie Lu, supervised by Alex Ambrose. The manuscript was reviewed
and edited by Alex Ambrose, Ying Cheng, and Eva Dziadula.
Funding
N/A.
Availability of data and material
The datasets and materials used and/or analyzed during the current study are available from the corresponding author
upon request.
Declarations
Ethics approval and consent to participate
The student learning performance data used in this study was collected in collaboration with the course instructors and
the University’s Office of Information Technology’s Teaching & Learning Technologies team. Before handing it over for
analysis, all the information that could be used to identify a specific student has been encrypted. This study was further
approved by the Umbrella IRB Protocol in the university involved. For the instructors involved in the usability test, con-
sent for participation was obtained prior to the start of the study.
Competing interests
The authors declare that they have no competing interests.
Received: 14 July 2024 Accepted: 11 November 2024
References
Albó, L., Barria-Pineda, J., Brusilovsky, P., & Hernández-Leo, D. (2019). Concept-level design analytics for blended courses. In M. Scheffel, J. Broisin, V. Pammer-Schindler, A. Ioannou, & J. Schneider (Eds.), Transforming learning with meaningful technologies (pp. 541–554). Springer. https://doi.org/10.1007/978-3-030-29736-7_40
AlQuraan, M. (2024). Assessing item fairness in students' evaluation of teaching based on students' academic college using measurement invariance analysis. Journal of Applied Research in Higher Education. https://doi.org/10.1108/JARHE-07-2023-0279
Al-Tameemi, R. A. N., Johnson, C., Gitay, R., Abdel-Salam, A.-S. G., Hazaa, K. A., BenSaid, A., & Romanowski, M. H. (2023). Determinants of poor academic performance among undergraduate students—a systematic literature review. International Journal of Educational Research Open, 4, 100232. https://doi.org/10.1016/j.ijedro.2023.100232
Bañeres, D., Rodríguez, M. E., Guerrero-Roldán, A. E., & Karadeniz, A. (2020). An early warning system to detect at-risk students in online higher education. Applied Sciences, 10(13), 4427. https://doi.org/10.3390/app10134427
Block, J. H., & Burns, R. B. (1976). Mastery learning. Review of Research in Education, 4, 3–49. https://doi.org/10.2307/1167112
Bloom, B. S. (1984). The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational Researcher, 13(6), 4–16. https://doi.org/10.3102/0013189X013006004
Bonett, D. G. (2020). Point-biserial correlation: Interval estimation, hypothesis testing, meta-analysis, and sample size determination. British Journal of Mathematical and Statistical Psychology, 73(S1), 113–144. https://doi.org/10.1111/bmsp.12189
Burns, M. A., Johnson, V. N., Grasman, K. S., Habibi, S., Smith, K. A., Kaehr, A. I., Lacar, M. F., & Yam, B. F. (2023). Pedagogically grounded techniques and technologies for enhancing student learning. Advances in Engineering Education, 11(3), 77–107.
Caena, F., & Redecker, C. (2019). Aligning teacher competence frameworks to 21st century challenges: The case for the European digital competence framework for educators (Digcompedu). European Journal of Education, 54(3), 356–369. https://doi.org/10.1111/ejed.12345
Chen, L., Lu, M., Goda, Y., & Yamada, M. (2019). Design of learning analytics dashboard supporting metacognition. In International Association for Development of the Information Society. https://eric.ed.gov/?id=ED608646
Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering student engagement in online learning environments. TechTrends, 60(6), 532–539. https://doi.org/10.1007/s11528-016-0110-z
Dehnad, K. (1987). Density estimation for statistics and data analysis. Technometrics. https://doi.org/10.1080/00401706.1987.10488295
Deng, H., Wang, X., Guo, Z., Decker, A., Duan, X., Wang, C., Alex Ambrose, G., & Abbott, K. (2019). PerformanceVis: Visual analytics of student performance data from an introductory chemistry course. Visual Informatics, 3(4), 166–176. https://doi.org/10.1016/j.visinf.2019.10.004
Ferreira, H., de Oliveira, G. P., Araújo, R., Dorça, F., & Cattelan, R. (2019). Technology-enhanced assessment visualization for smart learning environments. Smart Learning Environments, 6(1), 14. https://doi.org/10.1186/s40561-019-0096-z
Giannakos, M. (2022). Educational data, learning analytics and dashboards. In M. Giannakos (Ed.), Experimental studies in learning technology and child–computer interaction (pp. 27–36). Springer. https://doi.org/10.1007/978-3-031-14350-2_4
Guskey, T. R. (2009). Formative assessment: The contributions of Benjamin S. Bloom. In Handbook of formative assessment. Routledge.
Guskey, T. (2010). Lessons of mastery learning. Educational Leadership, 68(2), 52–57.
Gutiérrez-Braojos, C., Rodríguez-Domínguez, C., Daniela, L., & Carranza-García, F. (2023). An analytical dashboard of collaborative activities for the knowledge building. Technology, Knowledge and Learning. https://doi.org/10.1007/s10758-023-09644-y
He, H., Dong, B., Zheng, Q., & Li, G. (2019). VUC: Visualizing daily video utilization to promote student engagement in online distance education. In Proceedings of the ACM Conference on Global Computing Education (pp. 99–105). https://doi.org/10.1145/3300115.3309514
Hew, K. F., & Cheung, W. S. (2014). Students' and instructors' use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45–58. https://doi.org/10.1016/j.edurev.2014.05.001
Johar, N. A., Kew, S. N., Tasir, Z., & Koh, E. (2023). Learning analytics on student engagement to enhance students' learning performance: A systematic review. Sustainability, 15(10), 7849. https://doi.org/10.3390/su15107849
Kaur, K., & Kaur, K. (2015). Analyzing the effect of difficulty level of a course on students performance prediction using data mining. In 2015 1st International Conference on Next Generation Computing Technologies (NGCT) (pp. 756–761). https://doi.org/10.1109/NGCT.2015.7375222
Lee-Cultura, S., Sharma, K., & Giannakos, M. N. (2024). Multimodal teacher dashboards: Challenges and opportunities of enhancing teacher insights through a case study. IEEE Transactions on Learning Technologies, 17, 181–201. https://doi.org/10.1109/TLT.2023.3276848
Lopez, G., Seaton, D. T., Ang, A., Tingley, D., & Chuang, I. (2017). Google BigQuery for education: Framework for parsing and analyzing edX MOOC data. In Proceedings of the 4th (2017) ACM Conference on Learning @ Scale (pp. 181–184). https://doi.org/10.1145/3051457.3053980
Masiello, I., Mohseni, Z. A., Palma, F., Nordmark, S., Augustsson, H., & Rundquist, R. (2024). A current overview of the use of learning analytics dashboards. Education Sciences, 14(1), 82. https://doi.org/10.3390/educsci14010082
Matz, R., Schulz, K., Hanley, E., Derry, H., Hayward, B., Koester, B., Hayward, C., & McKay, T. (2021). Analyzing the efficacy of ECoach in supporting gateway course success through tailored support. In LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 216–225). https://doi.org/10.1145/3448139.3448160
Mendez, G., Galárraga, L., & Chiluiza, K. (2021). Showing academic performance predictions during term planning: Effects on students' decisions, behaviors, and preferences. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–17). https://doi.org/10.1145/3411764.3445718
Meyer, M. S., Shen, Y., & Plucker, J. A. (2024). Reducing excellence gaps: A systematic review of research on equity in advanced education. Review of Educational Research, 94(1), 33–72. https://doi.org/10.3102/00346543221148461
Mohammadhassan, N., & Mitrovic, A. (2022). Investigating the effectiveness of visual learning analytics in active video watching. In M. M. Rodrigo, N. Matsuda, A. I. Cristea, & V. Dimitrova (Eds.), Artificial intelligence in education (pp. 127–139). Springer. https://doi.org/10.1007/978-3-031-11644-5_11
Paas, F., Renkl, A., & Sweller, J. (2004). Cognitive load theory: Instructional implications of the interaction between information structures and cognitive architecture. Instructional Science, 32(1/2), 1–8.
Paiva, R., Bittencourt, I. I., Lemos, W., Vinicius, A., & Dermeval, D. (2018). Visualizing learning analytics and educational data mining outputs. In C. Penstein Rosé, R. Martínez-Maldonado, H. U. Hoppe, R. Luckin, M. Mavrikis, K. Porayska-Pomsta, B. McLaren, & B. du Boulay (Eds.), Artificial intelligence in education (pp. 251–256). Springer. https://doi.org/10.1007/978-3-319-93846-2_46
Ramanujan, D., Bernstein, W. Z., Chandrasegaran, S. K., & Ramani, K. (2017). Visual analytics tools for sustainable lifecycle design: Current status, challenges, and future opportunities. Journal of Mechanical Design, 139(11), 111415. https://doi.org/10.1115/1.4037479
Searfoss, R. (2019). Teachers' perceptions about a high school mastery-based learning program. Walden Dissertations and Doctoral Studies. https://scholarworks.waldenu.edu/dissertations/6666
Sedig, K., & Parsons, P. (2013). Interaction design for complex cognitive activities with visual representations: A pattern-based approach. AIS Transactions on Human-Computer Interaction, 5(2), 84–133.
Sedlmair, M., Meyer, M., & Munzner, T. (2012). Design study methodology: Reflections from the trenches and the stacks. IEEE Transactions on Visualization and Computer Graphics, 18(12), 2431–2440. https://doi.org/10.1109/TVCG.2012.213
Shi, C., Fu, S., Chen, Q., & Qu, H. (2015). VisMOOC: Visualizing video clickstream data from massive open online courses. In 2015 IEEE Pacific Visualization Symposium (PacificVis) (pp. 159–166). https://doi.org/10.1109/PACIFICVIS.2015.7156373
Spertus, E., & Kurmas, Z. (2021). Mastery-based learning in undergraduate computer architecture. In 2021 ACM/IEEE Workshop on Computer Architecture Education (WCAE) (pp. 1–7). https://doi.org/10.1109/WCAE53984.2021.9707147
Steinke, P., & Fitch, P. (2017). Minimizing bias when assessing student work. Research & Practice in Assessment, 12, 87–95.
Sweller, J. (2020). Cognitive load theory and educational technology. Educational Technology Research and Development, 68(1), 1–16. https://doi.org/10.1007/s11423-019-09701-3
Szulewski, A., Braund, H., Dagnone, D. J., McEwen, L., Dalgarno, N., Schultz, K. W., & Hall, A. K. (2023). The assessment burden in competency-based medical education: How programs are adapting. Academic Medicine, 98(11), 1261. https://doi.org/10.1097/ACM.0000000000005305
Tinmaz, H., & Singh Dhillon, P. K. (2024). User-centric avatar design: A cognitive walkthrough approach for metaverse in virtual education. Data Science and Management. https://doi.org/10.1016/j.dsm.2024.05.001
Wang, Y., & Qi, G. Y. (2018). Mastery-based language learning outside class: Learning support in flipped classrooms. http://hdl.handle.net/10125/44641
Weinstein, Y., Madan, C. R., & Sumeracki, M. A. (2018). Teaching the science of learning. Cognitive Research: Principles and Implications, 3(1), 2. https://doi.org/10.1186/s41235-017-0087-y
Wong, J.-S., & Zhang "Luke," X. (2018). MessageLens: A visual analytics system to support multifaceted exploration of MOOC forum discussions. Visual Informatics, 2(1), 37–49. https://doi.org/10.1016/j.visinf.2018.04.005
Xia, M., Xu, M., Lin, C., Cheng, T. Y., Qu, H., & Ma, X. (2020). SeqDynamics: Visual analytics for evaluating online problem-solving dynamics. Computer Graphics Forum, 39(3), 511–522. https://doi.org/10.1111/cgf.13998
Xiaoya, G., Kan, L., & Ping, L. (2009). Visual analysis of college students' scores in English test. In 2009 4th International Conference on Computer Science & Education (pp. 1816–1819). https://doi.org/10.1109/ICCSE.2009.5228253
Yang, A. C. M., Flanagan, B., & Ogata, H. (2022). Adaptive formative assessment system based on computerized adaptive testing and the learning memory cycle for personalized learning. Computers and Education: Artificial Intelligence, 3, 100104. https://doi.org/10.1016/j.caeai.2022.100104
Zesch, T., Horbach, A., & Zehner, F. (2023). To score or not to score: Factors influencing performance and feasibility of automatic content scoring of text responses. Educational Measurement: Issues and Practice, 42(1), 44–58. https://doi.org/10.1111/emip.12544
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Dr. Bo Pei is an Assistant Professor of Instructional Technology at the College of Education at the Univer-
sity of South Florida. His research interests lie in Learning Analytics, AI in Education, and Educational Data
Visualization. He mainly works on mechanisms or strategies for promoting equitable education through
exploring approaches to explain and visualize the decision-making processes of AI algorithms in educa-
tional settings to better support the teaching and learning practices.
Dr. Ying Cheng is a Professor in the Department of Psychology and Associate Director in the Lucy Fam-
ily Institute for Data and Society at the University of Notre Dame. She is interested in trustworthiness of AI
in education, theoretical development and applications of item response theory (IRT), including comput-
erized adaptive testing (CAT), test equity across different ethnicity/gender groups (formally known as dif-
ferential item functioning or DIF), classification accuracy and consistency with licensure/certification of state
graduation exams, and cognitive diagnostic models and their applications to CAT.
Dr. Alex Ambrose is a professor of practice at Notre Dame Learning at the University of Notre Dame. His
research focuses on applied learning research, design, and evaluation, including learning analytics, active
learning classrooms, flexible learning spaces, digital portfolios, and badges.
Dr. Eva Dziadula is a Teaching Professor at the Department of Economics at the University of Notre
Dame. Her research focuses on the area of migration choices and immigrant assimilation in the United
States, specifically focusing on marriage, divorce, fertility, and citizenship acquisition.
Dr. Wanli Xing is an Associate Professor of Educational Technology at the University of Florida. His
research interests are artificial intelligence, learning analytics, STEM education and online learning. Dr.
Xing’s research is dedicated to pioneering strategies, frameworks, and technologies that revolutionize
STEM education and online learning. He creates learning environments using cutting-edge technologies,
such as artificial intelligence, computer simulations and modeling, internet of things, and augmented real-
ity to support learning in diverse classrooms and online environments.
Dr. Jie Lu is an Assistant Professor in the Department of Workforce Education and Instructional Technology
at the University of Georgia. Her work focuses on the development and evaluation of advanced learning
technologies, including utilizing AI in education with a focus on preservice teachers.