Awareness is not enough. Pitfalls of learning analytics dashboards in the educational practice
Ioana Jivet1, Maren Scheffel1, Hendrik Drachsler2,3,1, and Marcus Specht1
1Open Universiteit, Valkenburgerweg 177, 6419AT Heerlen, NL
ioana.jivet@ou.nl, maren.scheffel@ou.nl, hendrik.drachsler@ou.nl,
marcus.specht@ou.nl
2Goethe University Frankfurt, Germany
3German Institute for International Educational Research (DIPF), Germany
drachsler@dipf.de
Abstract. It has been long argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state of the art in the field of learning dashboards, none have addressed the theoretical foundation that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from the learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the context in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
Keywords: learning dashboards, learning theory, learning analytics, systematic review, learning science, social comparison, competition
1 Introduction
Learning Analytics (LA) emerged from the need to harness the potential of the increasingly large learner data sets generated by the widespread use of online learning environments. It has been defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [37]. Ferguson [13] identified two main challenges when it comes to learning analytics: (i) building strong connections to the learning sciences and (ii) focusing on the perspectives of learners.
There is a common notion in the LA community that learning analytics research should be deeply grounded in the learning sciences [28, 29]. Suthers & Verbert [38] labelled LA the "middle space" as it lies at the intersection between technology and the learning sciences. Moreover, LA should be seen as an educational approach guided by pedagogy and not the other way around [19]. However, there is a strong emphasis on the "analytics", i.e. computation of the data and creation of predictive models, and not so much on the "learning", i.e. applying and researching LA in the learning contexts where student outcomes can be improved [17].
One of the focuses of LA research is to empower teachers and learners to make informed decisions about the learning process, mainly by visualising the collected learner data through dashboards [9]. Learning analytics dashboards are "single displays that aggregate different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations" [35]. Dashboards have been developed for different stakeholder groups, including learners, teachers, researchers and administrative staff [35]. Charleer et al. [5] suggest that LA dashboards could be used as powerful metacognitive tools for learners, triggering them to reason about the effort invested in learning activities and learning outcomes. However, a large majority of dashboards are still aimed at teachers, or at both teachers and learners [35]. Moreover, there has been very little research on the effects such tools have on learning [26].
As a first step towards building effective dashboards for learners, we need to understand how the learning sciences can inform the design and pedagogical use of learning dashboards. Following Suthers and Verbert's [38] position that learning analytics research should be explicit about the theory or conception of learning underlying the work, we set out to investigate which educational concepts constitute the theoretical foundation for the development of learning dashboards aimed at learners.
A number of previous works reviewed LA dashboards from different perspectives, including their design and evaluation. Verbert et al. [42] introduced a conceptual framework for analysing LA applications and reviewed 15 dashboards based on their target users, the displayed data and the focus of the evaluation. A follow-up review [43] extended this analysis to 24 dashboards, examining the context in which the dashboards had been deployed, the data sources, the devices used and the evaluation methodology. Yoo et al. [47] reviewed the design and evaluation of 10 educational dashboards for teachers and students through their proposed evaluative tool based on Few's principles of dashboard design [15] and Kirkpatrick's four-level evaluation model [25]. A more recent systematic review of 55 dashboards by Schwendimann et al. [35] looked at the context in which dashboards had been deployed, their purpose, the displayed indicators, the technologies used, the maturity of the evaluation and open issues.
The scope of all these reviews included learning analytics dashboards regardless of their target users. Focusing on the challenges identified by Ferguson [13], we narrow our scope to LA dashboards aimed at learners in order to focus on their perspective. A closely related work to this paper was published by Bodily & Verbert [4]. They provided a systematic review that focused exclusively on student-facing LA systems, including dashboards, educational recommender systems, educational data mining (EDM) systems, intelligent tutoring systems (ITS) and automated feedback systems. The systems were analysed based on functionality, data sources, design analysis, perceived effects on learners and actual effects.
Although other works have looked into the learning theory foundations of game-based learning [46], one major limitation of previous dashboard reviews is that none investigate the connection to the learning sciences. Moreover, [35] and [4] provide recommendations for the design of learner dashboards, but neither suggests the use of educational concepts as a basis for the design or evaluation of dashboards. Through this systematic literature review we aim to bridge this gap by investigating the relation between educational concepts and the design of learning dashboards. Dashboard design was previously examined by looking at the type of data displayed on the dashboard and the type of charts or visualisations used. In this study, however, we focus specifically on how the data presented on the dashboard is contextualised and framed to ease sense-making for learners.
Throughout this literature review, we explore how educational concepts are integrated into the design of learning dashboards. Our study is guided by the following research question: According to which educational concepts are learning analytics dashboards designed?
2 Methodology
Prior to the systematic review, we conducted an informal literature search in order to get an overall picture of the field. We ran the systematic literature review following the PRISMA statement [31] and selected the following databases, which contain research in the field of Technology Enhanced Learning: ACM Digital Library, IEEE Xplore, SpringerLink, Science Direct, Wiley Online Library, Web of Science and EBSCOhost. Additionally, we included Google Scholar to cover any other sources, limiting the number of retrieved results to 200. We searched the selected databases using the following query: "learning analytics" AND (visualization OR visualisation OR dashboard OR widget). The first term narrows the search field to learning analytics, while the second part of the query covers the different terminologies used for this type of intervention, addressing one of the limitations identified in [35]. Although the scope of this review is limited to visualisations that have learners as end-users, it was not possible to articulate this criterion in relevant search terms. Therefore, the approach we took was to build a query that retrieves all dashboards, regardless of their target end-users, and to remove the ones that fall outside our scope in a later phase.
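To make the query logic concrete, the sketch below re-expresses the boolean query as a programmatic filter over title and abstract text. This is an illustration only, not the tooling used in the review (screening was performed manually on the exported database results); the record structure is a hypothetical example.

```python
# Illustrative sketch of the boolean search query, expressed as a filter.
# The record fields ("title", "abstract") are hypothetical; in the review
# itself, relevance screening was carried out by hand.

OR_TERMS = ("visualization", "visualisation", "dashboard", "widget")

def matches_query(text: str) -> bool:
    """True if text satisfies: "learning analytics" AND
    (visualization OR visualisation OR dashboard OR widget)."""
    t = text.lower()
    return "learning analytics" in t and any(term in t for term in OR_TERMS)

records = [
    {"title": "A learning analytics dashboard for MOOCs",
     "abstract": "We present a student-facing visualisation tool."},
    {"title": "Predictive models of dropout",
     "abstract": "A survey of early-warning classifiers."},
]
hits = [r for r in records if matches_query(r["title"] + " " + r["abstract"])]
print(len(hits))  # -> 1: the second record matches neither part of the query
```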
The queries were run on February 20th, 2017, collecting 1439 hits. Each result was screened for relevance, i.e. whether it described a learning dashboard aimed at learners, by examining the title and the abstract, thus reducing the list of potential candidate papers to 212. Eleven papers that we had come across during the informal search and that fit the scope of our survey were also added to the set of papers to be examined further. Next, we accessed the full text of each of these 223 studies in order to assess whether they were eligible for our study against the following criteria:
1. the paper’s full text is available in English;
2. the paper describes a fully developed dashboard, widget or visualisation, i.e.
we excluded theoretical papers, essays or literature reviews;
3. the target user group of the dashboard is learners;
4. the authors explicitly mention theoretical concepts for the design;
5. the paper includes an evaluation of the dashboard.
We identified 95 papers that satisfied the first three criteria. Only half of these papers had theoretical grounding in educational concepts, suggesting a large gap between the learning sciences and this type of learning analytics intervention. The focus of this study is the 26 papers that describe dashboards that both rely on educational concepts (criterion 4) and were empirically evaluated (criterion 5). The list of papers included in this review is available at bit.ly/LADashboards.
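For reference, the selection funnel described above can be tallied in a few lines. The snippet below merely re-encodes the counts reported in this section as PRISMA-style stages; it is a schematic reconstruction, not part of the original review protocol.

```python
# Schematic PRISMA-style selection funnel, re-encoding the counts reported
# in the text (a reconstruction for illustration, not the review tooling).
funnel = [
    ("Records retrieved from the databases and Google Scholar", 1439),
    ("After title/abstract screening for learner-facing dashboards", 212),
    ("After adding papers found during the informal search", 212 + 11),
    ("Satisfying criteria 1-3 (English, implemented tool, learner-facing)", 95),
    ("Also satisfying criteria 4-5 (theory-grounded and evaluated)", 26),
]
for stage, count in funnel:
    print(f"{count:>5}  {stage}")
```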
3 Results
We started this investigation by collecting the theoretical concepts and models used in the dashboards and analysing the relationships between the purpose of the dashboards and the concepts that were employed in their development. Next, we looked at how the designs of these dashboards integrate different concepts from the learning sciences.
3.1 Learning theories and models
By analysing the introduction, background and dashboard design sections of
each of the papers included in this study, we identified 17 theories, models and
concepts that we bundled into six clusters (see Table 1).
EC1: Cognitivism. This cluster relies upon the cognitivist paradigm, which posits that learning is an internal process involving the use of memory, thinking, metacognition and reflection [1]. It is the most represented category through self-regulated learning (SRL), with 16 papers citing the works of Zimmerman [48], Pintrich [33] or Winne [44]. Deep vs. surface learning theory explains different approaches to learning, where deep learners seek to understand the meaning behind the material and surface learners concentrate on reproducing the main facts [21]. EC2: Constructivism. This cluster is rooted in the assumption that learners are information constructors and learning is the product of social interaction [1]. Social constructivist learning theory [24] and the Paul-Elder critical thinking model [11] have been used mostly in dashboards that aim to offer learner support in collaborative settings, while Engeström's activity theory [12] was used as a pedagogical basis for supporting university students in overcoming dyslexia. EC3: Humanism. This cluster puts the learner at the centre of the learning process, seeking to engage the person as a whole and focusing on the study of the self, motivation and goals [8]. More recent works focus on developing 21st century skills [40] and learning dispositions [36]. Achievement goal orientation theory is concerned with learners' motivation for goal achievement [32]. EC4: Descriptive models. This cluster includes the engagement model [16], which differentiates between behavioural, emotional and cognitive engagement. Several papers also cover the pedagogical use of dashboards, aligning the EC5: Instructional design in which the dashboard is embedded with Bloom's taxonomy [3], formative assessment [34] or the Universal Design for Learning framework [6]. While the majority of these clusters contain concepts belonging to the learning sciences, we also identified three concepts that originate in the broader field of EC6: Psychology: Ekman's model of emotions and facial expressions [10], social comparison [14] and culture [18, 22].

Table 1. Educational concepts identified, grouped into six clusters, with the frequency of each concept and the papers in which it appears. The list of papers included in this review is available at bit.ly/LADashboards.

Cluster                   | Educational concept                      | Freq. | Papers
EC1: Cognitivism          | Self-regulated learning                  | 16    | D1; D4; D5; D7; D9; D11; D12; D14; D15; D18; D20; D21; D22; D23; D25; D26
                          | Deep vs. surface learning                | 2     | D16; D19
EC2: Constructivism       | Collaborative learning                   | 6     | D12; D13; D14; D16; D24; D26
                          | Social constructivist learning theory    | 4     | D7; D13; D19; D22
                          | Engeström's activity theory              | 1     | D12
                          | Paul-Elder critical thinking model       | 1     | D19
EC3: Humanism             | Experiential learning                    | 2     | D4; D13
                          | Learning dispositions                    | 1     | D2
                          | 21st century skills                      | 4     | D2; D11; D13; D19
                          | Achievement goal orientation             | 3     | D15; D19; D24
EC4: Descriptive models   | Engagement model                         | 1     | D10
EC5: Instructional design | Universal Design for Learning framework  | 1     | D19
                          | Formative assessment                     | 3     | D3; D6; D19
                          | Bloom's taxonomy                         | 3     | D3; D4; D22
EC6: Psychology           | Ekman's model for emotion classification | 1     | D23
                          | Social comparison                        | 3     | D8; D15; D25
                          | Culture                                  | 1     | D25
3.2 Dashboard goals and educational concepts
In order to understand the reasons for using these educational concepts, we analysed the goals of the dashboards and looked at how their use was explained in the papers. We extracted the goals of each dashboard and categorised them based on the competence they aimed to affect in learners: metacognitive, cognitive, behavioural or emotional (see Table 2). Most of the dashboards do not serve only one goal, but rather aim to catalyse changes in multiple competencies. A fifth category, C5: Self-regulation, was added to account for papers that explicitly described their goal as supporting self-regulation, a concept that involves all four competencies [48].

Table 2. Competencies, the goals intended to affect each competence, and the papers in which they appear. The list of papers included in this review is available at bit.ly/LADashboards.

Competence          | Goal                             | Freq. | Papers
C1: Metacognitive   | Improve metacognitive skills     | 4     | D6; D7; D20; D23
                    | Support awareness and reflection | 20    | D1; D2; D3; D4; D6; D7; D9; D10; D11; D12; D13; D14; D17; D18; D20; D21; D22; D23; D25; D26
                    | Monitor progress                 | 8     | D7; D8; D11; D15; D19; D20; D22; D23
                    | Support planning                 | 2     | D20; D22
C2: Cognitive       | Support goal achievement         | 3     | D9; D18; D25
                    | Improve performance              | 3     | D16; D23; D24
C3: Behavioural     | Improve retention or engagement  | 2     | D10; D25
                    | Improve online social behaviour  | 7     | D7; D13; D14; D16; D19; D24; D26
                    | Improve help-seeking behaviour   | 1     | D17
                    | Offer navigational support       | 2     | D8; D15
C4: Emotional       | Deactivate negative emotions     | 1     | D9
                    | Increase motivation              | 4     | D2; D8; D15; D19
C5: Self-regulation | Support self-regulation          | 13    | D1; D4; D7; D9; D11; D12; D15; D19; D20; D21; D22; D23; D25
Figure 1 illustrates the relation between the goals of the dashboards and the educational concept clusters listed in Table 1. We can draw some interesting observations from these connections. Firstly, most of the visualisations aim to influence learners' metacognitive competence, with the purpose of supporting awareness and reflection. This aim is often motivated by SRL theory, a learning concept that relies heavily on the assumption that actions are consequences of thinking, as SRL is achieved in cycles consisting of forethought, performance and self-reflection [49]. SRL also motivates the goals of monitoring progress and supporting planning, but to a lesser extent. Social constructivist learning theory and collaborative learning also appear quite frequently in relation to metacognition, due to the collaborative setting in which the dashboards were used. Dashboard developers argue that, for effective collaboration, learners need to be aware of their teammates' learning behaviour, activities and outcomes. Other concepts used for affecting the metacognitive level are formative assessment, as it implies evaluation of one's performance; 21st century skills, with their focus on learning how to learn; and social comparison, as a means for framing the evaluation of one's performance.
Secondly, there is a strong emphasis on supporting the self-regulation competence by using cognitivist concepts. The design of these dashboards is usually informed by SRL theory. Constructivist concepts are also commonly used in the development of these dashboards because the context in which they were deployed is the online collaborative learning setting. Less used are instructional design concepts, the most notable being formative assessment as a means for reflection and self-evaluation.
Thirdly, in order to affect the behavioural level, SRL is again one of the most commonly used concepts, alongside social constructivism and collaborative learning. Social comparison has a stronger presence on this level, as it is used to reveal the behaviour of peers as a source of suggestions on how learners could improve. Surprisingly, very few dashboards aim to support learners on the cognitive level, i.e. acquiring knowledge and improving performance, and the few that do rely mostly on SRL and social comparison. Finally, in order to stimulate changes on the emotional level, dashboards build mostly on social comparison and the modelling of learning dispositions and 21st century skills.
Fig. 1. The competence level targeted by the dashboards included in the review in
relation to the educational concept clusters that were used as a theoretical basis for
their development.
3.3 Reference frames
According to the framework for designing pedagogical interventions to support student use of learning analytics proposed by Wise [45], learners need a "representative reference frame" for interpreting their data. We analysed this aspect by looking at how the information on the dashboard was contextualised in light of the dashboard goals. We identified three types of reference frames: i) social, i.e. comparison with peers, ii) achievement, i.e. comparison in terms of goal achievement, and iii) progress, i.e. comparison with an earlier self (see Table 3).
Apart from the origin of the reference frame, the three types are also characterised by where in time the anchor for comparison is set. The social reference frame focuses on the present, allowing learners to compare their current state to the performance levels of their peers at the same point in time. The achievement reference frame directs learners' attention to the future, outlining goals and a future state that learners aim for. Finally, the progress reference frame is anchored in the past, as learners use a past state as an anchor point to evaluate what they have achieved so far. In the following paragraphs we discuss each type in detail.
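Before turning to each type, the coding scheme can be summarised as a small lookup structure pairing each frame with its comparison target and temporal anchor. This is a descriptive sketch of our categorisation, not an artefact taken from any of the reviewed dashboards.

```python
# Descriptive sketch of the reference-frame coding scheme: each frame type
# pairs a comparison target with the point in time the comparison is anchored to.
from enum import Enum

class Anchor(Enum):
    PAST = "past"
    PRESENT = "present"
    FUTURE = "future"

REFERENCE_FRAMES = {
    "social":      ("peers, e.g. the class, teammates or top peers", Anchor.PRESENT),
    "achievement": ("goals: learning outcomes or learner-set goals", Anchor.FUTURE),
    "progress":    ("an earlier self, via historical data", Anchor.PAST),
}

for name, (target, anchor) in REFERENCE_FRAMES.items():
    print(f"{name}: compared to {target} (anchored in the {anchor.value})")
```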
Social. The most common frame was showing learners their data in comparison to the whole class. We also identified cases where learners had access to the data of individual members of their working groups in collaborative learning settings. In other cases, learners compared themselves to previous graduates of the same course. In order to avoid the pitfalls of averages in heterogeneous groups, D22 offered learners a more specific reference: peers with similar goals and knowledge. A few dashboards compared learners to the "top" students, while on some dashboards learners had the option to choose the group against which they compared themselves. On one dashboard, learners compared their self-assessment of group work performance with the assessment made by their peers. We also looked at how the data of the reference groups is aggregated. Most of the dashboards displayed averages (16 dashboards), while only six showed the data of individuals and three presented a learner's ranking within the reference group.
Table 3. The reference frames for comparison and their frequency. The list of papers included in this review is available at bit.ly/LADashboards.

Type        | Reference frame          | Freq. | Papers
Social      | Class                    | 15    | D1; D3; D4; D5; D7; D8; D11; D15; D16; D18; D19; D21; D22; D23; D24
            | Teammates                | 2     | D14; D26
            | Previous graduates       | 2     | D21; D25
            | Top peers                | 4     | D8; D15; D16; D24
            | Peers with similar goals | 1     | D22
Achievement | Learning outcomes        | 15    | D2; D3; D4; D5; D6; D8; D9; D11; D12; D15; D16; D20; D21; D22; D24
            | Learner goals            | 1     | D22
Progress    | Self                     | 10    | D1; D2; D3; D4; D5; D10; D18; D23; D25; D26
Fig. 2. The competence level targeted by the dashboards in relation to the three refer-
ence frames identified: social (S: red), achievement (A: blue) and progress (P: yellow).
Achievement. The second way of framing the information displayed on the dashboard is in terms of the achievement of the learning activity. Here, we distinguish between two types of goals: i) learning outcomes set by the teachers and ii) learner goals set by the learners themselves. One purpose of presenting learners' performance in relation to learning outcomes was to illustrate mastery and skill achievement. Content mastery was expressed through the use of key concepts in forum discussions (D16, D24), performance in quizzes covering topics (D5, D8, D9, D15) or different difficulty levels (D3). The acquisition of skills was quantified through the number of courses covering those skills in the curriculum objectives (D21), while learning dispositions were calculated from self-reported data collected through questionnaires (D2). A second purpose for using teacher-defined goals is to support learners in planning their learning by offering them a point of reference as to how much effort is required for the completion of a learning activity (D11). Concerning learner goals, our results were surprising: only one dashboard allowed learners enough freedom to set their own goals. On D22, learners could establish their aimed level of knowledge and time investment and follow their progress in comparison to their set targets.
Progress. The third frame of reference refers to whether dashboards allow learners to visualise their progress over time by having access to their historical data. This functionality directly supports the "execution and monitoring" phase of the SRL cycle [48]. Our results show that only 10 dashboards offered this feature, while the rest displayed only the current status of the learners.
4 Discussion
Through this literature review, we seek to investigate the relation between the learning sciences and learning analytics by looking into which educational concepts inform the design of learning analytics dashboards aimed at learners. Our investigation revealed that only 26 out of the 95 dashboard designs identified by our search are grounded in the learning sciences and have been evaluated. This might indicate that the development of these tools is still driven by the need to leverage the available learning data, rather than by a clear pedagogical focus on improving learning. The most common foundation for LA dashboard design is self-regulated learning theory, used frequently to motivate dashboard goals that aim to support awareness and trigger reflection. Two findings related to the use of SRL are striking.
Firstly, very few papers have a secondary goal besides fostering awareness and reflection. However, being aware does not imply that remedial actions are taken and learning outcomes are improved. Moreover, awareness and reflection are not concepts that can be measured objectively, making the evaluation of such dashboards questionable. According to McAlpine & Weston, reflection should be considered a mechanism through which learning and teaching can be improved, rather than an end in itself [30]. Thus, we argue that LA dashboards should be designed and evaluated as pedagogical tools that catalyse changes also in the cognitive, behavioural or emotional competencies, and not only on the metacognitive level.
Secondly, since more than half of the analysed dashboards rely on SRL, we took a closer look at how the different phases of the self-regulation cycle are supported, i.e. forethought and planning, monitoring, and self-evaluation [49]. The investigation of the reference frames used on the dashboards revealed that there is little support for goal setting and planning, as almost no dashboard allowed learners to manage self-set goals. Moreover, tracking one's own progress over time was also not a very common feature. These two shortcomings suggest that current dashboards are built mostly to support the "reflection and self-evaluation" phase of SRL and neglect the others. This implies that, apart from a learning dashboard, online learning environments need to provide additional tools that help learners carry out all the phases of the SRL cycle, supporting them in subsequent steps once awareness has been achieved. These findings emphasise the need to design LA dashboards as tools embedded in the instructional design, potentially solving problems related to the low uptake of LA dashboards [28].
Furthermore, our analysis revealed that social framing is more common than achievement framing. Comparison with peers is usually used to motivate students to work harder and increase their engagement, sometimes by "inducing a feeling of being connected with and supported by their peers" [41]. Looking at the theoretical concepts that inform the design of the studied dashboards, only two theories would justify the use of comparison with peers: social comparison theory and achievement goal orientation theory.
Social comparison theory [14] states that we establish our self-worth by comparing ourselves to others when there are no objective means of comparison. However, empirical research in the face-to-face classroom has shown that comparison to self-selected peers who perform slightly better has a beneficial effect on middle school students' grades, whereas no effects were found when there was a bigger gap in performance [23]. Despite the availability of such research, social comparison theory is rarely used to inform the design of dashboards. Only three works rationalise the use of comparison by grounding it in social comparison theory and validations of this theory in the educational sciences [7, 20, 27]. Moreover, learners usually saw their data in comparison to the average of their peers. Averages are often misleading because they are skewed by the data of inactive learners and by the diversity of learning goals among learners, offering a misguided reference frame.
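A small worked example makes this skew concrete; the numbers below are invented purely for illustration.

```python
# Worked example (invented numbers): inactive learners drag the class average
# down, so a moderately active learner appears to be doing far better than
# "average" while being merely typical among the learners who are active.
videos_watched = [0, 0, 0, 0, 0, 0, 1, 2, 10, 12, 14, 15]  # one value per learner

mean_all = sum(videos_watched) / len(videos_watched)   # 54 / 12 = 4.5
active = [v for v in videos_watched if v > 0]
mean_active = sum(active) / len(active)                # 54 / 6 = 9.0

print(mean_all)     # 4.5 -> against this, 10 videos looks well above average
print(mean_active)  # 9.0 -> against active peers, 10 videos is merely typical
```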
A second theory that might support the use of social comparison is achievement goal orientation theory. This theory distinguishes between mastery and performance orientations as the motivation behind why one engages in an achievement task [32]. In contrast to learners who set mastery goals and focus on learning the material and mastering the tasks, learners who have performance goals are more focused on demonstrating their ability by measuring their skill in comparison to others. We found few dashboards that contextualised the data in terms of goals achieved, while the majority used different groups of peers as a frame of reference. This finding suggests that the design of current dashboards appeals more to performance-oriented learners, neglecting learners who have a tendency towards mastery. Indeed, as Beheshitha et al. [2] observed, learners who considered the subject matter of the course more motivating than competition between students were more inclined to rate the visualisation based on social comparison negatively. We found only one dashboard proposal that catered to the needs of learners with different achievement goal orientations: Mastery Grids [20] provides an open learner model on which mastery-oriented learners can monitor their progress, as well as social comparison features for performance-oriented learners.
The lack of support for goal achievement and the prevalence of comparison foster competition in learners. In the long term, there is the threat that, by being constantly exposed to motivational triggers that rely on social comparison, comparison to peers and "being better than others" become the norm in terms of what defines a successful learner. We argue that learning and education should be about mastering knowledge, acquiring skills and developing competencies. For this purpose, comparison should be used carefully in the design of learning dashboards, and research needs to investigate the effects of social comparison and competition in LA dashboards. More attention should be given to the different needs of learners, and dashboards should be used as pedagogical tools to motivate learners with different performance levels who respond differently to motivating factors. As Tan et al. [39] envisioned, "differentiated instruction can become an experienced reality for students, with purposefully-designed LA serving to compress, rather than exacerbate, the learning and achievement gap between thriving and struggling students".
5 Conclusion
This paper presents the results of a systematic review looking into the use of educational concepts in learning analytics dashboards for learners. Our main findings show that, firstly, self-regulated learning is the core theory informing the design of LA dashboards that aim to make learners aware of their learning process by visualising their data. However, just making learners aware is not enough. Dashboards should have a broader purpose, using awareness and reflection as means to improve cognitive, behavioural or emotional competencies. Secondly, effective support for online learners who do not have well-developed SRL skills should also facilitate goal setting and planning, as well as monitoring and self-evaluation. As dashboards mostly aim to increase awareness and trigger self-reflection, different tools should complement dashboards and be seamlessly integrated in the learning environment and the instructional design. Thirdly, there is a strong emphasis on comparison with peers as opposed to using goal achievement as a reference frame. However, there is evidence in the educational sciences that challenges the benefits of fostering competition in learning. Our findings suggest that the design of LA dashboards needs better grounding in the learning sciences.
Finally, we see the need to investigate the effectiveness of using educational concepts in the design of LA dashboards by looking at how these tools were evaluated, what effects learners perceived and how learning was improved. Our study was limited by a narrow focus set within the LA field, a relatively recent research area. Valuable proposals could also be found in related fields, e.g. educational data mining. We plan to address these research questions in the future by extending this work.
References
1. Anderson, T.: The Theory and Practice of Online Learning. Athabasca University Press (2008)
2. Beheshitha, S.S., Hatala, M., Gašević, D., Joksimović, S.: The role of achievement goal orientations when studying effect of learning analytics visualizations. In: Proc. of LAK'16, pp. 54–63. ACM (2016)
3. Bloom, B., Krathwohl, D., Masia, B.: Bloom taxonomy of educational objectives. Allyn and Bacon, Boston, MA (1984)
4. Bodily, R., Verbert, K.: Trends and issues in student-facing learning analytics reporting systems research. In: Proc. of LAK'17, pp. 309–318. ACM (2017)
5. Charleer, S., Klerkx, J., Duval, E., De Laet, T., Verbert, K.: Creating effective learning analytics dashboards: Lessons learnt. In: Proc. of EC-TEL'16, pp. 42–56. Springer (2016)
6. Corey, M.L., Leinenbach, M.T.: Universal design for learning: Theory and practice. In: Proc. of the 2004 Society for Information Technology and Teacher Education Int. Conf., pp. 4919–4926 (2004)
7. Davis, D., Jivet, I., Kizilcec, R.F., Chen, G., Hauff, C., Houben, G.J.: Follow the successful crowd: Raising MOOC completion rates through social comparison at scale. In: Proc. of LAK'17, pp. 454–463. ACM (2017)
8. DeCarvalho, R.J.: The humanistic paradigm in education. The Humanistic Psychologist 19(1), 88 (1991)
9. Durall, E., Gros, B.: Learning analytics as a metacognitive tool. In: Proc. of CSEDU 2014, vol. 1, pp. 380–384 (2014)
10. Ekman, P., Friesen, W.V.: Facial Action Coding System (1977)
11. Elder, L., Paul, R.: Critical thinking: Why we must transform our teaching. Journal of Developmental Education 18(1), 34 (1994)
12. Engeström, Y.: Expansive visibilization of work: An activity-theoretical perspective. Computer Supported Cooperative Work (CSCW) 8(1), 63–93 (1999)
13. Ferguson, R.: Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning 4(5-6), 304–317 (2012)
14. Festinger, L.: A theory of social comparison processes. Human Relations 7(2), 117–140 (1954)
15. Few, S.: Information Dashboard Design: Displaying Data for At-a-Glance Monitoring. Analytics Press (2013)
16. Fredricks, J.A., Blumenfeld, P.C., Paris, A.H.: School engagement: Potential of the concept, state of the evidence. Review of Educational Research 74(1), 59–109 (2004)
17. Gašević, D., Dawson, S., Siemens, G.: Let's not forget: Learning analytics are about learning. TechTrends 59(1), 64–71 (2015)
18. Gelfand, M.J., Raver, J.L., Nishii, L., Leslie, L.M., Lun, J., Lim, B.C., Duan, L., Almaliach, A., Ang, S., Arnadottir, J., et al.: Differences between tight and loose cultures: A 33-nation study. Science 332(6033), 1100–1104 (2011)
19. Greller, W., Drachsler, H.: Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society 15(3), 42–57 (2012)
20. Guerra, J., Hosseini, R., Somyurek, S., Brusilovsky, P.: An intelligent interface for learning content: Combining an open learner model and social comparison to support self-regulated learning and engagement. In: Proc. of IUI'16, pp. 152–163. ACM (2016)
21. Haggis, T.: Constructing images of ourselves? A critical investigation into 'approaches to learning' research in higher education. British Educational Research Journal 29(1), 89–104 (2003)
22. Hofstede, G.: Cultures and Organizations: Software of the Mind. Intercultural Cooperation and its Importance for Survival. McGraw-Hill, London (1991)
23. Huguet, P., Galvaing, M.P., Monteil, J.M., Dumas, F.: Social presence effects in the Stroop task: Further evidence for an attentional view of social facilitation. Journal of Personality and Social Psychology 77(5), 1011 (1999)
24. Kim, B.: Social constructivism. Emerging Perspectives on Learning, Teaching, and Technology 1(1), 16 (2001)
25. Kirkpatrick, D.L.: Evaluating Training Programs. Tata McGraw-Hill Education (1975)
26. Klerkx, J., Verbert, K., Duval, E.: Enhancing learning with visualization techniques. In: Handbook of Research on Educational Communications and Technology, pp. 791–807. Springer (2014)
27. Loboda, T.D., Guerra, J., Hosseini, R., Brusilovsky, P.: Mastery Grids: An open source social educational progress visualization. In: Proc. of EC-TEL'14, pp. 235–248. Springer (2014)
28. Lonn, S., Aguilar, S.J., Teasley, S.D.: Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Computers in Human Behavior 47, 90–97 (2015)
29. Marzouk, Z., Rakovic, M., Liaqat, A., Vytasek, J., Samadi, D., Stewart-Alonso, J., Ram, I., Woloshen, S., Winne, P.H., Nesbit, J.C.: What if learning analytics were based on learning science? Australasian Journal of Educational Technology 32(6) (2016)
30. McAlpine, L., Weston, C.: Reflection: Issues related to improving professors' teaching and students' learning. Instructional Science 28(5), 363–385 (2000)
31. Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., The PRISMA Group: Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine 6(7), e1000097 (2009)
32. Pintrich, P.R.: Multiple goals, multiple pathways: The role of goal orientation in learning and achievement. Journal of Educational Psychology 92(3), 544 (2000)
33. Pintrich, P.R., De Groot, E.V.: Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology 82(1), 33 (1990)
34. Sadler, D.R.: Formative assessment and the design of instructional systems. Instructional Science 18(2), 119–144 (1989)
35. Schwendimann, B., Rodriguez-Triana, M., Vozniuk, A., Prieto, L., Boroujeni, M., Holzer, A., Gillet, D., Dillenbourg, P.: Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies (2016)
36. Shum, S.B., Crick, R.D.: Learning dispositions and transferable competencies: Pedagogy, modelling and learning analytics. In: Proc. of LAK'12, pp. 92–101. ACM (2012)
37. Siemens, G., Gašević, D.: Guest editorial: Learning and knowledge analytics. Educational Technology & Society 15(3), 1–2 (2012)
38. Suthers, D., Verbert, K.: Learning analytics as a middle space. In: Proc. of LAK'13, pp. 1–4. ACM (2013)
39. Tan, J.P.L., Yang, S., Koh, E., Jonathan, C.: Fostering 21st century literacies through a collaborative critical reading and learning analytics environment: User-perceived benefits and problematics. In: Proc. of LAK'16, pp. 430–434. ACM (2016)
40. Trilling, B., Fadel, C.: 21st Century Skills: Learning for Life in Our Times. John Wiley & Sons (2009)
41. Venant, R., Vidal, P., Broisin, J.: Evaluation of learner performance during practical activities: An experimentation in computer education. In: Proc. of ICALT'16, pp. 237–241. IEEE (2016)
42. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
43. Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing 18(6), 1499–1514 (2014)
44. Winne, P.H., Zimmerman, B.J.: Self-regulated learning viewed from models of information processing. Self-Regulated Learning and Academic Achievement: Theoretical Perspectives 2, 153–189 (2001)
45. Wise, A.F.: Designing pedagogical interventions to support student use of learning analytics. In: Proc. of LAK'14, pp. 203–211. ACM (2014)
46. Wu, W.H., Hsiao, H.C., Wu, P.L., Lin, C.H., Huang, S.H.: Investigating the learning-theory foundations of game-based learning: A meta-analysis. Journal of Computer Assisted Learning 28(3), 265–279 (2012)
47. Yoo, Y., Lee, H., Jo, I.H., Park, Y.: Educational dashboards for smart learning: Review of case studies. In: Emerging Issues in Smart Learning, pp. 145–155. Springer (2015)
48. Zimmerman, B.J.: Self-regulated learning and academic achievement: An overview. Educational Psychologist 25(1), 3–17 (1990)
49. Zimmerman, B.J.: Attaining self-regulation: A social cognitive perspective. In: Boekaerts, M., Pintrich, P.R., Zeidner, M. (eds.) Handbook of Self-Regulation, pp. 695–716. Academic Press (2000)
... Dashboards typically provide multiple indicators and visualizations of learners and learning processes. For example, three major reference frames for dashboards have been outlined: (a) social, which involves comparison with peers (e.g., a whole class), (b) achievement, which includes goal achievement, learning outcomes and goals, and (c) progress, which involves comparison with earlier performance to represent progress or evolution through time [60]. Depending on the information provided, the recipients can take appropriate actions. ...
... [43] concluded that the LA tools that focus on feedback provision are typically not grounded in established Learning Sciences theory or research. Similarly, [60] found that only about one quarter of dashboard designs were grounded in Learning Sciences. Matcha et al. [75] also concluded that LA dashboards are rarely grounded on Learning Sciences theory and research. ...
... Some studies do highlight this problem. In particular, Jivet et al. [60] point out that awareness is insufficient: knowing how well one does is not necessarily translatable into specific actions for improving content mastery. In the case of class comparisons, Bodily and Verbert [15] question what performance scores really mean. ...
Article
Full-text available
Most learning analytics (LA) systems provide generic feedback, because they primarily draw on performance data based on quiz scores. This study explored the potential of student-generated summaries as an alternative method for predicting learning performance. Two hundred and fifty-four undergraduates first watched a series of six short video lectures and then wrote a short summary for each one. Based on their median performance quiz scores, the participants were divided into two performance groups. Sparse and dense text vectorization methods were used to represent the video lectures and student summaries. Three semantic textual similarity features were computed using cosine similarity and were used as input into seven common machine learning algorithms. The results indicated that the sparse similarity features outperformed dense ones in classifying performance. Also, the best classification accuracy was achieved using the K-Nearest Neighbors and Random Forrest algorithms. Overall, the findings suggest that semantic similarity measures can be used as additional proxy measures of learning, thereby enabling the real-time monitoring and evaluation of student understanding in LA contexts.
... However, literature reviews acknowledge a huge mismatch between the theoretical aspects of SRL and the design of the technologies (Pérez-Álvarez et al., 2018). Others also critically point out that many past initiatives focus on only a partial pedagogic support (for example only on learning dashboard, or implementing only self-assessment tasks), rather than utilizing a comprehensive support for SRL (Jivet et al., 2017;Radović et al., 2023aRadović et al., , 2023b. Placing emphasis on certain aspects of SRL process while ignoring others is insufficient and even hindering the learning process (for instance, praising the self-monitoring phase while neglecting the self-reflection phase) (Radović et al., 2023a(Radović et al., , 2023b. ...
... They have recently received significant attention in both scientific and practical domains (Bodily & Verbert, 2017;Viberg et al., 2018). These graphical interfaces with user-friendly views of data help students make informed decisions about their learning (Jivet et al., 2017;Schwendimann et al., 2017), which is especially important in the context of distance learning where students may have limited access to in-person support and teacher guidance (Gašević et al., 2015). LADs have the potential to be effective metacognitive tools for learners, promoting key self-regulatory behaviors. ...
... They can encourage students to plan their learning activities (F2), adopt clear goal orientations (F5), reflect on their learning processes (S1), and develop deeper insights into managing their time effectively (P3). This self-awareness can also help learners identify strengths and areas for improvement (S4), set more effective learning goals (F1), and develop different task strategies (P5) to enhance their academic performance (Jivet et al., 2017). Additionally, LA dashboards can foster a sense of accountability and motivation, as students can track their progress over time and see the tangible results of their efforts. ...
Article
Full-text available
The integration of advanced learning analytics and data-mining technology into higher education has brought various opportunities and challenges, particularly in enhancing students' self-regulated learning (SRL) skills. Analyzing developed features for SRL support, it has become evident that SRL support is not a binary concept but rather a continuum, ranging from limited to advanced levels of SRL support. This article introduces the rubric, designed to evaluate the degree of self-regulated learning support available within technology enhanced learning environments. Following rubric design best practices, we took a multifaceted methodological approach to ensure rubric validity and reliability: consulting Zimmerman's theoretical model, comparing technological features distilled from empirical studies that demonstrated significant effectiveness, consulting SRL experts, and iterative development and feedback. Across three phases of SRL the rubrics describe evaluation criteria and in detail define performance levels (Limited, Moderate and Advance). By employing the rubric, educators and researchers can 1) gain insights into the extent of implemented SRL approaches, 2) further develop SRL support of learning environments, and 3) better support students on their journey towards becoming self-regulated learners. Finally, the reliability analysis demonstrated a high degree of agreement among different raters evaluating the same course, indicating that the rubric is a reliable tool for obtaining relevant evaluations of SRL support in higher education. We conclude by discussing the significance of the rubric in promoting self-regulated learning within the current pedagogical and technological landscape.
... AI systems can augment teacher feedback by providing real-time data analytics and visualizations of student performance, exemplified by teacher dashboards that facilitate monitoring and assessment through visualizing relevant learner variables (Knoop-van Campen et al., 2023;Xhakaj et al., 2017). Likewise, learners can benefit from visual feedback tools, such as student-facing dashboards, which facilitate students' self-assessment by providing real-time overviews of individual or collaborative learning activities and outcomes (Breideband et al., 2023;Jivet et al., 2017;Long & Aleven, 2017). However, the effectiveness of both teacher-and student-facing dashboards depends on the dashboard usability and audience characteristics, as designing dashboards that provide accessible, relevant, and actionable information can be challenging, and users may struggle to translate insights into meaningful actions because of their knowledge and skills (Jivet et al., 2017;Matcha et al., 2020). ...
... Likewise, learners can benefit from visual feedback tools, such as student-facing dashboards, which facilitate students' self-assessment by providing real-time overviews of individual or collaborative learning activities and outcomes (Breideband et al., 2023;Jivet et al., 2017;Long & Aleven, 2017). However, the effectiveness of both teacher-and student-facing dashboards depends on the dashboard usability and audience characteristics, as designing dashboards that provide accessible, relevant, and actionable information can be challenging, and users may struggle to translate insights into meaningful actions because of their knowledge and skills (Jivet et al., 2017;Matcha et al., 2020). In contrast to visual feedback, instructional feedback provides students with verbal information about their performance, ranging from simple feedback on task performance to elaborate feedback that presents a formative assessment with suggestions for improvement (Narciss et al., 2014). ...
Article
Full-text available
Artificial intelligence (AI) holds significant potential for enhancing student learning. This reflection critically examines the promises and limitations of AI for cognitive learning processes and outcomes, drawing on empirical evidence and theoretical insights from research on AI-enhanced education and digital learning technologies. We critically discuss current publication trends in research on AI-enhanced learning and rather than assuming inherent benefits, we emphasize the role of instructional implementation and the need for systematic investigations that build on insights from existing research on the role of technology in instructional effectiveness. Building on this foundation, we introduce the ISAR model, which differentiates four types of AI effects on learning compared to learning conditions without AI, namely inversion, substitution, augmentation, and redefinition. Specifically, AI can substitute existing instructional approaches while maintaining equivalent instructional functionality, augment instruction by providing additional cognitive learning support, or redefine tasks to foster deep learning processes. However, the implementation of AI must avoid potential inversion effects, such as over-reliance leading to reduced cognitive engagement. Additionally, successful AI integration depends on moderating factors, including students’ AI literacy and educators’ technological and pedagogical skills. Our discussion underscores the need for a systematic and evidence-based approach to AI in education, advocating for rigorous research and informed adoption to maximize its potential while mitigating possible risks.
... Wise and Vytasek (2017) argue that an appropriate reference frame is required to determine the meaning of information in LADs. Reference frames are 'the comparison points which orient students' interpretation of analytics' (Wise et al. 2016, 170), and function as comparators to support data interpretation (Jivet et al. 2017). However, reference frames in LADs can also have negative effects. ...
Article
Full-text available
Background University students need to self‐regulate but are sometimes incapable of doing so. Learning Analytics Dashboards (LADs) can support students' appraisal of study behaviour, from which goals can be set and performed. However, it is unclear how goal‐setting and self‐motivation within self‐regulated learning elicits behaviour when using an LAD. Objectives This study's purpose is exploring reference frames’ influence on goal setting, LAD elements’ influence on student motivation, and the predictive value of goal setting and motivation on behaviour, adding to our understanding of the factors predicting task attainment and the role of reference frames. Methods In an experimental survey design, university students (n = 88) used an LAD with a peer reference frame (Condition 1) or without one (Condition 2), set a goal, determined goal difficulty, self‐assessed motivation and LAD elements' influence on motivation. Researchers coded goal specificity. Four weeks later, students self‐assessed task attainment, task satisfaction, time on task, and task frequency. T‐tests and MANOVA explored effects of the reference frame. Regression analyses determined predictive potential of goal difficulty, goal specificity, and motivation on goal attainment. Results and Conclusions Results showed no difference between conditions on goal specificity, difficulty, or motivation. The peer reference frame's perceived influence on motivation was small. LAD elements’ influence on motivation varied but were mainly positive. Regression models were not predictive, except the task satisfaction exploratory model. Most participants (77%) attained their goals. Reference frame integration should be carefully considered, given potential negative effects. Students may require educators’ support when setting goals, but the support should balance students’ autonomy.
Conference Paper
Lerndatenanalysen eröffnen in digital gestützten Settings weitreichende Möglichkeiten, Lernende individuell zu unterstützen. Eine datenbasierte Lernunterstützung erscheint insbesondere für die Förderung der Kompetenzen zum Selbstregulierten Lernen (SRL) relevant und zielführend. In diesem Bereich weisen Lernende häufig unterschiedliche Unterstützungsbedarfe auf, die es gezielt zu adressieren gilt. Im Rahmen einer Feldstudie erhielten Studierende (N = 77) während ihrer Teilnahme an einem Statistik-Modul, abhängig von der Ausprägung individueller Selbstregulationskompetenzen, eine Lernunterstützung auf metakognitiver oder motivationaler Ebene in Form von Prompts. Diese zielten primär auf die Entwicklung und Anwendung von Regulationsstrategien auf der jeweiligen SRL-Ebene ab und wurden als Selbstreflexionsfragen und Handlungsempfehlungen im Online-Kurs implementiert. Studierende, denen motivationale Prompts dargeboten wurden, verfügten am Ende ihrer Modulteilnahme über stärker ausgeprägte motivationale SRL-Kompetenzen als zum Modulbeginn. Ihre metakognitiven SRL-Kompetenzen blieben unverändert. Studierende, die metakognitive Prompts erhielten, verbesserten ihre metakognitiven SRL-Kompetenzen während der Modul-teilnehme nicht. Somit erwies sich die datenbasierte Unterstützung bezogen auf die SRL-Kompetenzentwicklung teilweise als wirksam.
Article
Full-text available
Η παρούσα εργασία εξετάζει τη χρήση της Επεξεργασίας Φυσικής Γλώσσας με στόχο την πρόβλεψη της επίδοσης των φοιτητών σε περιβάλλοντα ηλεκτρονικής μάθησης. Παρόλο που η πρόοδος στον τομέα αυτόν είναι ενθαρρυντική, λείπουν μέχρι σήμερα συστηματικές ερευνητικές προσπάθειες να διερευνηθεί πλήρως το δυναμικό της Επεξεργασίας Φυσικής Γλώσσας για την ανάλυση και πρόβλεψη της επίδοσης. Στην έρευνα αυτή συμμετείχαν 85 φοιτητές, οι οποίοι παρακολούθησαν 6 βιντεοδιαλέξεις, για κάθε μία εκ των οποίων κλήθηκαν να συντάξουν μια μικρή περίληψη. Από την επεξεργασία των περιλήψεων προέκυψαν δύο σύνολα μεταβλητών (πρωτογενούς και επεξεργασμένου κειμένου) που τροφοδότησαν οκτώ αλγόριθμους μηχανικής μάθησης. Στις περισσότερες βιντεοδιαλέξεις παρατηρήθηκε μια μικρή διαφοροποίηση τιμών των μέτρων ακρίβειας ταξινόμησης και F1 ενώ η λογιστική παλινδρόμηση (LR) ήταν ο αλγόριθμος που επέφερε τα υψηλότερα επίπεδα ακρίβειας ταξινόμησης τόσο για το πρωτογενές όσο και για το επεξεργασμένο κείμενο. Τα αποτελέσματα της ανάλυσης αναδεικνύουν ότι οι περιλήψεις που δημιουργήθηκαν από τους φοιτητές μπορούν να αποτελέσουν ένα υποσχόμενο σύνολο χαρακτηριστικών για την πρόβλεψη της επίδοσης μέσω βιντεοδιαλέξεων.
Article
In the last decade, Learning Analytics (LA) has evolved in a positive way, considering that the term emerged in 2011 through the Society for Learning Analytics Research (SoLAR). This area of data analytics can be identified as a specialization of Educational Data Mining (EDM). LA emphasizes student learning outcomes. In addition to, a better understanding of student learning behavior and processes. While EDM focuses on helping teachers and students with the analysis of the learning process using popular data mining methods. The purpose of this research is to explore the first decade of work with the application of Learning Analytics in Higher Education Institutions (HEI) in the context of Tutoring Information Systems (TIS), with the intention of supporting institutions, teachers and students to decrease dropout rates. This article presents a systematic literature review (SLR) with 17 primary studies, comprised between 2014 and 2024. The findings reflect the use of LA in improving or optimizing learning using student academic history obtained through Learning Management Systems (LMS), noting the scarcity of works with a focus on tutoring or academic advising. Ultimately, a gap is opened to apply LA in HEI, with information from Institutional Tutoring Program (PIT), integrated with information from an LMS, to contribute to student permanence.
Article
Full-text available
This paper presents a systematic literature review of the state of the art of research on learning dashboards in the fields of Learning Analytics and Educational Data Mining. Research on learning dashboards aims to identify what data is meaningful to different stakeholders and how data can be presented to support sense-making processes. Learning dashboards are becoming popular due to the increased use of educational technologies, such as Learning Management Systems (LMS) and Massive Open Online Courses (MOOCs). The initial search of five main academic databases and GScholar resulted in 346 papers, of which 55 were included in the final analysis. Our review distinguishes different kinds of research studies as well as various aspects of learning dashboards and their maturity regarding evaluation. As the research field is still relatively young, most studies are exploratory and proof-of-concept. The review concludes by offering a definition for learning dashboards and by outlining open issues and future lines of work in the area of learning dashboards. There is a need for longitudinal research in authentic settings and studies that systematically compare different dashboard designs.
Conference Paper
Full-text available
The affordances of learning analytics (LA) are being increasingly harnessed to enhance 21st century (21C) pedagogy and learning. Relatively rare, however, are use cases and empirically based understandings of students' actual experiences with LA tools and environments aimed at fostering 21C literacies, especially in secondary schooling and Asian education contexts. This paper addresses this knowledge gap by 1) presenting a first iteration design of a computer-supported collaborative critical reading and LA environment and its 16-week implementation in a Singapore high school; and 2) foregrounding students' quantitative and qualitative accounts of the benefits and problematics associated with this learning innovation. We focus the analytic lens on the LA dashboard components that provided visualizations of students' reading achievement, 21C learning dispositions, critical literacy competencies and social learning network positioning within the class. The paper aims to provide insights into the potentialities, paradoxes and pathways forward for designing LA that take into consideration the voices of learners as critical stakeholders.
Conference Paper
We conducted a literature review on systems that track learning analytics data (e.g., resource use, time spent, assessment data, etc.) and provide a report back to students in the form of visualizations, feedback, or recommendations. This review included a rigorous article search process; 945 articles were identified in the initial search. After filtering out articles that did not meet the inclusion criteria, 94 articles were included in the final analysis. Articles were coded on five categories chosen based on previous work done in this area: functionality, data sources, design analysis, perceived effects, and actual effects. The purpose of this review is to identify trends in the current student-facing learning analytics reporting system literature and provide recommendations for learning analytics researchers and practitioners for future work.
Conference Paper
Social comparison theory asserts that we establish our social and personal worth by comparing ourselves to others. In in-person learning environments, social comparison offers students critical feedback on how to behave and be successful. By contrast, online learning environments afford fewer social cues to facilitate social comparison. Can increased availability of such cues promote effective self-regulatory behavior and achievement in Massive Open Online Courses (MOOCs)? We developed a personalized feedback system that facilitates social comparison with previously successful learners based on an interactive visualization of multiple behavioral indicators. Across four randomized controlled trials in MOOCs (overall N = 33,726), we find: (1) the availability of social comparison cues significantly increases completion rates, (2) this type of feedback benefits highly educated learners, and (3) learners' cultural context plays a significant role in their course engagement and achievement.
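The core computation behind such comparison cues can be pictured as placing a learner's behavioral indicators against the distribution of previously successful learners. The Python sketch below does this with simple percentile ranks; the indicator names and values are hypothetical, not the system's actual metrics.

    # Toy reference data: indicator values observed for learners who
    # previously completed the course (all values are invented).
    successful = {
        "videos_watched": [4, 7, 9, 10, 12, 12, 14, 15, 16, 20],
        "quiz_attempts":  [1, 2, 2, 3, 3, 4, 5, 5, 6, 8],
    }
    learner = {"videos_watched": 8, "quiz_attempts": 2}

    for indicator, value in learner.items():
        ref = successful[indicator]
        pct = 100 * sum(v <= value for v in ref) / len(ref)
        print(f"{indicator}: at the {pct:.0f}th percentile of "
              f"previously successful learners")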
Article
Learning analytics are often formatted as visualisations developed from traced data collected as students study in online learning environments. Optimal analytics inform and motivate students' decisions about adaptations that improve their learning. We observe that designs for learning often neglect theories and empirical findings in learning science that explain how students learn. We present six learning analytics that reflect what is known in six areas (we call them cases) of theory and research findings in the learning sciences: Setting goals and monitoring progress, distributed practice, retrieval practice, prior knowledge for reading, comparative evaluation of writing, and collaborative learning. Our designs demonstrate learning analytics can be grounded in research on self-regulated learning and self-determination. We propose designs for learning analytics in general should guide students toward more effective self-regulated learning and promote motivation through perceptions of autonomy, competence, and relatedness.
Conference Paper
Learning Analytics (LA) dashboards help raise student and teacher awareness regarding learner activities. In blog-supported and inquiry-based learning courses, LA data is not limited to student activities, but also contains an abundance of digital learner artefacts, such as blog posts, hypotheses, and mind-maps. Exploring peer activities and artefacts can help students gain new insights and perspectives on learning efforts and outcomes, but requires effort. To help facilitate and promote this exploration, we present the lessons learnt during and guidelines derived from the design, deployment and evaluation of five dashboards.
Chapter
An educational dashboard is a display that visualizes the results of educational data mining in a useful way. Educational data mining and visualization techniques allow teachers and students to monitor and reflect on their online teaching and learning behavior patterns. Previous literature has included such information in dashboards to support students' self-knowledge, self-evaluation, self-motivation, and social awareness. Further, educational dashboards are expected to support the smart learning environment, in the sense that students receive personalized, automatically generated information in real time based on the log files of the Learning Management System (LMS). In this study, we reviewed ten case studies dealing with the development and evaluation of such tools for supporting students and teachers through educational data mining techniques and visualization technologies. A conceptual framework based on Few's principles of dashboard design and Kirkpatrick's four-level evaluation model was developed to review educational dashboards. Ultimately, this study is expected to evaluate the current state of educational dashboard development and to suggest an evaluative tool for judging whether a dashboard functions properly, both pedagogically and visually.