Awareness is not enough. Pitfalls of learning
analytics dashboards in the educational practice
Ioana Jivet1, Maren Scheffel1, Hendrik Drachsler2,3,1, and Marcus Specht1
1Open Universiteit, Valkenburgerweg 177, 6419AT Heerlen, NL
ioana.jivet@ou.nl, maren.scheffel@ou.nl, hendrik.drachsler@ou.nl,
marcus.specht@ou.nl
2Goethe University Frankfurt, Germany
3German Institute for International Educational Research (DIPF), Germany
drachsler@dipf.de
Abstract. It has been long argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state of the art in the field of learning dashboards, none have addressed the theoretical foundation that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from the learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the context in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
Keywords: learning dashboards, learning theory, learning analytics, sys-
tematic review, learning science, social comparison, competition
1 Introduction
Learning Analytics (LA) emerged from the need to harness the potential of the increasingly large data sets of learner data generated by the widespread use of online learning environments, and it has been defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [37]. Ferguson [13] identified two main challenges when it comes to learning analytics: (i) building strong connections to the learning sciences and (ii) focusing on the perspectives of learners.
There is a common notion in the LA community that learning analytics research should be deeply grounded in the learning sciences [28,29]. Suthers & Verbert [38] labelled LA the "middle space", as it lies at the intersection between technology and the learning sciences. Moreover, LA should be seen as an educational approach guided by pedagogy, and not the other way around [19]. However, there is a strong emphasis on the "analytics", i.e. the computation of the data and the creation of predictive models, and not so much on the "learning", i.e. applying and researching LA in learning contexts where student outcomes can be improved [17].
One focus of LA research is to empower teachers and learners to make informed decisions about the learning process, mainly by visualising the collected learner data through dashboards [9]. Learning analytics dashboards are "single displays that aggregate different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations" [35]. Dashboards have been developed for different stakeholder groups, including learners, teachers, researchers and administrative staff [35]. Charleer et al. [5] suggest that LA dashboards could be used as powerful metacognitive tools for learners, triggering them to reason about the effort invested in learning activities and about learning outcomes. However, a large majority of dashboards are still aimed at teachers, or at both teachers and learners [35]. Moreover, there has been very little research into the effects such tools have on learning [26].
As a first step towards building effective dashboards for learners, we need to understand how the learning sciences can be considered in the design and pedagogical use of learning dashboards. Following Suthers and Verbert's [38] position that learning analytics research should be explicit about the theory or conception of learning underlying the work, we set out to investigate which educational concepts constitute the theoretical foundation for the development of learning dashboards aimed at learners.
A number of previous works reviewed LA dashboards from different perspectives, including their design and evaluation. Verbert et al. [42] introduced a conceptual framework for analysing LA applications and reviewed 15 dashboards based on their target users, the displayed data and the focus of the evaluation. A follow-up review [43] extended this analysis to 24 dashboards, examining the context in which the dashboards had been deployed, the data sources, the devices used and the evaluation methodology. Yoo et al. [47] reviewed the design and evaluation of 10 educational dashboards for teachers and students using an evaluative tool they proposed based on Few's principles of dashboard design [15] and Kirkpatrick's four-level evaluation model [25]. A more recent systematic review by Schwendimann et al. [35] of 55 dashboards looked at the context in which dashboards had been deployed, their purpose, the displayed indicators, the technologies used, the maturity of the evaluation and open issues.
The scope of all these reviews included learning analytics dashboards regardless of their target users. Focusing on the challenges identified by Ferguson [13], we narrow our scope down to LA dashboards aimed at learners in order to focus on their perspective. A closely related work to this paper was published by Bodily & Verbert [4]. They provided a systematic review that focused exclusively on student-facing LA systems, including dashboards, educational recommender systems, educational data mining (EDM) systems, intelligent tutoring systems (ITS) and automated feedback systems. The systems were analysed based on functionality, data sources, design analysis, and perceived and actual effects on learners.
Although other works have looked into the learning-theory foundations of game-based learning [46], one major limitation of previous dashboard reviews is that none investigates the connection to the learning sciences. Moreover, [35] and [4] provide recommendations for the design of learner dashboards, but neither suggests the use of educational concepts as a basis for the design or evaluation of dashboards. Through this systematic literature review we aim to bridge this gap by investigating the relation between educational concepts and the design of learning dashboards. Dashboard design was previously examined by looking at the type of data displayed on the dashboard and the type of charts or visualisations used. In this study, however, we specifically focus on how the data presented on the dashboard is contextualised and framed to ease sense-making for learners.
Throughout this literature review, we explore how educational concepts are
integrated into the design of learning dashboards. Our study is guided by the
following research question: According to which educational concepts are learning
analytics dashboards designed?
2 Methodology
Prior to the systematic review, we conducted an informal literature search in order to get an overall picture of the field. We ran the systematic literature review following the PRISMA statement [31] and selected the following databases that contain research in the field of Technology Enhanced Learning: ACM Digital Library, IEEEXplore, SpringerLink, Science Direct, Wiley Online Library, Web of Science and EBSCOhost. Additionally, we included Google Scholar to cover any other sources, limiting the number of retrieved results to 200. We searched the selected databases using the following query: "learning analytics" AND (visualization OR visualisation OR dashboard OR widget). The first term narrows the search down to learning analytics, while the second part of the query covers the different terminologies used for this type of intervention, addressing one of the limitations identified in [35]. Although the scope of this review is limited to visualisations that have learners as end-users, it was not possible to articulate this criterion in relevant search terms. Therefore, our approach was to build a query that retrieves all dashboards, regardless of their target end-users, and to remove the ones that fall outside our scope in a later phase.
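The query construction itself is mechanical; the following minimal Python sketch (our own illustration, not tooling used in the review) assembles the boolean query from its two parts and records the per-database setup described above:

```python
# Illustrative sketch of the review's search setup; the databases and the
# boolean query are taken from the text, the code itself is hypothetical.
TERM_FIELD = '"learning analytics"'                      # narrows to the LA field
TERM_INTERVENTION = ["visualization", "visualisation",   # terminology variants
                     "dashboard", "widget"]

def build_query() -> str:
    """Compose the boolean query used across all databases."""
    variants = " OR ".join(TERM_INTERVENTION)
    return f"{TERM_FIELD} AND ({variants})"

DATABASES = ["ACM Digital Library", "IEEEXplore", "SpringerLink",
             "Science Direct", "Wiley Online Library", "Web of Science",
             "EBSCOhost", "Google Scholar"]
RESULT_CAP = {"Google Scholar": 200}   # only Google Scholar was capped

if __name__ == "__main__":
    print(build_query())
    # "learning analytics" AND (visualization OR visualisation OR dashboard OR widget)
```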
The queries were run on February 20th, 2017, collecting 1439 hits. Each result was screened for relevance, i.e. whether it described a learning dashboard aimed at learners, by examining the title and the abstract, reducing the list of potential candidate papers to 212. Eleven papers that we had come across during the informal search and that fit the scope of our survey were also added to the set of papers to be examined further. Next, we accessed the full text of each of these 223 studies in order to assess whether they were eligible for our study, considering the following criteria:
1. the paper’s full text is available in English;
2. the paper describes a fully developed dashboard, widget or visualisation, i.e.
we excluded theoretical papers, essays or literature reviews;
3. the target user group of the dashboard is learners;
4. the authors explicitly mention theoretical concepts for the design;
5. the paper includes an evaluation of the dashboard.
We identified 95 papers that satisfied the first three criteria. Only half of these papers have a theoretical grounding in educational concepts, suggesting a large gap between the learning sciences and this type of learning analytics intervention. The focus of this study is the 26 papers describing dashboards that both rely on educational concepts (criterion 4) and were empirically evaluated (criterion 5). The list of papers included in this review is available at bit.ly/LADashboards.
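This screening funnel can be expressed as a chain of filters. The sketch below is our own illustration: the record fields and predicate names are hypothetical stand-ins for the manual judgements, and the counts in the comments restate the numbers reported above.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    """Minimal stand-in for a screened record; flags mimic manual judgements."""
    full_text_in_english: bool       # criterion 1
    describes_developed_tool: bool   # criterion 2: excludes essays, reviews
    targets_learners: bool           # criterion 3
    cites_educational_concepts: bool # criterion 4
    includes_evaluation: bool        # criterion 5

def satisfies_scope(p: Paper) -> bool:
    # Criteria 1-3: yields the 95 in-scope dashboard papers.
    return (p.full_text_in_english and p.describes_developed_tool
            and p.targets_learners)

def included_in_review(p: Paper) -> bool:
    # Criteria 4-5 on top of scope: yields the 26 papers analysed here.
    return (satisfies_scope(p) and p.cites_educational_concepts
            and p.includes_evaluation)

# Funnel reported in the text: 1439 hits -> 212 after title/abstract screening
# -> +11 from the informal search -> 223 full texts -> 95 satisfy criteria 1-3
# -> 26 satisfy all five criteria.
```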
3 Results
We started this investigation by collecting the theoretical concepts and models used in the dashboards and analysing the relationships between the purpose of the dashboards and the concepts employed in their development. Next, we looked at how the design of these dashboards integrates different concepts from the learning sciences.
3.1 Learning theories and models
By analysing the introduction, background and dashboard design sections of
each of the papers included in this study, we identified 17 theories, models and
concepts that we bundled into six clusters (see Table 1).
The EC1: Cognitivism cluster relies on the cognitivist paradigm, which posits that learning is an internal process involving the use of memory, thinking, metacognition and reflection [1]. This is the most represented category through self-regulated learning (SRL), with 16 papers citing the works of Zimmerman [48], Pintrich [33] or Winne [44]. Deep vs surface learning theory explains different approaches to learning, where deep learners seek to understand the meaning behind the material and surface learners concentrate on reproducing the main facts [21]. The EC2: Constructivism cluster is rooted in the assumption that learners are information constructors and learning is the product of social interaction [1]. Social constructivist learning theory [24] and the Paul-Elder critical thinking model [11] have mostly been used in dashboards aimed at offering learner support in collaborative settings, while Engeström's activity theory [12] was used as a pedagogical basis for supporting university students in overcoming dyslexia. The EC3: Humanism cluster puts the learner at the centre of the learning process, seeking to engage the person as a whole and focusing on the study of the self, motivation and goals [8]. More recent works focus on developing 21st century skills [40] and learning dispositions [36]. Achievement goal orientation theory is concerned with learners' motivation for goal achievement [32]. The EC4: Descriptive models cluster includes the engagement model [16], which differentiates between behavioural, emotional and cognitive engagement. Several papers also cover the pedagogical use of dashboards, aligning the instructional design in which the dashboard is embedded (EC5: Instructional design) with Bloom's taxonomy [3], formative assessment [34] or the Universal Design for Learning framework [6]. While the majority of these clusters contain concepts belonging to the learning sciences, we also identified three concepts that originate in the broader field of EC6: Psychology: Ekman's model of emotions and facial expressions [10], social comparison [14] and culture [18,22].

Table 1. Educational concepts identified, grouped into six clusters, and the papers in which they appear. The list of papers included in this review is available at bit.ly/LADashboards.

Cluster                   | Educational concept                       | Freq. | Papers
EC1: Cognitivism          | Self-regulated learning                   | 16    | D1; D4; D5; D7; D9; D11; D12; D14; D15; D18; D20; D21; D22; D23; D25; D26
                          | Deep vs surface learning                  | 2     | D16; D19
EC2: Constructivism       | Collaborative learning                    | 6     | D12; D13; D14; D16; D24; D26
                          | Social constructivist learning theory     | 4     | D7; D13; D19; D22
                          | Engeström activity theory                 | 1     | D12
                          | Paul-Elder critical thinking model        | 1     | D19
EC3: Humanism             | Experiential learning                     | 2     | D4; D13
                          | Learning dispositions                     | 1     | D2
                          | 21st century skills                       | 4     | D2; D11; D13; D19
                          | Achievement goal orientation              | 3     | D15; D19; D24
EC4: Descriptive models   | Engagement model                          | 1     | D10
EC5: Instructional design | Universal Design for Learning framework   | 1     | D19
                          | Formative assessment                      | 3     | D3; D6; D19
                          | Bloom's taxonomy                          | 3     | D3; D4; D22
EC6: Psychology           | Ekman's model for emotion classification  | 1     | D23
                          | Social comparison                         | 3     | D8; D15; D25
                          | Culture                                   | 1     | D25
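For readers who wish to compute over this clustering, Table 1 can be restated as a simple mapping; the following Python dictionary is our own restatement of the table (the structure is illustrative, the frequencies are Table 1's):

```python
# Table 1 as a data structure: cluster -> {educational concept: frequency}.
CONCEPT_CLUSTERS = {
    "EC1: Cognitivism": {"Self-regulated learning": 16,
                         "Deep vs surface learning": 2},
    "EC2: Constructivism": {"Collaborative learning": 6,
                            "Social constructivist learning theory": 4,
                            "Engeström activity theory": 1,
                            "Paul-Elder critical thinking model": 1},
    "EC3: Humanism": {"Experiential learning": 2,
                      "Learning dispositions": 1,
                      "21st century skills": 4,
                      "Achievement goal orientation": 3},
    "EC4: Descriptive models": {"Engagement model": 1},
    "EC5: Instructional design": {"Universal Design for Learning framework": 1,
                                  "Formative assessment": 3,
                                  "Bloom's taxonomy": 3},
    "EC6: Psychology": {"Ekman's model for emotion classification": 1,
                        "Social comparison": 3,
                        "Culture": 1},
}

# Frequencies count papers per concept; a paper may cite several concepts.
for cluster, concepts in CONCEPT_CLUSTERS.items():
    print(f"{cluster}: {sum(concepts.values())} concept citations")
```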
3.2 Dashboard goals and educational concepts
In order to understand the reasons behind using these educational concepts, we analysed the goals of the dashboards and looked at how their use was explained in the papers. We extracted the goals of each dashboard and categorised them based on the competence they aimed to affect in learners: metacognitive, cognitive, behavioural or emotional (see Table 2). Most of the dashboards do not serve only one goal, but rather aim to catalyse changes in multiple competencies. A fifth category, C5: Self-regulation, was added to account for papers that explicitly described their goal as supporting self-regulation, a concept that involves all four competencies [48].

Table 2. Competences, the goals intended to affect each competence, and the papers in which they appear. The list of papers included in this review is available at bit.ly/LADashboards.

Competence          | Goal                              | Freq. | Papers
C1: Metacognitive   | Improve metacognitive skills      | 4     | D6; D7; D20; D23
                    | Support awareness and reflection  | 20    | D1; D2; D3; D4; D6; D7; D9; D10; D11; D12; D13; D14; D17; D18; D20; D21; D22; D23; D25; D26
                    | Monitor progress                  | 8     | D7; D8; D11; D15; D19; D20; D22; D23
                    | Support planning                  | 2     | D20; D22
C2: Cognitive       | Support goal achievement          | 3     | D9; D18; D25
                    | Improve performance               | 3     | D16; D23; D24
C3: Behavioural     | Improve retention or engagement   | 2     | D10; D25
                    | Improve online social behaviour   | 7     | D7; D13; D14; D16; D19; D24; D26
                    | Improve help-seeking behaviour    | 1     | D17
                    | Offer navigational support        | 2     | D8; D15
C4: Emotional       | Deactivate negative emotions      | 1     | D9
                    | Increase motivation               | 4     | D2; D8; D15; D19
C5: Self-regulation | Support self-regulation           | 13    | D1; D4; D7; D9; D11; D12; D15; D19; D20; D21; D22; D23; D25
Figure 1 illustrates the relation between the goals of the dashboards and the educational concept clusters listed in Table 1. Several interesting observations can be drawn from these connections. Firstly, most of the visualisations aim to influence learners' metacognitive competence, with the purpose of supporting awareness and reflection. This aim is often motivated by SRL theory, a learning concept that relies heavily on the assumption that actions are consequences of thinking, as SRL is achieved in cycles consisting of forethought, performance and self-reflection [49]. SRL also motivates the goals of monitoring progress and supporting planning, but to a lesser extent. Social constructivist learning theory and collaborative learning also appear quite frequently in relation to metacognition, owing to the collaborative settings in which the dashboards were used. Dashboard developers argue that for effective collaboration, learners need to be aware of their teammates' learning behaviour, activities and outcomes. Other concepts used for affecting the metacognitive level are formative assessment, as it implies the evaluation of one's performance; 21st century skills, with their focus on learning how to learn; and social comparison, as a means of framing the evaluation of one's performance.

Secondly, there is a strong emphasis on supporting the self-regulation competence by using cognitivist concepts. The design of these dashboards is usually informed by SRL theory. Constructivist concepts are also commonly used in the development of these dashboards because they were deployed in online collaborative learning settings. Instructional design concepts are used less often, the most notable being formative assessment as a means for reflection and self-evaluation.

Thirdly, in order to affect the behavioural level, SRL is again one of the most commonly used concepts, alongside social constructivism and collaborative learning. Social comparison has a stronger presence on this level, as it is used to reveal the behaviour of peers as a source of suggestions on how learners could improve. Surprisingly, very few dashboards aim to support learners on the cognitive level, i.e. acquiring knowledge and improving performance, and the few that do rely mostly on SRL and social comparison. Finally, in order to catalyse changes on the emotional level, dashboards build mostly on social comparison and the modelling of learning dispositions and 21st century skills.
Fig. 1. The competence level targeted by the dashboards included in the review in
relation to the educational concept clusters that were used as a theoretical basis for
their development.
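The matrix that Figure 1 visualises can be reproduced by cross-tabulating Tables 1 and 2: a link between a cluster and a goal exists for every paper that appears under both. The sketch below shows this computation on abbreviated excerpts of the two tables (the full paper lists are given above; the code itself is our own illustration):

```python
from itertools import product

# Abbreviated excerpts of Tables 1 and 2: concept cluster -> papers citing it,
# dashboard goal -> papers pursuing it. Full lists appear in the tables above.
CLUSTER_PAPERS = {
    "EC1: Cognitivism": {"D1", "D4", "D5", "D7", "D9"},        # excerpt
    "EC2: Constructivism": {"D7", "D12", "D13", "D14"},        # excerpt
}
GOAL_PAPERS = {
    "Support awareness and reflection": {"D1", "D4", "D7", "D13"},  # excerpt
    "Monitor progress": {"D7"},                                     # excerpt
}

# A cluster-goal link exists for every paper listed under both headings.
links = {(c, g): CLUSTER_PAPERS[c] & GOAL_PAPERS[g]
         for c, g in product(CLUSTER_PAPERS, GOAL_PAPERS)}

for (cluster, goal), papers in links.items():
    if papers:
        print(f"{cluster} -> {goal}: {len(papers)} paper(s) {sorted(papers)}")
```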
3.3 Reference frames
According to the framework for designing pedagogical interventions to support student use of learning analytics proposed by Wise [45], learners need a "representative reference frame" for interpreting their data. We analysed this aspect by looking at how the information on the dashboard was contextualised in relation to the dashboard goals. We identified three types of reference frames: i) social, i.e. comparison with peers, ii) achievement, i.e. comparison in terms of goal achievement, and iii) progress, i.e. comparison with an earlier self (see Table 3).

Apart from the origin of the reference frame, the three types are also characterised by where in time the anchor for comparison is set. The social reference frame focuses on the present, allowing learners to compare their current state to the performance levels of their peers at the same point in time. The achievement reference frame directs learners' attention to the future, outlining goals and a future state that learners aim for. Finally, the progress reference frame is anchored in the past, as learners use a past state as an anchor point to evaluate what they have achieved so far. In the following paragraphs we discuss each type in detail.
Social. The most common frame was showing learners their data in comparison to the whole class. We also identified cases where learners had access to the data of individual members of their working groups in collaborative learning settings. In other cases, learners compared themselves to previous graduates of the same course. In order to avoid the pitfalls of averages in heterogeneous groups, D22 allowed learners a more specific reference: peers with similar goals and knowledge. A few dashboards compared learners to the "top" students, while on some dashboards learners had the option to choose the group against which they compare themselves. On one dashboard, learners compared their self-assessment of group work performance with the assessment made by their peers. We also looked at how the data of the reference groups is aggregated. Most of the dashboards displayed averages (16 dashboards), while only six showed data of individuals and three presented a learner's ranking within the reference group.
Table 3. The reference frames for comparison and their frequency. The list of papers included in this review is available at bit.ly/LADashboards.

Type        | Reference frame          | Freq. | Papers
Social      | Class                    | 15    | D1; D3; D4; D5; D7; D8; D11; D15; D16; D18; D19; D21; D22; D23; D24
            | Teammates                | 2     | D14; D26
            | Previous graduates       | 2     | D21; D25
            | Top peers                | 4     | D8; D15; D16; D24
            | Peers with similar goals | 1     | D22
Achievement | Learning outcomes        | 15    | D2; D3; D4; D5; D6; D8; D9; D11; D12; D15; D16; D20; D21; D22; D24
            | Learner goals            | 1     | D22
Progress    | Self                     | 10    | D1; D2; D3; D4; D5; D10; D18; D23; D25; D26
Fig. 2. The competence level targeted by the dashboards in relation to the three refer-
ence frames identified: social (S: red), achievement (A: blue) and progress (P: yellow).
Achievement. The second way of framing the information displayed on the dashboard is in terms of the achievement of the learning activity. Here, we distinguish between two types of goals: i) learning outcomes set by the teachers and ii) learner goals set by the learners themselves. One purpose of presenting learners' performance in relation to learning outcomes was to illustrate the mastery of content and the acquisition of skills. Content mastery was expressed through the use of key concepts in forum discussions (D16, D24) and performance in quizzes covering different topics (D5, D8, D9, D15) or difficulty levels (D3). The acquisition of skills was quantified through the number of courses covering those skills in the curriculum objectives (D21), while learning dispositions were calculated from self-reported data collected through questionnaires (D2). A second purpose of using teacher-defined goals is to support learners in planning their learning by offering them a point of reference as to how much effort is required to complete a learning activity (D11). Concerning learner goals, our results were surprising. Only one dashboard allowed learners enough freedom to set their own goals: on D22, learners could establish their intended level of knowledge and time investment and follow their progress against those targets.
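Framing data against self-set targets, as D22 does, amounts to a simple computation. The sketch below is a hypothetical illustration of such an achievement frame; the names and structure are ours, not D22's implementation:

```python
from dataclasses import dataclass

@dataclass
class LearnerGoal:
    """A self-set target: intended knowledge level and time investment."""
    target_knowledge: float   # e.g. fraction of concepts to master, 0..1
    target_hours: float       # planned weekly time investment

def achievement_frame(current_knowledge: float, hours_spent: float,
                      goal: LearnerGoal) -> dict:
    """Frame the current state against the learner's own targets, not peers."""
    return {
        "knowledge_progress": min(current_knowledge / goal.target_knowledge, 1.0),
        "time_progress": min(hours_spent / goal.target_hours, 1.0),
    }

# Example: a learner aiming to master 80% of concepts with 6h/week invested.
print(achievement_frame(0.4, 3.0, LearnerGoal(0.8, 6.0)))
# {'knowledge_progress': 0.5, 'time_progress': 0.5}
```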
Progress. The third frame of reference concerns whether dashboards allow learners to visualise their progress over time by giving them access to their historical data. This functionality directly supports the "execution and monitoring" phase of the SRL cycle [48]. Our results show that only 10 dashboards offered this feature, while the rest displayed only the current status of the learners.
4 Discussion
Through this literature review, we seek to investigate the relation between the learning sciences and learning analytics by looking into which educational concepts inform the design of learning analytics dashboards aimed at learners. Our investigation revealed that only 26 out of the 95 dashboard designs identified by our search are grounded in the learning sciences and have been evaluated. This might indicate that the development of these tools is still driven by the need to leverage the available learning data rather than by a clear pedagogical focus on improving learning. The most common foundation for LA dashboard design is self-regulated learning theory, used frequently to motivate dashboard goals that aim to support awareness and trigger reflection. Two findings related to the use of SRL are striking.

Firstly, very few papers have a secondary goal besides fostering awareness and reflection. However, being aware does not imply that remedial actions are taken or that learning outcomes improve. Moreover, awareness and reflection are not concepts that can be measured objectively, which makes the evaluation of such dashboards questionable. According to McAlpine & Weston, reflection should be considered a mechanism through which learning and teaching can be improved, rather than an end in itself [30]. Thus, we argue that LA dashboards should be designed and evaluated as pedagogical tools that also catalyse changes in cognitive, behavioural or emotional competencies, and not only on the metacognitive level.
Secondly, since more than half of the analysed dashboards rely on SRL, we took a closer look at how the different phases of the self-regulation cycle are supported, i.e. forethought and planning, monitoring, and self-evaluation [49]. The investigation of the reference frames used on the dashboards revealed that there is little support for goal setting and planning, as almost no dashboard allowed learners to manage self-set goals. Moreover, tracking one's own progress over time was not a very common feature either. These two shortcomings suggest that current dashboards are built mostly to support the "reflection and self-evaluation" phase of SRL and neglect the others. This implies that, apart from a learning dashboard, online learning environments need to provide additional tools that enable learners to carry out all phases of the SRL cycle, supporting them in subsequent steps once awareness has been achieved. These findings emphasise the need to design LA dashboards as tools embedded in the instructional design, potentially solving problems related to the low uptake of LA dashboards [28].
Furthermore, our analysis revealed that social framing is more common than achievement framing. Comparison with peers is usually employed to motivate students to work harder and increase their engagement, sometimes by "inducing a feeling of being connected with and supported by their peers" [41]. Looking at the theoretical concepts that inform the design of the studied dashboards, only two theories would justify the use of comparison with peers: social comparison theory and achievement goal orientation theory.
Social comparison theory [14] states that we establish our self-worth by comparing ourselves to others when no objective means of comparison are available. However, empirical research in the face-to-face classroom has shown that comparison to self-selected peers who perform slightly better has a beneficial effect on middle school students' grades, whereas no effects were found when the gap in performance was bigger [23]. Despite the availability of such research, social comparison theory is rarely used to inform the design of dashboards. Only three works rationalise the use of comparison by grounding it in social comparison theory and validations of this theory in the educational sciences [7,20,27]. Moreover, learners usually got to see their data in comparison to the average of their peers. Averages are often misleading because they are skewed by the data of inactive learners and by the diversity of learning goals among learners, offering a misguided reference frame.
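This distortion is easy to reproduce with toy numbers: a handful of inactive enrollees drags the class mean far below the level of any committed learner, while the median over active peers remains representative. The following minimal sketch uses invented data purely for demonstration:

```python
from statistics import mean, median

# Synthetic weekly activity counts: five engaged learners, five inactive ones.
activity = [42, 38, 35, 30, 27, 0, 0, 0, 1, 2]

class_mean = mean(activity)                        # dragged down by inactives
active = [a for a in activity if a >= 5]           # crude activity threshold
active_median = median(active)

print(f"class average: {class_mean:.1f}")          # 17.5 -> flattering reference
print(f"median of active peers: {active_median}")  # 35 -> more honest reference
```

Against the class mean, a learner with 20 weekly activities would appear "above average" while in fact trailing every genuinely active peer.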
A second theory that might support the use of social comparison is achievement goal orientation theory. This theory distinguishes between mastery and performance orientations as the motivation behind why one engages in an achievement task [32]. In contrast to learners who set mastery goals and focus on learning the material and mastering the tasks, learners who have performance goals are more focused on demonstrating their ability by measuring their skill in comparison to others. We found few dashboards that contextualised the data in terms of goals achieved, while the majority used different groups of peers as a frame of reference. This finding suggests that the design of current dashboards appeals more to performance-oriented learners, neglecting learners with a tendency towards mastery. Indeed, as Beheshitha et al. [2] observed, learners who found the subject matter of the course more motivating than competition between students were more inclined to rate visualisations based on social comparison negatively. We found only one dashboard proposal that catered to the needs of learners with different achievement goal orientations: Mastery Grids [20] provides an open learner model on which mastery-oriented learners can monitor their progress, as well as social comparison features for performance-oriented learners.
The lack of support for goal achievement and the prevalence of comparison foster competition among learners. In the long term, there is the threat that, by constantly being exposed to motivational triggers that rely on social comparison, comparing oneself to peers and "being better than others" become the norm for what defines a successful learner. We argue that learning and education should be about mastering knowledge, acquiring skills and developing competencies. For this purpose, comparison should be used carefully in the design of learning dashboards, and research needs to investigate the effects of social comparison and competition in LA dashboards. More attention should be given to the different needs of learners, and dashboards should be used as pedagogical tools to motivate learners with different performance levels who respond differently to motivating factors. As Tan [39] envisioned, "differentiated instruction can become an experienced reality for students, with purposefully-designed LA serving to compress, rather than exacerbate, the learning and achievement gap between thriving and struggling students".
5 Conclusion
This paper presents the results of a systematic survey looking into the use of educational concepts in learning analytics dashboards for learners. Our main findings show, firstly, that self-regulated learning is the core theory informing the design of LA dashboards that aim to make learners aware of their learning process by visualising their data. However, merely making learners aware is not enough. Dashboards should have a broader purpose, using awareness and reflection as means to improve cognitive, behavioural or emotional competencies. Secondly, effective support for online learners who do not have well-developed SRL skills should also facilitate goal setting and planning, as well as monitoring and self-evaluation. As dashboards mostly aim to increase awareness and trigger self-reflection, different tools should complement dashboards and be seamlessly integrated into the learning environment and the instructional design. Thirdly, there is a strong emphasis on comparison with peers as opposed to using goal achievement as a reference frame, even though there is evidence in the educational sciences that disputes the benefits of fostering competition in learning. Our findings suggest that the design of LA dashboards needs better grounding in the learning sciences.

Finally, we see the need to investigate the effectiveness of using educational concepts in the design of LA dashboards by looking at how these tools were evaluated, what effects learners perceived and how learning was improved. Our study was limited by a narrow focus set within the LA field, a relatively young research area. Valuable proposals could also be found in related fields, e.g. educational data mining. We plan to answer these research questions in the future by extending this work.
References
1. Anderson, T.: The Theory and Practice of Online Learning. Athabasca University Press (2008)
2. Beheshitha, S.S., Hatala, M., Gašević, D., Joksimović, S.: The role of achievement goal orientations when studying effect of learning analytics visualizations. In: Proc. of LAK'16, pp. 54–63. ACM (2016)
3. Bloom, B., Krathwohl, D., Masia, B.: Bloom taxonomy of educational objectives. Allyn and Bacon, Boston, MA (1984)
4. Bodily, R., Verbert, K.: Trends and issues in student-facing learning analytics reporting systems research. In: Proc. of LAK'17, pp. 309–318. ACM (2017)
5. Charleer, S., Klerkx, J., Duval, E., De Laet, T., Verbert, K.: Creating effective learning analytics dashboards: Lessons learnt. In: Proc. of EC-TEL'16, pp. 42–56. Springer (2016)
6. Corey, M.L., Leinenbach, M.T.: Universal design for learning: Theory and practice. In: Proc. of the 2004 Society for Information Technology and Teacher Education Int. Conf., pp. 4919–4926 (2004)
7. Davis, D., Jivet, I., Kizilcec, R.F., Chen, G., Hauff, C., Houben, G.J.: Follow the successful crowd: Raising MOOC completion rates through social comparison at scale. In: Proc. of LAK'17, pp. 454–463. ACM (2017)
8. DeCarvalho, R.J.: The humanistic paradigm in education. The Humanistic Psychologist 19(1), 88 (1991)
9. Durall, E., Gros, B.: Learning analytics as a metacognitive tool. In: Proc. of CSEDU (1), pp. 380–384 (2014)
10. Ekman, P., Friesen, W.V.: Facial Action Coding System (1977)
11. Elder, L., Paul, R.: Critical thinking: Why we must transform our teaching. Journal of Developmental Education 18(1), 34 (1994)
12. Engeström, Y.: Expansive visibilization of work: An activity-theoretical perspective. Computer Supported Cooperative Work (CSCW) 8(1), 63–93 (1999)
13. Ferguson, R.: Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning 4(5-6), 304–317 (2012)
14. Festinger, L.: A theory of social comparison processes. Human Relations 7(2), 117–140 (1954)
15. Few, S.: Information Dashboard Design: Displaying Data for At-a-Glance Monitoring. Analytics Press (2013)
16. Fredricks, J.A., Blumenfeld, P.C., Paris, A.H.: School engagement: Potential of the concept, state of the evidence. Review of Educational Research 74(1), 59–109 (2004)
17. Gašević, D., Dawson, S., Siemens, G.: Let's not forget: Learning analytics are about learning. TechTrends 59(1), 64–71 (2015)
18. Gelfand, M.J., Raver, J.L., Nishii, L., Leslie, L.M., Lun, J., Lim, B.C., Duan, L., Almaliach, A., Ang, S., Arnadottir, J., et al.: Differences between tight and loose cultures: A 33-nation study. Science 332(6033), 1100–1104 (2011)
19. Greller, W., Drachsler, H.: Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society 15(3), 42–57 (2012)
20. Guerra, J., Hosseini, R., Somyurek, S., Brusilovsky, P.: An intelligent interface for learning content: Combining an open learner model and social comparison to support self-regulated learning and engagement. In: Proc. of IUI'16, pp. 152–163. ACM (2016)
21. Haggis, T.: Constructing images of ourselves? A critical investigation into 'approaches to learning' research in higher education. British Educational Research Journal 29(1), 89–104 (2003)
22. Hofstede, G.: Cultures and Organizations: Software of the Mind. Intercultural Cooperation and its Importance for Survival. McGraw-Hill, London (1991)
23. Huguet, P., Galvaing, M.P., Monteil, J.M., Dumas, F.: Social presence effects in the Stroop task: Further evidence for an attentional view of social facilitation. Journal of Personality and Social Psychology 77(5), 1011 (1999)
24. Kim, B.: Social constructivism. Emerging Perspectives on Learning, Teaching, and Technology 1(1), 16 (2001)
25. Kirkpatrick, D.L.: Evaluating Training Programs. Tata McGraw-Hill Education (1975)
26. Klerkx, J., Verbert, K., Duval, E.: Enhancing learning with visualization techniques. In: Handbook of Research on Educational Communications and Technology, pp. 791–807. Springer (2014)
27. Loboda, T.D., Guerra, J., Hosseini, R., Brusilovsky, P.: Mastery Grids: An open source social educational progress visualization. In: Proc. of EC-TEL'14, pp. 235–248. Springer (2014)
28. Lonn, S., Aguilar, S.J., Teasley, S.D.: Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Computers in Human Behavior 47, 90–97 (2015)
29. Marzouk, Z., Rakovic, M., Liaqat, A., Vytasek, J., Samadi, D., Stewart-Alonso, J., Ram, I., Woloshen, S., Winne, P.H., Nesbit, J.C.: What if learning analytics were based on learning science? Australasian Journal of Educational Technology 32(6) (2016)
30. McAlpine, L., Weston, C.: Reflection: Issues related to improving professors' teaching and students' learning. Instructional Science 28(5), 363–385 (2000)
31. Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., The PRISMA Group: Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine 6(7), e1000097 (2009)
32. Pintrich, P.R.: Multiple goals, multiple pathways: The role of goal orientation in learning and achievement. Journal of Educational Psychology 92(3), 544 (2000)
33. Pintrich, P.R., De Groot, E.V.: Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology 82(1), 33 (1990)
34. Sadler, D.R.: Formative assessment and the design of instructional systems. Instructional Science 18(2), 119–144 (1989)
35. Schwendimann, B., Rodriguez-Triana, M., Vozniuk, A., Prieto, L., Boroujeni, M., Holzer, A., Gillet, D., Dillenbourg, P.: Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies (2016)
36. Shum, S.B., Crick, R.D.: Learning dispositions and transferable competencies: Pedagogy, modelling and learning analytics. In: Proc. of LAK'12, pp. 92–101. ACM (2012)
37. Siemens, G., Gašević, D.: Guest editorial: Learning and knowledge analytics. Educational Technology & Society 15(3), 1–2 (2012)
38. Suthers, D., Verbert, K.: Learning analytics as a middle space. In: Proc. of LAK'13, pp. 1–4. ACM (2013)
39. Tan, J.P.L., Yang, S., Koh, E., Jonathan, C.: Fostering 21st century literacies through a collaborative critical reading and learning analytics environment: User-perceived benefits and problematics. In: Proc. of LAK'16, pp. 430–434. ACM (2016)
40. Trilling, B., Fadel, C.: 21st Century Skills: Learning for Life in Our Times. John Wiley & Sons (2009)
41. Venant, R., Vidal, P., Broisin, J.: Evaluation of learner performance during practical activities: An experimentation in computer education. In: Proc. of ICALT'16, pp. 237–241. IEEE (2016)
42. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
43. Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing 18(6), 1499–1514 (2014)
44. Winne, P.H., Zimmerman, B.J.: Self-regulated learning viewed from models of information processing. In: Self-Regulated Learning and Academic Achievement: Theoretical Perspectives, 2nd edn., pp. 153–189 (2001)
45. Wise, A.F.: Designing pedagogical interventions to support student use of learning analytics. In: Proc. of LAK'14, pp. 203–211. ACM (2014)
46. Wu, W.H., Hsiao, H.C., Wu, P.L., Lin, C.H., Huang, S.H.: Investigating the learning-theory foundations of game-based learning: A meta-analysis. Journal of Computer Assisted Learning 28(3), 265–279 (2012)
47. Yoo, Y., Lee, H., Jo, I.H., Park, Y.: Educational dashboards for smart learning: Review of case studies. In: Emerging Issues in Smart Learning, pp. 145–155. Springer (2015)
48. Zimmerman, B.J.: Self-regulated learning and academic achievement: An overview. Educational Psychologist 25(1), 3–17 (1990)
49. Zimmerman, B.J.: A social cognitive perspective. In: Boekaerts, M., Pintrich, P.R., Zeidner, M. (eds.) Handbook of Self-Regulation, pp. 695–716. Academic Press (2000)
PRE PRINT
... A tracking technology acts as a cybernetic system which collects information on specific indicators to regulate the decision-making process (Green & Welsh, 1988). This technology collects learners' interactions and feeds this information into their dashboards, to be displayed and compared with similar information collected from other learners (Jivet et al., 2017). Despite the benefits of these tracking dashboards, evidence shows that the learners do not use them. ...
... Harvey and Keyes (2019) also reported lower self-esteem among those students who (from information provided by the dashboard) believed their academic performance to be lower than their class average. Moreover, Jivet et al. (2017) showed that peer comparison could shift the students' focus from mastering knowledge and skills to comparing themselves against others. Some researchers, e.g., Toohey et al. (2019), have suggested that the peer comparison component in learning dashboards needs to be carefully designed to minimise its negative impact on students. ...
... This indicates that some learners engage in buying the application without sufficient reason. Viewing and copying the completed activities could develop learners' engagement to a deeper level because the learning content is meaningful and not a mere set of random tasks (Jivet et al., 2017). Wang et al. (2019) also reported that a higher level of difficulty in the online course is correlated with more rational herd behaviour. ...
Article
Full-text available
This research examines the effect of having a tracking technology in a learning management system (LMS) that reports the effect of perceiving other students’ interactions on a learner’s intention to keep using LMS in the future. The main underlying theory is herd behaviour theory which argues that crowd behaviour affects the perceptions of the observers. In this paper, we proposed and found that tracking technology will affect a learner’s perceptions of cognitive absorption and that perception of self-regulation from using an LMS. These perceptions are found to influence the learner’s intention to keep using the LMS in the future positively. This research developed a new tracking technology in response to weaknesses noted in the literature and validated by interviewing teachers. Its effects were tested on 151 university students taking a computer science module. This research contributes to knowledge by integrating herd behaviour theory into the design of LMS and offers a new perspective on learners’ interactions with educational technologies.
... In addition, there are concerns related to LA's weak theoretical and pedagogical rationale for learning (Tsai et al., 2020). LA holds the promise of renewing education in profound ways but to reach its full potential it must be connected to learning sciences and grounded in pedagogical reasonings, as pointed out by Gašević et al. (2017), Jivet et al., (2017Jivet et al., ( , 2018, Nunn et al. (2016) and Tsai et al. (2020). ...
... By making the process of learning and competence development more visible, LA was considered to promote students' self-awareness of their strengths and weaknesses, and to facilitate the taking of actions to meet possible development needs. However, this does not happen without reflection and understanding, as one of the study participants pointed out: Jivet et al. (2017) rightly argue that although utilization of LA data is often designed to foster awareness, being aware does not guarantee that necessary actions are taken to facilitate learning and intended outcomes are achieved. As they suggest, students should be encouraged to take subsequent steps, such as setting goals and tracking one's own progress. ...
... As pointed out by Sedrakyan et al. (2020), a student may want to spend more time on practicing a topic in which he/she lacks prior knowledge and spend less time on focusing on areas in which he/she has previous knowhow. However, there is very little evidence of emerging LA approaches that would support goal setting and planning in educational settings as current practice primarily focuses on triggering reflection and supporting awareness (Jivet et al., 2017). As emphasized by Jivet et al. (2017), an increasing emphasis should be placed on designing LA to guide students with different performance levels, needs and motivating factors. ...
Article
Full-text available
Higher education institutions are challenged to develop innovative educational solutions to meet the competence development requirements set by the emerging future. This qualitative case study aims to identify the future competences considered important for higher education students to acquire during their studies and how the development of these competences can be supported with learning analytics. Reflection on these issues is based on three dimensions (subject development, object, and social environment) of future competences. A special emphasis is placed on the views of 19 teaching professionals gathered from group interviews and analyzed through a qualitative content analysis. The findings indicate that subject development-related future competences, such as reflective competence, self-awareness and self-management, learning literacy, and personal agency and self-efficacy were strongly identified as necessary future competences. The potential of learning analytics to support their development was also widely recognized as it provides means to reflect on learning and competence development and increase one’s self-awareness of strengths and weaknesses. In addition, learning analytics was considered to promote goal-orientation, metacognition and learning to learn, active engagement as well as learning confidence. To deal with complex topics and tasks, students should also acquire object-related competences, such as changeability and digital competence. In addition, they need cooperation and communication competence as well as a developmental mindset to operate successfully in social environments. The use of learning analytics to support most of these object and social environment-related competences was considered promising as it enables the wide exploitation of digital tools and systems, the analysis and visualization of social interactions, and the formation of purposeful learning groups and communal development practices. However, concrete ways of applying learning analytics were largely unacknowledged. This study provides useful insights on the relationship of important future competences and learning analytics while expanding on previous research and conceptual modelling. The findings support professionals working at higher education institutions in facilitating successful conditions for the development of future competences and in advancing purposeful use of learning analytics.
... LA dashboards combine automated analysis techniques with interactive visualisations for effective understanding, reasoning, and decision-making based on large, complex datasets on student activity (Schwendimann et al., 2016;Jivet et al., 2017). Teachers can use the insights gained from these dashboards as tools for evaluating and reflecting on their teaching practice (Keim et al., 2008), and track students' social and cognitive progress (Van Leeuwen et al., 2015;Bakharia & Dawson, 2011). ...
... teachers) (Dollinger et al., 2019). Second, dashboards are only minimally aligned with learning theory (Gasevic et al., 2016), which makes it difficult to choose the nature of the data to collect and visualise to teachers (Jivet et al., 2017). This means that more work is needed to design LA dashboards grounded within the learning sciences, with the hope of increasing their relevance to teachers' pedagogical needs. ...
Article
Full-text available
Despite the potential of learning analytics (LA) to support teachers’ everyday practice, its adoption has not been fully embraced due to the limited involvement of teachers as co-designers of LA systems and interventions. This is the focus of the study described in this paper. Following a design-based research (DBR) approach and guided by concepts from the socio-cultural perspective and human-computer interaction (HCI), we design, test, and evaluate a teacher-facing LA dashboard, the Canvas Discussion Analytics Dashboard (CADA), in real educational settings. The goal of this dashboard is to support teachers’ roles in online environments through insights into students’ participation and discourse patterns. We evaluate CADA through 10 in-depth interviews with university teachers to examine their experiences using CADA in seven blended undergraduate and graduate courses over a one-year period. The findings suggest that engaging teachers throughout the analytics tool design process and giving them control/agency over LA tools can favour their adoption in practice. Additionally, the alignment of dashboard metrics with relevant theoretical constructs allows teachers to monitor the learning designs and make course design changes on the fly. The teachers in this study emphasise the need for LA dashboards to provide actionable insights by moving beyond what things are towards how things should b e. This study has several contributions. First, we make an artefact contribution (e.g. CADA), an LA dashboard to support teachers with insights into students’ online discussions. Second, by leveraging theory, and working with the teachers to develop and implement a dashboard in authentic teaching environments, we make an empirical, theoretical and methodological contribution to the field of learning analytics and technology enhanced learning. We synthesise these through practical design and implementation considerations for researchers, dashboard developers, and higher education institutions.
... From SDT theory learners feel intrinsically motivated when they gain competence, autonomy, and relatedness. From these two theories a good feedback module should choose a good frame of reference thoughtfully, i.e. allows comparison with peers, achievement and progress [30]. From these options learners should be given an opportunity to choose their preferred option. ...
Conference Paper
Full-text available
School readiness predicts both school and life success, so measuring it effectively is extremely important. Current school readiness tests focus on pre-academic skills; however, mastery motivation (MM: persistent, focus on trying to do a task, and executive functions (EF; planful self-control) are also crucial. The purpose of this paper is to provide an overview of the chronological development of Finding Out Children’s Unique Strengths (FOCUS App) which was initially designed for Hungarian and US cultures. FOCUS App is a game-like tablet-based assessment of two Approaches to Learning (ATL) domains, Mastery Motivation (MM) and Executive Functions (EF) and pre-academic skills for 3-8-year-old children. To measure these competencies, FOCUS has a total of seven tasks. Tasks 1 and 2 assess two pre-academic skills. Tasks 3-5 are letter and number search tasks designed to assess MM, operationalised as the child’s persistence during moderately challenging tasks. Tasks 6 and 7 are primarily designed to assess EFs and provide MM measures based on a modified version of the dimensional card change sorting tasks (DCCS). Task 3-7 assess Approaches to Learning. The development of the FOCUS App started with face to face testing using paper-based prototypes designed to simulate the computer tablet screens. These pilot tests were successful in assessing number and letter search but not emotions. The level of difficulty was improved, and computer-based tasks were developed. The first version of FOCUS was released in 2017 for testing, piloting and evaluation. This version was web-based, developed on the .net platform and collected data on pre-academic skills, MM and EF. This version was in English and Hungarian. Due to internet connectivity challenges in schools, the second version was released in 2018, which was offline, tablet-based, and built on the Android platform. This version collected video recordings that enable the assessment of participants’ task persistence and emotional expressions. Two other adaptations have been designed, the Hebrew and Kenyan versions. We are now designing a feedback module that will provide real-time results to the user during school readiness tests. FOCUS App has several advantages over other similar apps. First, it assesses more than one domain. Second, it has adopted the Evidence Centred Design which provides evidence for tasks undertaken by children. Third, it is computer game-based to make it fun for the children and to automate the process of data collection using a narrator that gives instructions. Moreover, the tasks are individualised, thus increasing the ability of the app to offer individualised interventions. Lastly, unlike other tools that have adopted the Dimensional Change Card Sort task (DCCS) the FOCUS app does not use reaction time to measure difficulty at higher levels.
... Suthers and Verbert [4] stated that the learning analytics research should investigate which educational concepts constitute the theoretical foundation for developing student-facing learning dashboards. Existing learning dashboards also need to be evaluated for the effectiveness of the design and level of engagement with the learners [5]. Designing a learning dashboard that draws from previous studies allows us to identify the features that are most important to learners as well as the dashboards' limitations. ...
Chapter
Online education is gaining popularity because of its flexibility, accessibility , multimedia usage, and many other benefits that students enjoy. However, online learners, particularly in self-paced online learning (SPOL) courses, confront some inherent learning barriers, such as low learning awareness and engagement , lack of academic intervention, and lack of motivation. These barriers may negatively affect learners' progress and academic performance in SPOL, especially in Science, Technology, Engineering, and Mathematics (STEM) courses, which require a more collaborative, supportive, and engaging environment for the learners to become successful. Recently, various artificial intelligence (AI) technologies have shown great potential in removing such barriers. In this chapter, we propose a methodology of intelligent learning dashboard focus-ing on SPOL and discuss three aspects: how to construct mechanisms for adap-tive formative assessment and student engagement detection with the state-of-the-art AI techniques, how to design and integrate these technologies in intelligent learning dashboards, and how to include these mechanisms in the course learning design loop to ensure data collection and pedagogical connection.
... In Figures 1-5, we show some visualizations of LAD's. Jivet et al. (2017) also pointed out that awareness alone is not enough: LAD's need to build on the awareness created to improve cognitive, behavioral or emotional competencies, guide learners who lack self-regulated learning skills, and should be complementary to other learning tools. In this work, we build on these ideas to evaluate how education blockchain visualizations are currently being implemented. ...
Article
Full-text available
The use of blockchain in education has become one of the trending topics in education technology research. However, only a handful of education blockchain solutions have provided a measure of the impact on students' learning outcomes, teaching, or administrative processes. This work reviews how academic data stored on the blockchain is being visualized across various education blockchain solutions. We argue that education's uniqueness requires a different visualization approach that supports students' learning activities, advances teaching methods, and facilitates administrative procedures. We identify a consistent trend where most of the proposed education blockchain solutions focus on credentials collection and do not provide a way to make sense of the blockchain's data. Thus, we conducted a needs analysis by interviewing four teachers to understand essential features when accessing distributed academic data, report these results and use them to inform the features of our proposed visualizations. Our unique contributions include: presenting typical use cases of distributed learning records from multiple education institutes and demonstrating how past learning records of students stored on the blockchain can be visualized to support current learning. We also propose a method of visualization to increase the data awareness of information owners through the blockchain. ARTICLE HISTORY
Article
Hybrid systems combining artificial and human intelligence hold great promise for training human skills. In this paper, I position the concept of Hybrid Human-AI Regulation and illustrate it with a first prototype of a Hybrid Human-AI Regulation (HHAIR) system. HHAIR supports self-regulated learning (SRL) in the context of adaptive learning technologies (ALTs), with the aim of developing learners' self-regulated learning skills. This prototype targets young learners (10–14 years), for whom SRL skills are critical in today's society. Many of these learners use ALTs to learn mathematics and languages every day in school. ALTs optimize learning based on learners' performance data, but even the most sophisticated ALTs fail to support SRL; in fact, most ALTs take over (offload) regulation from learners. In contrast, HHAIR positions regulation as a collaborative task of the learner and the AI that is gradually transferred from AI-regulation to self-regulation: learners increasingly regulate their own learning as they progress through different degrees of hybrid regulation. In this way, HHAIR supports optimized learning as well as the development and transfer of SRL skills for lifelong learning (future learning). The HHAIR concept is novel in proposing a hybrid intelligence approach that trains human SRL skills with AI. This paper outlines theoretical foundations from SRL theory, hybrid intelligence, and learning analytics, describes a first prototype in the context of ALTs for young learners as an example of hybrid human-AI regulation, and discusses future advancement. In this way, foundational theoretical, empirical, and design work are combined in articulating the concept of Hybrid Human-AI Regulation, which features forward-looking adaptive support for SRL and a transfer of control over regulation between human and AI.
Article
Clickstream data have been used increasingly to present students in online courses with analytics about their learning process to support self-regulation. Drawing on self-regulated learning theory and attribution theory, we hypothesize that providing students with analytics on their own effort along with the effort and performance of relevant peers will help students attribute their performance to factors under their control and thus positively influence their subsequent behavior and performance. To test the effect of the analytics and verify the proposed mechanism, we conducted an experiment in an online undergraduate course in which students were randomly assigned to receive theoretically inert questions (control condition), attribution questions (active control condition), and the analytics with attribution questions (treatment condition). The intervention significantly increased effort attribution, reduced ability attribution, and improved subsequent effort for a subgroup of students who self-reported low performance, although there was no significant impact on their performance.
Article
This paper presents a systematic literature review of the state of the art of research on learning dashboards in the fields of Learning Analytics and Educational Data Mining. Research on learning dashboards aims to identify what data is meaningful to different stakeholders and how data can be presented to support sense-making processes. Learning dashboards are becoming popular due to the increased use of educational technologies, such as Learning Management Systems (LMS) and Massive Open Online Courses (MOOCs). The initial search of five main academic databases and GScholar resulted in 346 papers, of which 55 were included in the final analysis. Our review distinguishes different kinds of research studies as well as various aspects of learning dashboards and their maturity regarding evaluation. As the research field is still relatively young, most studies are exploratory, proof-of-concept work. The review concludes by offering a definition for learning dashboards and by outlining open issues and future lines of work in the area. There is a need for longitudinal research in authentic settings and for studies that systematically compare different dashboard designs.
Conference Paper
The affordances of learning analytics (LA) are being increasingly harnessed to enhance 21st-century (21C) pedagogy and learning. Relatively rare, however, are use cases and empirically based understandings of students' actual experiences with LA tools and environments for fostering 21C literacies, especially in secondary schooling and Asian education contexts. This paper addresses this knowledge gap by (1) presenting a first-iteration design of a computer-supported collaborative critical reading and LA environment, along with its 16-week implementation in a Singapore high school, and (2) foregrounding students' quantitative and qualitative accounts of the benefits and problematics associated with this learning innovation. We focus the analytic lens on the LA dashboard components that provided visualizations of students' reading achievement, 21C learning dispositions, critical literacy competencies, and social learning network positioning within the class. The paper aims to provide insights into the potentialities, paradoxes and pathways forward for designing LA that takes into consideration the voices of learners as critical stakeholders.
Conference Paper
We conducted a literature review of systems that track learning analytics data (e.g., resource use, time spent, assessment data) and provide a report back to students in the form of visualizations, feedback, or recommendations. The review included a rigorous article search process: 945 articles were identified in the initial search, and after filtering out articles that did not meet the inclusion criteria, 94 articles were included in the final analysis. Articles were coded on five categories chosen based on previous work in this area: functionality, data sources, design analysis, perceived effects, and actual effects. The purpose of this review is to identify trends in the current literature on student-facing learning analytics reporting systems and to provide recommendations for learning analytics researchers and practitioners for future work.
Conference Paper
Social comparison theory asserts that we establish our social and personal worth by comparing ourselves to others. In in-person learning environments, social comparison offers students critical feedback on how to behave and be successful. By contrast, online learning environments afford fewer social cues to facilitate social comparison. Can increased availability of such cues promote effective self-regulatory behavior and achievement in Massive Open Online Courses (MOOCs)? We developed a personalized feedback system that facilitates social comparison with previously successful learners based on an interactive visualization of multiple behavioral indicators. Across four randomized controlled trials in MOOCs (overall N = 33,726), we find: (1) the availability of social comparison cues significantly increases completion rates, (2) this type of feedback benefits highly educated learners, and (3) learners' cultural context plays a significant role in their course engagement and achievement.
Article
Learning analytics are often formatted as visualisations developed from trace data collected as students study in online learning environments. Optimal analytics inform and motivate students' decisions about adaptations that improve their learning. We observe that designs for learning analytics often neglect theories and empirical findings in learning science that explain how students learn. We present six learning analytics that reflect what is known in six areas (we call them cases) of theory and research findings in the learning sciences: setting goals and monitoring progress, distributed practice, retrieval practice, prior knowledge for reading, comparative evaluation of writing, and collaborative learning. Our designs demonstrate that learning analytics can be grounded in research on self-regulated learning and self-determination. We propose that designs for learning analytics in general should guide students toward more effective self-regulated learning and promote motivation through perceptions of autonomy, competence, and relatedness.
Conference Paper
Learning Analytics (LA) dashboards help raise student and teacher awareness of learner activities. In blog-supported and inquiry-based learning courses, LA data is not limited to student activities, but also contains an abundance of digital learner artefacts, such as blog posts, hypotheses, and mind-maps. Exploring peer activities and artefacts can help students gain new insights and perspectives on learning efforts and outcomes, but requires effort. To help facilitate and promote this exploration, we present the lessons learnt during, and the guidelines derived from, the design, deployment and evaluation of five dashboards.
Chapter
An educational dashboard is a display that visualizes the results of educational data mining in a useful way. Educational data mining and visualization techniques allow teachers and students to monitor and reflect on their online teaching and learning behavior patterns. Previous literature has included such information in dashboards to support students' self-knowledge, self-evaluation, self-motivation, and social awareness. Further, educational dashboards are expected to support smart learning environments, in that students receive personalized, automatically generated information in real time, based on the log files of the Learning Management System (LMS). In this study, we reviewed ten case studies that deal with the development and evaluation of such tools for supporting students and teachers through educational data mining techniques and visualization technologies. A conceptual framework based on Few's principles of dashboard design and Kirkpatrick's four-level evaluation model was developed to review the educational dashboards. Ultimately, this study is expected to evaluate the current state of educational dashboard development and to suggest an evaluative tool for judging whether or not a dashboard functions properly, in both a pedagogical and a visual way.