Awareness is not enough. Pitfalls of learning
analytics dashboards in the educational practice
Ioana Jivet1, Maren Scheffel1, Hendrik Drachsler2,3,1, and Marcus Specht1
1Open Universiteit, Valkenburgerweg 177, 6419AT Heerlen, NL
ioana.jivet@ou.nl, maren.scheffel@ou.nl, hendrik.drachsler@ou.nl,
marcus.specht@ou.nl
2Goethe University Frankfurt, Germany
3German Institute for International Educational Research (DIPF), Germany
drachsler@dipf.de
Abstract. It has long been argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state-of-the-art in the field of learning dashboards, none have addressed the theoretical foundation that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from the learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the contexts in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
Keywords: learning dashboards, learning theory, learning analytics, systematic review, learning science, social comparison, competition
1 Introduction
Learning Analytics (LA) emerged from the need to harness the potential of the increasingly large sets of learner data generated by the widespread use of online learning environments. It has been defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [37]. Ferguson [13] identified two main challenges for learning analytics: (i) building strong connections to the learning sciences and (ii) focusing on the perspectives of learners.
There is a common notion in the LA community that learning analytics research should be deeply grounded in the learning sciences [28, 29]. Suthers & Verbert [38] labelled LA the "middle space" as it lies at the intersection between technology and the learning sciences. Moreover, LA should be seen as an educational approach guided by pedagogy and not the other way around [19]. However, there is a strong emphasis on the "analytics", i.e. the computation of data and the creation of predictive models, and not so much on the "learning", i.e. applying and researching LA in learning contexts where student outcomes can be improved [17].
One focus of LA research is to empower teachers and learners to make informed decisions about the learning process, mainly by visualising the collected learner data through dashboards [9]. Learning analytics dashboards are "single displays that aggregate different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations" [35]. Dashboards have been developed for different stakeholder groups, including learners, teachers, researchers and administrative staff [35]. Charleer et al. [5] suggest that LA dashboards could serve as powerful metacognitive tools for learners, triggering them to reason about the effort invested in learning activities and about learning outcomes. However, a large majority of dashboards are still aimed at teachers, or at both teachers and learners [35]. Moreover, there has been very little research into what effects such tools have on learning [26].
As a first step towards building effective dashboards for learners, we need to understand how the learning sciences can be considered in the design and pedagogical use of learning dashboards. Following Suthers and Verbert's [38] position that learning analytics research should be explicit about the theory or conception of learning underlying the work, we set out to investigate which educational concepts constitute the theoretical foundation for the development of learning dashboards aimed at learners.
A number of previous works have reviewed LA dashboards from different perspectives, including their design and evaluation. Verbert et al. [42] introduced a conceptual framework for analysing LA applications and reviewed 15 dashboards based on their target users, the displayed data and the focus of the evaluation. A follow-up review [43] extended this analysis to 24 dashboards, examining the context in which the dashboards had been deployed, the data sources, the devices used and the evaluation methodology. Yoo et al. [47] reviewed the design and evaluation of 10 educational dashboards for teachers and students through their proposed evaluative tool based on Few's principles of dashboard design [15] and Kirkpatrick's four-level evaluation model [25]. A more recent systematic review of 55 dashboards by Schwendimann et al. [35] looked at the context in which the dashboards had been deployed, their purpose, the displayed indicators, the technologies used, the maturity of the evaluation and open issues.
The scope of all these reviews included learning analytics dashboards regardless of their target users. Focusing on the challenges identified by Ferguson [13], we narrow our scope to LA dashboards aimed at learners in order to focus on their perspective. A closely related work was published by Bodily & Verbert [4]: a systematic review that focused exclusively on student-facing LA systems, including dashboards, educational recommender systems, educational data mining (EDM) systems, intelligent tutoring systems (ITS) and automated feedback systems. The systems were
analysed based on functionality, data sources, design analysis, perceived effects
on learners and actual effects.
Although other works have looked into the learning theory foundations of, for example, game-based learning [46], one major limitation of previous dashboard reviews is that none investigates the connection to the learning sciences. Moreover, [35] and [4] provide recommendations for the design of learner dashboards, but neither suggests the use of educational concepts as a basis for the design or evaluation of dashboards. Through this systematic literature review we aim to bridge this gap by investigating the relation between educational concepts and the design of learning dashboards. Dashboard design was previously examined by looking at the type of data displayed on the dashboard and the types of charts or visualisations used. In this study, however, we focus specifically on how the data presented on the dashboard is contextualised and framed to ease sense-making for learners.
Throughout this literature review, we explore how educational concepts are
integrated into the design of learning dashboards. Our study is guided by the
following research question: According to which educational concepts are learning
analytics dashboards designed?
2 Methodology
Prior to the systematic review, we conducted an informal literature search in order to get an overall picture of the field. We then ran the systematic literature review following the PRISMA statement [31] and selected the following databases, which contain research in the field of Technology Enhanced Learning: ACM Digital Library, IEEEXplore, SpringerLink, Science Direct, Wiley Online Library, Web of Science and EBSCOhost. Additionally, we included Google Scholar to cover any other sources, limiting the number of retrieved results to 200. We searched the selected databases using the following query: "learning analytics" AND (visualization OR visualisation OR dashboard OR widget). The first term narrows the search field to learning analytics, while the second part of the query covers the different terminologies used for this type of intervention, addressing one of the limitations identified in [35]. Although the scope of this review is limited to visualisations that have learners as end-users, it was not possible to articulate this criterion in relevant search terms. We therefore built a query that retrieves all dashboards, regardless of their target end-users, and removed the ones that fell out of our scope in a later phase.
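As an illustration, the query construction can be summarised in a few lines. The sketch below (in Python, with constant and function names of our own choosing, not drawn from any actual tooling) assembles the boolean string that was submitted to each database.

```python
# Minimal sketch of the query used in this review; all names here are our
# own illustration, not part of any database API.
MANDATORY_TERM = '"learning analytics"'
TERMINOLOGY_VARIANTS = ["visualization", "visualisation", "dashboard", "widget"]

DATABASES = [
    "ACM Digital Library", "IEEEXplore", "SpringerLink", "Science Direct",
    "Wiley Online Library", "Web of Science", "EBSCOhost", "Google Scholar",
]

def build_query() -> str:
    """Narrow the field with the mandatory term, then cover the
    terminology variants used for this type of intervention."""
    return f"{MANDATORY_TERM} AND ({' OR '.join(TERMINOLOGY_VARIANTS)})"

for database in DATABASES:
    # Google Scholar results were capped at the first 200 hits.
    cap = " (first 200 hits)" if database == "Google Scholar" else ""
    print(f"{database}: {build_query()}{cap}")
```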
The queries were run on February 20th, 2017, collecting 1439 hits. Each result was screened for relevance, i.e. whether it described a learning dashboard aimed at learners, by examining the title and the abstract, reducing the list of potential candidate papers to 212. Eleven papers that we had come across during the informal search and that fit the scope of our survey were also added to the set. Next, we accessed the full text of each of these 223 studies in order to assess whether they were eligible, considering the following criteria:
1. the paper’s full text is available in English;
2. the paper describes a fully developed dashboard, widget or visualisation, i.e.
we excluded theoretical papers, essays or literature reviews;
3. the target user group of the dashboard is learners;
4. the authors explicitly mention theoretical concepts for the design;
5. the paper includes an evaluation of the dashboard.
We identified 95 papers that satisfied the first three criteria. Only half of these papers have theoretical grounding in educational concepts, suggesting a large gap between the learning sciences and this type of learning analytics intervention. The focus of this study is the set of 26 papers describing dashboards that both rely on educational concepts (criterion 4) and were empirically evaluated (criterion 5). The list of papers included in this review is available at bit.ly/LADashboards.
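The five criteria can be read as a two-stage filter. The sketch below makes the staging explicit; the `Paper` record and its field names are our own modelling assumptions, not an artefact of the actual review process.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    full_text_in_english: bool           # criterion 1
    fully_developed_dashboard: bool      # criterion 2 (no theory papers, essays, reviews)
    targets_learners: bool               # criterion 3
    mentions_educational_concepts: bool  # criterion 4
    includes_evaluation: bool            # criterion 5

def in_scope(p: Paper) -> bool:
    """Criteria 1-3: yields the broader pool of 95 papers."""
    return (p.full_text_in_english
            and p.fully_developed_dashboard
            and p.targets_learners)

def included(p: Paper) -> bool:
    """All five criteria: yields the 26 papers analysed in this review."""
    return (in_scope(p)
            and p.mentions_educational_concepts
            and p.includes_evaluation)
```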
3 Results
We started this investigation by collecting the theoretical concepts and models used in the dashboards and analysing the relationships between the purposes of the dashboards and the concepts employed in their development. Next, we looked at how the design of these dashboards integrates different concepts from the learning sciences.
3.1 Learning theories and models
By analysing the introduction, background and dashboard design sections of
each of the papers included in this study, we identified 17 theories, models and
concepts that we bundled into six clusters (see Table 1).
The EC1: Cognitivism cluster relies on the cognitivist paradigm, which posits that learning is an internal process involving the use of memory, thinking, metacognition and reflection [1]. This is the most represented category through self-regulated learning (SRL), with 16 papers citing the works of Zimmerman [48], Pintrich [33] or Winne [44]. Deep vs surface learning theory explains different approaches to learning, where deep learners seek to understand the meaning behind the material and surface learners concentrate on reproducing the main facts [21]. The EC2: Constructivism cluster is rooted in the assumption that learners are information constructors and that learning is the product of social interaction [1]. Social constructivist learning theory [24] and the Paul-Elder critical thinking model [11] have been used mostly in dashboards aimed at offering learner support in collaborative settings, while Engeström's activity theory [12] was used as a pedagogical basis for supporting university students in overcoming dyslexia. The EC3: Humanism cluster puts the learner at the centre of the learning process, seeking to engage the person as a whole and focusing on the study of the self, motivation and goals [8].
Table 1. The six clusters of educational concepts identified and the papers in which they appear. The list of papers included in this review is available at bit.ly/LADashboards

Cluster | Educational concept | Freq. | Papers
EC1: Cognitivism | Self-regulated learning | 16 | D1; D4; D5; D7; D9; D11; D12; D14; D15; D18; D20; D21; D22; D23; D25; D26
EC1: Cognitivism | Deep vs surface learning | 2 | D16; D19
EC2: Constructivism | Collaborative learning | 6 | D12; D13; D14; D16; D24; D26
EC2: Constructivism | Social constructivist learning theory | 4 | D7; D13; D19; D22
EC2: Constructivism | Engeström's activity theory | 1 | D12
EC2: Constructivism | Paul-Elder critical thinking model | 1 | D19
EC3: Humanism | Experiential learning | 2 | D4; D13
EC3: Humanism | Learning dispositions | 1 | D2
EC3: Humanism | 21st century skills | 4 | D2; D11; D13; D19
EC3: Humanism | Achievement goal orientation | 3 | D15; D19; D24
EC4: Descriptive models | Engagement model | 1 | D10
EC5: Instructional design | Universal Design for Learning instructional framework | 1 | D19
EC5: Instructional design | Formative assessment | 3 | D3; D6; D19
EC5: Instructional design | Bloom's taxonomy | 3 | D3; D4; D22
EC6: Psychology | Ekman's model for emotion classification | 1 | D23
EC6: Psychology | Social comparison | 3 | D8; D15; D25
EC6: Psychology | Culture | 1 | D25
More recent works focus on developing 21st century skills [40] and learning dispositions [36]. Achievement goal orientation theory is concerned with learners' motivation for goal achievement [32]. The EC4: Descriptive models cluster includes the engagement model [16], which differentiates between behavioural, emotional and cognitive engagement. Several papers also cover the pedagogical use of dashboards, aligning the instructional design in which the dashboard is embedded (EC5: Instructional design) with Bloom's taxonomy [3], formative assessment [34] or the Universal Design for Learning framework [6]. While the majority of these clusters contain concepts belonging to the learning sciences, we also identified three concepts that originate in the broader field of psychology (EC6: Psychology): Ekman's model of emotions and facial expressions [10], social comparison [14] and culture [18, 22].
3.2 Dashboard goals and educational concepts
In order to understand the reasons behind using these educational concepts, we analysed the goals of the dashboards and looked at how their use was explained in the papers. We extracted the goals of each dashboard and categorised them based on the competence they aimed to affect in learners: metacognitive, cognitive, behavioural or emotional (see Table 2).
Table 2. Competencies, the goals intended to affect each competence, and the papers in which they appear. The list of papers included in this review is available at bit.ly/LADashboards

Competence | Goal | Freq. | Papers
C1: Metacognitive | Improve metacognitive skills | 4 | D6; D7; D20; D23
C1: Metacognitive | Support awareness and reflection | 20 | D1; D2; D3; D4; D6; D7; D9; D10; D11; D12; D13; D14; D17; D18; D20; D21; D22; D23; D25; D26
C1: Metacognitive | Monitor progress | 8 | D7; D8; D11; D15; D19; D20; D22; D23
C1: Metacognitive | Support planning | 2 | D20; D22
C2: Cognitive | Support goal achievement | 3 | D9; D18; D25
C2: Cognitive | Improve performance | 3 | D16; D23; D24
C3: Behavioural | Improve retention or engagement | 2 | D10; D25
C3: Behavioural | Improve online social behaviour | 7 | D7; D13; D14; D16; D19; D24; D26
C3: Behavioural | Improve help-seeking behaviour | 1 | D17
C3: Behavioural | Offer navigational support | 2 | D8; D15
C4: Emotional | Deactivate negative emotions | 1 | D9
C4: Emotional | Increase motivation | 4 | D2; D8; D15; D19
C5: Self-regulation | Support self-regulation | 13 | D1; D4; D7; D9; D11; D12; D15; D19; D20; D21; D22; D23; D25
Most of the dashboards do not serve a single goal, but rather aim to catalyse changes in multiple competencies. A fifth category, C5: Self-regulation, was added to account for papers that explicitly described their goal as supporting self-regulation, a concept that involves all four competencies [48].
Figure 1 illustrates the relation between the goals of the dashboards and the educational concept clusters listed in Table 1. We can draw some interesting observations from these connections. Firstly, the largest share of the visualisations aims to influence learners' metacognitive competence, with the purpose of supporting awareness and reflection. This aim is often motivated by SRL theory, a concept that relies heavily on the assumption that actions are consequences of thinking, as SRL is achieved in cycles consisting of forethought, performance and self-reflection [49]. SRL also motivates the goals of monitoring progress and supporting planning, but to a lesser extent. Social constructivist learning theory and collaborative learning also appear quite frequently in relation to metacognition, due to the collaborative settings in which the dashboards were used. Dashboard developers argue that for effective collaboration, learners need to be aware of their teammates' learning behaviour, activities and outcomes. Other concepts used for affecting the metacognitive level are formative assessment, as it implies the evaluation of one's performance; 21st century skills, with their focus on learning how to learn; and social comparison, as a means of framing the evaluation of one's performance.
Secondly, there is a strong emphasis on supporting the self-regulation competence by using cognitivist concepts. The design of these dashboards is usually informed by SRL theory. Constructivist concepts are also commonly used for the development of these dashboards because the context in which they were deployed is the online collaborative learning setting. Instructional design concepts are used less often, the most notable being formative assessment as a means for reflection and self-evaluation.
Thirdly, in order to affect the behavioural level, SRL is again one of the most commonly used concepts, alongside social constructivism and collaborative learning. Social comparison has a stronger presence on this level, as it is used to reveal the behaviour of peers as a source of suggestions on how learners could improve. Surprisingly, very few dashboards aim to support learners on the cognitive level, i.e. acquiring knowledge and improving performance, and the few that do rely mostly on SRL and social comparison. Finally, in order to trigger changes on the emotional level, dashboards build mostly on social comparison and on the modelling of learning dispositions and 21st century skills.
Fig. 1. The competence level targeted by the dashboards included in the review in
relation to the educational concept clusters that were used as a theoretical basis for
their development.
3.3 Reference frames
According to the framework for designing pedagogical interventions to support student use of learning analytics proposed by Wise [45], learners need a "representative reference frame" for interpreting their data. We analysed this aspect by looking at how the information on the dashboard was contextualised in light of the dashboard goals. We identified three types of reference frames: i) social, i.e. comparison with peers, ii) achievement, i.e. comparison in terms of goal achievement, and iii) progress, i.e. comparison with an earlier self (see Table 3).
Apart from the origin of the reference frame, the three types are also characterised by where in time the anchor for comparison is set. The social reference frame focuses on the present, allowing learners to compare their current state to the performance levels of their peers at the same point in time. The achievement reference frame directs learners' attention to the future, outlining goals and a future state that learners aim for. Finally, the progress reference frame is anchored in the past, as learners use a past state as an anchor point to evaluate what they have achieved so far. In the following paragraphs we discuss each type in detail.
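To make the time anchoring concrete before discussing each type, the following sketch models the three frames as a configuration choice for a single indicator; the enum and function names are hypothetical, not drawn from any reviewed dashboard.

```python
from enum import Enum

class ReferenceFrame(Enum):
    SOCIAL = "peers"           # anchored in the present
    ACHIEVEMENT = "goal"       # anchored in the future
    PROGRESS = "earlier self"  # anchored in the past

def contextualise(value: float, frame: ReferenceFrame, anchor: float) -> str:
    """Frame a raw indicator value against the chosen anchor point."""
    if frame is ReferenceFrame.SOCIAL:
        return f"{value - anchor:+.1f} compared to your peers right now"
    if frame is ReferenceFrame.ACHIEVEMENT:
        return f"{value / anchor:.0%} of your target reached"
    return f"{value - anchor:+.1f} since your last snapshot"

# The same value reads very differently depending on the frame:
print(contextualise(7.5, ReferenceFrame.SOCIAL, anchor=8.2))        # -0.7 ...
print(contextualise(7.5, ReferenceFrame.ACHIEVEMENT, anchor=10.0))  # 75% ...
print(contextualise(7.5, ReferenceFrame.PROGRESS, anchor=6.0))      # +1.5 ...
```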
Social The most common frame was showing learners their data in comparison to the whole class. We also identified cases where learners had access to the data of individual members of their working groups in collaborative learning settings. In other cases, learners compared themselves to previous graduates of the same course. In order to avoid the pitfalls of averages in heterogeneous groups, D22 allowed learners a more specific reference: peers with similar goals and knowledge. A few dashboards compared learners to the "top" students, while on some dashboards learners had the option to choose against which group they compare themselves. On one dashboard, learners compared their self-assessment of group work performance with the assessment made by their peers. We also looked at how the data of the reference groups is aggregated. Most of the dashboards displayed averages (16 dashboards), while only six showed data of individuals and three presented a learner's ranking within the reference group.
Table 3. The reference frames for comparison and their frequency. The list of papers included in this review is available at bit.ly/LADashboards

Type | Reference frame | Freq. | Papers
Social | Class | 15 | D1; D3; D4; D5; D7; D8; D11; D15; D16; D18; D19; D21; D22; D23; D24
Social | Teammates | 2 | D14; D26
Social | Previous graduates | 2 | D21; D25
Social | Top peers | 4 | D8; D15; D16; D24
Social | Peers with similar goals | 1 | D22
Achievement | Learning outcomes | 15 | D2; D3; D4; D5; D6; D8; D9; D11; D12; D15; D16; D20; D21; D22; D24
Achievement | Learner goals | 1 | D22
Progress | Self | 10 | D1; D2; D3; D4; D5; D10; D18; D23; D25; D26
Fig. 2. The competence level targeted by the dashboards in relation to the three reference frames identified: social (S: red), achievement (A: blue) and progress (P: yellow).
Achievement The second way of framing the information displayed on the dashboard is in terms of the achievement of the learning activity. Here, we distinguish between two types of goals: i) learning outcomes set by teachers and ii) learner goals set by the learners themselves. One purpose of presenting learners' performance in relation to learning outcomes was to illustrate mastery and skill achievement. Content mastery was expressed through the use of key concepts in forum discussions (D16, D24), performance in quizzes covering topics (D5, D8, D9, D15) or different difficulty levels (D3). The acquisition of skills was quantified through the number of courses covering those skills in the curriculum objectives (D21), while learning dispositions were calculated from self-reported data collected through questionnaires (D2). A second purpose for using teacher-defined goals is to support learners in planning their learning by offering them a point of reference as to how much effort is required for the completion of a learning activity (D11). Concerning learner goals, our results were surprising. Only one dashboard allowed learners enough freedom to set their own goals: on D22, learners could set their intended level of knowledge and time investment and follow their progress against these targets.
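A minimal sketch of such learner-set targets is given below; the attribute names are our own assumptions and do not reflect D22's actual data model.

```python
from dataclasses import dataclass

@dataclass
class LearnerGoal:
    target_knowledge: float  # e.g. fraction of concepts to master, 0..1
    target_hours: float      # planned time investment

    def progress(self, current_knowledge: float, hours_spent: float) -> dict:
        """Express the current state relative to the learner's own targets,
        capped at 100% per dimension."""
        return {
            "knowledge": min(current_knowledge / self.target_knowledge, 1.0),
            "time": min(hours_spent / self.target_hours, 1.0),
        }

goal = LearnerGoal(target_knowledge=0.8, target_hours=6.0)
print(goal.progress(current_knowledge=0.6, hours_spent=4.5))
# {'knowledge': 0.75, 'time': 0.75}
```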
Progress The third frame of reference refers to whether dashboards allow
learners to visualise their progress over time, by having access to their historical
data. This functionality directly supports the “execution and monitoring” phase
of the SRL cycle [48]. Our results show that only 10 dashboards offered this
feature, while the rest displayed only the current status of the learners.
4 Discussion
Through this literature review, we sought to investigate the relation between the learning sciences and learning analytics by looking into which educational concepts inform the design of learning analytics dashboards aimed at learners. Our investigation revealed that only 26 out of the 95 dashboard designs identified by our search are grounded in the learning sciences and have been evaluated. This might indicate that the development of these tools is still driven by the need to leverage the available learning data, rather than by a clear pedagogical focus on improving learning. The most common foundation for LA dashboard design is self-regulated learning theory, used frequently to motivate dashboard goals that aim to support awareness and trigger reflection. Two findings related to the use of SRL are striking.
Firstly, very few papers have a secondary goal besides fostering awareness and reflection. However, being aware does not imply that remedial actions are taken or that learning outcomes improve. Moreover, awareness and reflection are not concepts that can be measured objectively, making the evaluation of such dashboards questionable. According to McAlpine & Weston, reflection should be considered a mechanism through which learning and teaching can be improved, rather than an end in itself [30]. Thus, we argue that LA dashboards should be designed and evaluated as pedagogical tools that also catalyse changes in the cognitive, behavioural or emotional competencies, and not only on the metacognitive level.
Secondly, since more than half of the analysed dashboards rely on SRL, we took a closer look at how the different phases of the self-regulation cycle are supported, i.e. forethought and planning, monitoring, and self-evaluation [49]. The investigation of the reference frames used on the dashboards revealed that there is little support for goal setting and planning, as almost no dashboard allowed learners to manage self-set goals. Moreover, tracking one's own progress over time was also not a very common feature. These two shortcomings suggest that current dashboards are built mostly to support the "reflection and self-evaluation" phase of SRL and neglect the others. This implies that, apart from a learning dashboard, online learning environments need to provide additional tools that enable learners to carry out all the phases of the SRL cycle, supporting them in subsequent steps once awareness has been achieved. These findings emphasise the need to design LA dashboards as tools embedded in the instructional design, potentially addressing problems related to the low uptake of LA dashboards [28].
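This audit can be phrased as a simple coverage check. The phase-to-feature mapping below is our own illustration of the argument, with a hypothetical feature vocabulary, not a validated instrument.

```python
# Which dashboard features support which SRL phases [49]; the feature
# names here are hypothetical examples.
SRL_PHASES = {
    "forethought and planning": {"self-set goals", "study planner"},
    "monitoring": {"progress over time", "activity history"},
    "reflection and self-evaluation": {"peer comparison", "outcome summary"},
}

def uncovered_phases(features):
    """Return the SRL phases for which a dashboard offers no feature."""
    return [phase for phase, needed in SRL_PHASES.items()
            if not needed & features]

# A typical dashboard from this review offers comparison views only:
print(uncovered_phases({"peer comparison", "outcome summary"}))
# ['forethought and planning', 'monitoring']
```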
Furthermore, our analysis revealed that social framing is more common than achievement framing. Comparison with peers is usually used in order to motivate students to work harder and increase their engagement, sometimes by "inducing a feeling of being connected with and supported by their peers" [41]. When looking at the theoretical concepts that inform the design of the studied dashboards, only two theories would justify the use of comparison with peers: social comparison theory and achievement goal orientation theory.
Social comparison theory [14] states that we establish our self-worth by comparing ourselves to others when there are no objective means of comparison. However, empirical research in the face-to-face classroom has shown that comparison to self-selected peers who perform slightly better has a beneficial effect on middle school students' grades, whereas no effects were found when there was a larger gap in performance [23]. Despite the availability of such research, social comparison theory is rarely used to inform the design of dashboards: only three works rationalise the use of comparison by grounding it in social comparison theory and validations of this theory in the educational sciences [7, 20, 27]. Moreover, learners usually saw their data in comparison to the average of their peers. Averages are often misleading, as they are skewed by the data of inactive learners and by the diversity of learning goals among learners, offering a misguided reference frame.
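A worked example with invented numbers illustrates this skew:

```python
# Illustrative scores only; four inactive learners are still in the data set.
scores = [0, 0, 0, 0, 55, 60, 70, 80, 85]

class_mean = sum(scores) / len(scores)   # ~38.9
active = [s for s in scores if s > 0]
active_mean = sum(active) / len(active)  # 70.0

# A learner scoring 55 sits comfortably above the class mean, yet below
# every other active peer: the choice of reference frame changes the message.
print(f"class mean: {class_mean:.1f}, active mean: {active_mean:.1f}")
```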
A second theory that might support the use of social comparison is achievement goal orientation theory. This theory distinguishes between mastery and performance orientations as the motivation behind why one engages in an achievement task [32]. In contrast to learners who set mastery goals and focus on learning the material and mastering the tasks, learners who have performance goals are more focused on demonstrating their ability by measuring their skill in comparison to others. We found few dashboards that contextualised the data in terms of goals achieved, while the majority used different groups of peers as a frame of reference. This finding suggests that the design of current dashboards appeals more to performance-oriented learners, neglecting learners who have a tendency towards mastery. Indeed, as Beheshitha et al. [2] observed, learners who considered the subject matter of the course more motivating than competition between students were more inclined to rate the visualisation based on social comparison negatively. We found only one dashboard proposal that catered to the needs of learners with different achievement goal orientations: Mastery Grids [20] provides an open learner model on which mastery-oriented learners can monitor their progress, as well as social comparison features for performance-oriented learners.
The lack of support for goal achievement and the prevalence of comparison foster competition among learners. In the long term, there is a risk that, by constantly being exposed to motivational triggers that rely on social comparison, comparison to peers and "being better than others" become the norm for what defines a successful learner. We argue that learning and education should be about mastering knowledge, acquiring skills and developing competencies. For this purpose, comparison should be used carefully in the design of learning dashboards, and research needs to investigate the effects of social comparison and competition in LA dashboards. More attention should be given to the different needs of learners, and dashboards should be used as pedagogical tools to motivate learners with different performance levels who respond differently to motivating factors. As Tan [39] envisioned, "differentiated instruction can become an experienced reality for students, with purposefully-designed LA serving to compress, rather than exacerbate, the learning and achievement gap between thriving and struggling students".
5 Conclusion
This paper presents the results of a systematic survey looking into the use of educational concepts in learning analytics dashboards for learners. Our main findings show that, firstly, self-regulated learning is the core theory informing the design of LA dashboards that aim to make learners aware of their learning process by visualising their data. However, just making learners aware is not enough. Dashboards should have a broader purpose, using awareness and reflection as means to improve cognitive, behavioural or emotional competencies. Secondly, effective support for online learners who do not have well-developed SRL skills should also facilitate goal setting, planning, monitoring and self-evaluation. As dashboards mostly aim to increase awareness and trigger self-reflection, different tools should complement dashboards and be seamlessly integrated into the learning environment and the instructional design. Thirdly, there is a strong emphasis on comparison with peers as opposed to using goal achievement as a reference frame, even though there is evidence in the educational sciences that questions the benefits of fostering competition in learning. Our findings suggest that the design of LA dashboards needs better grounding in the learning sciences.
Finally, we see the need to investigate the effectiveness of using educational concepts in the design of LA dashboards by looking at how these tools were evaluated, what effects learners perceive and how learning is improved. Our study was limited by its narrow focus within the LA field, a relatively recent research area; valuable proposals could also be found in related fields, e.g. educational data mining. We plan to answer these research questions in the future by extending this work.
References
1. Anderson, T.: The theory and practice of online learning. Athabasca University Press (2008)
2. Beheshitha, S.S., Hatala, M., Gašević, D., Joksimović, S.: The role of achievement goal orientations when studying effect of learning analytics visualizations. In: Proc. of LAK'16. pp. 54–63. ACM (2016)
3. Bloom, B., Krathwohl, D., Masia, B.: Bloom taxonomy of educational objectives. Allyn and Bacon, Boston, MA. Copyright by Pearson Education (1984)
4. Bodily, R., Verbert, K.: Trends and issues in student-facing learning analytics reporting systems research. In: Proc. of LAK'17. pp. 309–318. ACM (2017)
5. Charleer, S., Klerkx, J., Duval, E., De Laet, T., Verbert, K.: Creating effective learning analytics dashboards: Lessons learnt. In: Proc. of EC-TEL'16. pp. 42–56. Springer (2016)
6. Corey, M.L., Leinenbach, M.T.: Universal design for learning: Theory and practice. In: Proc. of 2004 Society for Information Technology and Teacher Education Int. Conf. pp. 4919–4926 (2004)
7. Davis, D., Jivet, I., Kizilcec, R.F., Chen, G., Hauff, C., Houben, G.J.: Follow the successful crowd: raising MOOC completion rates through social comparison at scale. In: Proc. of LAK'17. pp. 454–463. ACM (2017)
8. DeCarvalho, R.J.: The humanistic paradigm in education. The Humanistic Psychologist 19(1), 88 (1991)
9. Durall, E., Gros, B.: Learning analytics as a metacognitive tool. In: CSEDU (1). pp. 380–384 (2014)
10. Ekman, P., Friesen, W.V.: Facial action coding system (1977)
11. Elder, L., Paul, R.: Critical thinking: Why we must transform our teaching. Journal of Developmental Education 18(1), 34 (1994)
12. Engeström, Y.: Expansive visibilization of work: An activity-theoretical perspective. Computer Supported Cooperative Work (CSCW) 8(1), 63–93 (1999)
13. Ferguson, R.: Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning 4(5-6), 304–317 (2012)
14. Festinger, L.: A theory of social comparison processes. Human Relations 7(2), 117–140 (1954)
15. Few, S.: Information Dashboard Design: Displaying data for at-a-glance monitoring. Analytics Press (2013)
16. Fredricks, J.A., Blumenfeld, P.C., Paris, A.H.: School engagement: Potential of the concept, state of the evidence. Review of Educational Research 74(1), 59–109 (2004)
17. Gašević, D., Dawson, S., Siemens, G.: Let's not forget: Learning analytics are about learning. TechTrends 59(1), 64–71 (2015)
18. Gelfand, M.J., Raver, J.L., Nishii, L., Leslie, L.M., Lun, J., Lim, B.C., Duan, L., Almaliach, A., Ang, S., Arnadottir, J., et al.: Differences between tight and loose cultures: A 33-nation study. Science 332(6033), 1100–1104 (2011)
19. Greller, W., Drachsler, H.: Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society 15(3), 42–57 (2012)
20. Guerra, J., Hosseini, R., Somyurek, S., Brusilovsky, P.: An intelligent interface for learning content: Combining an open learner model and social comparison to support self-regulated learning and engagement. In: Proc. of IUI'16. pp. 152–163. ACM (2016)
21. Haggis, T.: Constructing images of ourselves? A critical investigation into 'approaches to learning' research in higher education. British Educational Research Journal 29(1), 89–104 (2003)
22. Hofstede, G.: Cultures and Organizations: Software of the Mind. Intercultural Cooperation and its Importance for Survival. London: McGraw-Hill (1991)
23. Huguet, P., Galvaing, M.P., Monteil, J.M., Dumas, F.: Social presence effects in the Stroop task: further evidence for an attentional view of social facilitation. Journal of Personality and Social Psychology 77(5), 1011 (1999)
24. Kim, B.: Social constructivism. Emerging perspectives on learning, teaching, and technology 1(1), 16 (2001)
25. Kirkpatrick, D.L.: Evaluating training programs. Tata McGraw-Hill Education (1975)
26. Klerkx, J., Verbert, K., Duval, E.: Enhancing learning with visualization techniques. In: Handbook of research on educational communications and technology, pp. 791–807. Springer (2014)
27. Loboda, T.D., Guerra, J., Hosseini, R., Brusilovsky, P.: Mastery grids: An open source social educational progress visualization. In: Proc. of EC-TEL'14. pp. 235–248. Springer (2014)
28. Lonn, S., Aguilar, S.J., Teasley, S.D.: Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Computers in Human Behavior 47, 90–97 (2015)
29. Marzouk, Z., Rakovic, M., Liaqat, A., Vytasek, J., Samadi, D., Stewart-Alonso, J., Ram, I., Woloshen, S., Winne, P.H., Nesbit, J.C.: What if learning analytics were based on learning science? Australasian Journal of Educational Technology 32(6) (2016)
30. McAlpine, L., Weston, C.: Reflection: Issues related to improving professors' teaching and students' learning. Instructional Science 28(5), 363–385 (2000)
31. Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., The PRISMA Group: Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine 6(7), e1000097 (2009)
32. Pintrich, P.R.: Multiple goals, multiple pathways: The role of goal orientation in learning and achievement. Journal of Educational Psychology 92(3), 544 (2000)
33. Pintrich, P.R., De Groot, E.V.: Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology 82(1), 33 (1990)
34. Sadler, D.R.: Formative assessment and the design of instructional systems. Instructional Science 18(2), 119–144 (1989)
35. Schwendimann, B., Rodriguez-Triana, M., Vozniuk, A., Prieto, L., Boroujeni, M., Holzer, A., Gillet, D., Dillenbourg, P.: Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies (2016)
36. Shum, S.B., Crick, R.D.: Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. In: Proc. of LAK'12. pp. 92–101. ACM (2012)
37. Siemens, G., Gašević, D.: Guest editorial: Learning and knowledge analytics. Educational Technology & Society 15(3), 1–2 (2012)
38. Suthers, D., Verbert, K.: Learning analytics as a middle space. In: Proc. of LAK'13. pp. 1–4. ACM (2013)
39. Tan, J.P.L., Yang, S., Koh, E., Jonathan, C.: Fostering 21st century literacies through a collaborative critical reading and learning analytics environment: user-perceived benefits and problematics. In: Proc. of LAK'16. pp. 430–434. ACM (2016)
40. Trilling, B., Fadel, C.: 21st century skills: Learning for life in our times. John Wiley & Sons (2009)
41. Venant, R., Vidal, P., Broisin, J.: Evaluation of learner performance during practical activities: An experimentation in computer education. In: Proc. of ICALT'16. pp. 237–241. IEEE (2016)
42. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
43. Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous Computing 18(6), 1499–1514 (2014)
44. Winne, P.H., Zimmerman, B.J.: Self-regulated learning viewed from models of information processing. Self-regulated learning and academic achievement: Theoretical perspectives 2, 153–189 (2001)
45. Wise, A.F.: Designing pedagogical interventions to support student use of learning analytics. In: Proc. of LAK'14. pp. 203–211. ACM (2014)
46. Wu, W.H., Hsiao, H.C., Wu, P.L., Lin, C.H., Huang, S.H.: Investigating the learning-theory foundations of game-based learning: a meta-analysis. Journal of Computer Assisted Learning 28(3), 265–279 (2012)
47. Yoo, Y., Lee, H., Jo, I.H., Park, Y.: Educational dashboards for smart learning: Review of case studies. In: Emerging Issues in Smart Learning, pp. 145–155. Springer (2015)
48. Zimmerman, B.J.: Self-regulated learning and academic achievement: An overview. Educational Psychologist 25(1), 3–17 (1990)
49. Zimmerman, B.J., Boekaerts, M., Pintrich, P., Zeidner, M.: A social cognitive perspective. Handbook of self-regulation 13(1), 695–716 (2000)