Learning Analytics as a Service for Empowered Learners
From Data Subjects to Controllers
Nicole Gosch
Dept. Law and IT, University of Graz,
Graz, Austria
David Andrews
Dept. Law and IT, University of Graz,
Graz, Austria
Carla Barreiros
Institute of Interactive Systems and
Data Science, Graz University of
Technology, Graz, Austria;
Know-Center GmbH, Graz, Austria
Philipp Leitner
Dept. Educational Technology, Graz
University of Technology, Graz, Austria
Elisabeth Staudegger
Dept. Law and IT, University of Graz,
Graz, Austria
Martin Ebner
Dept. Educational Technology, Graz
University of Technology, Graz, Austria
Stefanie Lindstaedt
Institute of Interactive Systems and
Data Science, Graz University of
Technology, Graz, Austria;
Know-Center GmbH, Graz, Austria
ABSTRACT
As Learning Analytics (LA) in the higher education setting increasingly transitions from a field of research to an implemented matter of fact of the learner's experience, the demand for practical guidelines to support its development is rising. LA policies bring together different perspectives, such as the ethical and legal dimensions, into frameworks to guide the way. Usually, the first time learners get in touch with LA is at the act of consenting to the LA tool. Utilising an ethical (TRUESSEC) and a legal framework (GDPR), we question whether sincere consent is possible in the higher education setting. Drawing upon this premise, we then show how it might be possible to recognise the autonomy of the learner by providing LA as a service rather than an intervention. This could indicate a paradigm shift towards the learner as an empowered demander. Finally, we show how this might be incorporated within the GDPR while also recognising the demand of the higher education institutions to use the learner's data at the same time. These considerations will in the future influence the development of our own LA policy: a LA criteria catalogue.
Authors contributed equally to this research.
This work is licensed under a Creative Commons Attribution International 4.0 License.
LAK21, April 12–16, 2021, Irvine, CA, USA
© 2021 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8935-8/21/04.
CCS CONCEPTS
• Applied computing → Education; E-learning; • Security and privacy → Human and societal aspects of security and privacy; Social aspects of security and privacy.
KEYWORDS
Learning Analytics, Higher Education, Ethics, Trust, GDPR, LA
ACM Reference Format:
Nicole Gosch, David Andrews, Carla Barreiros, Philipp Leitner, Elisabeth
Staudegger, Martin Ebner, and Stefanie Lindstaedt. 2021. Learning Analytics
as a Service for Empowered Learners: From Data Subjects to Controllers.
In LAK21: 11th International Learning Analytics and Knowledge Conference
(LAK21), April 12–16, 2021, Irvine, CA, USA. ACM, New York, NY, USA,
7 pages.
1 INTRODUCTION
Learning Analytics (LA) collects, measures, analyses, and reports data about learners to leverage human decisions to improve learning and environments [ ]. LA research is an emergent interdisciplinary field that has brought together educators, psychologists, and data scientists.
In recent years there has been increasing interest among higher education institutions (HEI) in adopting LA to enhance the quality of teaching and learning [ ]. With the digital transformation, higher education institutions are collecting data as never before, e.g., administrative data and interaction data with learning management systems and other digital learning tools. However, there are often barriers that prevent HEI from using this data effectively [ ]. The challenges associated with LA implementation relate to technical, cultural, and social aspects [ ]. Furthermore, Prinsloo and Slade
[ ] point out that the collection, use, and dissemination of data requires an institutional policy that aligns with national and international legislative frameworks to ensure an enabling environment for LA. It is essential to establish a set of guidelines and principles to guide stakeholders (i.e., institutions, faculty, learners, researchers) and encourage ethical use of data within an educational system where power is unequally distributed among different stakeholders [ ]. LA research is governed by regulations for data protection and research ethics, and the involvement of the stakeholders is based on consent to participate in the research activities. However, moving research outcomes out of the labs to be implemented as tools in institutional LA systems implies dealing with the constraints of the legislative framework. In the European Union (EU), this is predominantly the General Data Protection Regulation (GDPR) [ ]. At its core, the GDPR seeks to empower European citizens regarding their data and aims to create a common legal frame across EU nations. GDPR key requirements regarding privacy and data protection include: freely given and informed consent of data subjects for data processing, data anonymisation to protect privacy, data breach notifications, and provisions ensuring safe international data transfer. Core principles of the GDPR, e.g. autonomy and freely given and informed consent, are also important values in other legal data protection frameworks. Several initiatives and EU projects address ethics and privacy concerning the design and implementation of LA in higher education institutions. An example of such a project is SHEILA – Supporting Higher Education to Integrate Learning Analytics [ ], which proposes a policy development framework intended to assist European universities in using and safeguarding the digital data about their students. We pursue a slightly different approach: building on the rule of law and a strong European law enforcement system, we go a step further and address the concept of trust in LA by fostering just that on a multitude of ethical dimensions.
In this paper, we will first have a look at existing LA policies. These try to incorporate multi-perspective reflections into practical guidelines on how to deploy and maintain LA systems. At the centre of any LA policy are the ethical and legal frameworks. Based on ethical and legal aspects, our research aims to build a new LA policy by developing a criteria and indicators catalogue dedicated to trust in LA. The criteria and indicators should operationalise the ethical and legal dimensions by making it assessable whether features of the LA technology enhance pre-defined trust-relevant core areas (e.g. autonomy or privacy). The first contact point of a learner with the LA system is typically the act of consent. In the second part of this paper, we explore how trustworthy LA systems can be promoted by ethical frameworks, question whether sincere consent is possible in the higher education setting, and ask which implications this consent could have. The third and last part of this paper deals with the most relevant legal framework in the EU, the GDPR. It outlines how a conceptual change, in the form of a paradigm shift, could create a system that allows LA processing for the benefit of learners without their consent. Furthermore, it explores how the HEI could still be able to use the accruing data for purposes other than for the concrete individual learner.
Following existing approaches, learning analytics depends on the constant collection of data from learners, which can create a feeling of discomfort and lead to resistance in the learners' adoption of the LA systems provided by the HEI [ ]. Therefore, it is unsurprising that a critical issue for the success of LA in HEI relates directly to data privacy and ethical issues. LA policies play an essential role, as they reflect the ethical core values and the legal frame, and clarify the institution's reasons to engage with LA. A LA policy should contain a set of principles that defines a course of action for LA in an institution, promoting in this way transparency and trust. Next, we present examples of projects whose outcomes brought insights and guidelines on how institutions could develop their own LA policies.
The SHEILA framework was created within the context of the EU SHEILA project [ ]. The framework aims to support HEI in the LA policy development process. The SHEILA framework consists of six dimensions: political context; key stakeholders; desired behaviour changes; engagement strategy; internal capacity to effect change; and monitoring and learning frameworks. These six dimensions are then decomposed into three key elements: the strategic action points, the potential challenges, and a set of guiding questions to be addressed in a LA policy [ ]. The LEA's Box was an EU project [ ] focused on research and development of a LA toolbox for teachers and learners. The LEA's Box privacy and data protection framework was created to integrate ethical considerations and data privacy and protection, as well as to deal with these issues in the context of scientific research. Even though the framework is tailored to meet the needs of the project, it provides useful insights. The framework refers to a set of eight principles, i.e., data privacy, purpose and data ownership, consent, transparency and trust, access and control, accountability and assessment, data quality, and data management [ ]. The EU project LACE – Learning Analytics Community Exchange [ ] aimed to bring together research, policy and practice on the topic of LA, and covered topics such as data sharing, open LA, and implications for issues such as ethics and data privacy. The DELICATE checklist was proposed to guide the use of educational data. DELICATE stands for Determination (why LA?), Explain (be open about the goals), Legitimate (why is one allowed to have students' data?), Involve (all stakeholders), Consent (make a contract with the data subject), Anonymise (individual not retrievable), Technical (procedure to ensure privacy), External (working with external providers) [8].
Moreover, practitioners offering education-related digital services to educational institutions are or should be involved [ ]. JISC is a not-for-profit organisation for digital educational services and solutions, which deals with vendors and publishers on behalf of UK educational institutions. The JISC code of practice for learning analytics refers to a set of crucial areas for a LA policy, such as responsibility, privacy, transparency and consent. They also proposed a data protection framework, which considers specific issues related to historical data and the consent validity for alumni [ ].
LA policies bring together different perspectives to guide the design and implementation of LA systems. Two core parts of LA policies are the ethical and legal frameworks. Where ethical frameworks operate on a meta-level, in the spheres of values and principles, legal frameworks try to incorporate those considerations on concrete legal bases (in the EU, mainly the GDPR [ ]). The following section outlines how an ethical framework can be used to foster trust in LA and thereby questions the practice of consent regarding the value of autonomy.
Learning Analytics aims to understand and optimise the learning process of learners. The learner should therefore be the focus of any intervention. However, plenty of ethical issues and pitfalls line the path. Problems arise in how to justify the collection of the data, how to store and protect the data, and what conclusions to draw from them [ ]. As analytic tools become more powerful and re-identification of the individual more probable, it is necessary to focus on privacy issues [ ]. Slade and Tait [ ] identify ten ethical core issues of LA, including transparency, data ownership and control, accessibility of data, validity and reliability of data, institutional responsibility and obligation to act, communication, cultural values, inclusion, consent, and student agency and responsibility. Guidance for the proper usage of LA is needed to prevent LA from failing upon these obstacles. Ethical frameworks provide values and principles that one can use to build and maintain LA measures that are both useful to the learner and ethically legitimate. Frameworks for information and communications technology (ICT) products in general [ ] and for learners specifically [ ] are available in the literature.
3.1 Building Trustworthy Interventions
Trust and trustworthiness have been identified by several researchers as fundamental issues for successful ICT [ ]. Trustworthiness is an attribute of something that can be increased by performing adequate actions. High trustworthiness leads moral subjects to trust certain things or persons. Trustworthiness can be objectively determined, and trust subjectively obtained. Regarding LA, high trustworthiness leads to more trust in LA tools. When trusting the system, learners will more likely actively participate in LA and take interventions seriously. This leads to better data and, therefore, to better analysis, which, together with trusting the process, increases the learners' gain. Hence, trust is essential to ensure an accepted and successful tool.
One framework focussing on trustworthiness as the overarching principle is the Horizon 2020 project "TRUst-Enhancing certified Solutions for Security and protection of Citizens' rights in digital Europe" (TRUESSEC) [37]. Even though this framework was developed for ICT products in general, it can be applied to the learners' field. Building upon European values and fundamental rights, six core areas of trustworthiness were identified in an interdisciplinary approach. The core areas are transparency, privacy, anti-discrimination, autonomy, respect, and protection. A criteria catalogue with corresponding manifold true-or-false indicators was developed to operationalise the core areas. To build trustworthy LA systems, one must address the core areas through meaningful actions, e.g. explaining how the LA tools function to address transparency, or implementing state-of-the-art security standards to enhance protection. When it comes to learners' consent, one specific core area is affected in particular: autonomy.
3.2 Can Genuine Consent Be Given Within the
Higher Education Setting?
Autonomy expresses the ability of an individual to make their own uncoerced decisions by freedom of choice. When striving to build trustworthy LA technology, autonomy as a principle acts as an enabler for the learner to be recognised as a moral agent [ ] and thus as somebody whose decisions should be taken seriously. Promoting autonomy on all levels results in a growth in trustworthiness. Regarding the LA process, the first choice a learner could be faced with is to give consent to participate in the LA system. Past work outlined the importance of seeking informed consent from the learner to justify the use of LA [ ]. Informed consent means that the learner is informed about all the implications and rights that follow from the consent. Only then can learners decide freely. Thus, informed consent should not be just ticking another box. However, not only has there been criticism of how informed one can actually be in an emerging complex world [ ], there is also the question of whether the learner is able to make an uncoerced decision in the higher education setting.
People enrol at HEI to receive higher education, resulting in wider knowledge and possibly better job prospects. These institutions also act as gatekeepers to higher degrees. Although a large number of HEI exist, from our perspective, the actual choice is often limited due to personal living circumstances. So, the learner has to follow the rules of the institution and progress through the curriculum of his or her study; otherwise, he or she will not achieve a degree. This highlights an inequality of power [ ], whereby the HEI has the upper hand. Consent from an individual is needed when the impulse to use LA comes from outside the individual, here the HEI. So, whenever an HEI seeks to gain consent from a learner, it actually nudges him or her to consent, just by the existing power imbalance. This asymmetric power balance undermines the autonomy of the learner and does not leave him or her a real free choice about using the LA technology. Thus, it is at least questionable whether sincere consent can be given within the higher education setting. This makes it worth considering whether there could be another way of implementing LA in the HEI setting that primarily serves the interests of the learners.
In recent times some researchers have seen learners as mere consumers, rather than co-producers, of their learning environment [ ]. The HEI seeks consent to justify the use of LA technology, and the learner agrees, upon a questionable free choice. If the learner is the focus of the LA system, another approach should be considered. To fully commit to the autonomy and free will of the learner, a real free choice must be offered. To achieve this, we suggest a paradigm shift where the learner is seen as an empowered demander. This would result in an increased amount of autonomy and therefore trust in the intervention, and would most likely benefit the cause of LA systems. One way to apply this approach is to offer LA as a service for empowered learners. LA as a service can be described as LA software, in whatever form the HEI offers it (e.g., a dedicated LA system, or LA features incorporated in learning management systems), to which any learner can freely subscribe as a means to improve their learning process. The HEI would then act as a service provider that focuses solely on the learner's needs. The adoption of a LA service, thereby, must be optional and without nudging for compliance. Even though this approach safeguards the learners, it also limits the use of the collected data for other conceivable purposes by the HEI (e.g., adjusting curricula, reformulating study programmes, implementing measures to decrease drop-outs).
The next section looks at how the GDPR could enable this approach by shifting the learner from the data subject to the data controller. Furthermore, we outline a lawful approach for HEI on how to use LA data for legitimate purposes other than the ones focussed exclusively on the learner.
Data protection law plays a vital role in the design of LA services. Hoel and Chen [ ] point out that data protection must be considered throughout the entire LA process, and that the principles of "data protection by design" and "data protection by default" should be followed. The authors state in the "Requirements for LA Systems" that the gathering and measurement of data depend on the consent of the learner. However, the voluntary nature of consent is a difficult issue, and not only from an ethical point of view. Article 4 (11) of the GDPR [ ] defines the "consent" of the data subject as "any freely given specific, informed and unequivocal expression of will in the form of a declaration or other unequivocal affirmative act by which the data subject signifies his or her consent to the processing of personal data relating to him or her". In particular, effective consent requires that the individual is free to decide and has control over the data [ ]. If not only the learners but also the HEI or faculty benefit from the use of an LA service, an indirect compulsion for learners to participate is likely to exist. In connection with the voluntary nature of consent, problems may also arise if there is a power imbalance between the controller and the data subject [ ]. This problem is relevant when using LA in HEI settings. Whether the voluntary nature of consent in connection with LA at HEI can be guaranteed is therefore questionable. Only the European Court of Justice can finally clarify this question.
4.1 From Data Subjects to Controllers
In order to create a LA system dedicated to learners, we consider a paradigm shift in our research, which is depicted in Figure 1. Instead of seeking uncertain and potentially invalid consent, we empower learners to remain the "masters" of their data. Learners assume the role of the controller as defined by Article 4 (7) GDPR [ ], where each learner can decide on the purposes and means of processing [ ]. We envision LA as a service for learners only, where learners are free to decide if they want to use the LA service. In consequence, the HEI may only use this data as a pure processor, as defined by Article 4 (8) GDPR [ ], based on documented instructions from the controller (the learner). The foundation of this relationship between the HEI and the learners would be a processing agreement [ ]. From a data protection perspective, learners would no longer be in the role of the data subject, which means that the rights guaranteed under data protection law to the data subject no longer apply. However, data protection law is still applicable in this context, only in a different way. The HEI as a processor would have to observe several data protection requirements [ ]. In addition to general obligations, such as keeping records of processing activities, designating a data protection officer, providing technical and organisational measures and the obligation to report data protection violations, the processor is directly responsible to the controller. For example, if the HEI would like to use a sub-processor (i.e. an external person or body who processes data on behalf of the HEI), this must be approved by the learners. The most important aspect of this paradigm shift would be that the data provided by the learners may not be used for purposes other than those intended by the learners; otherwise, the HEI itself becomes the controller. A system that is detached from any other purpose that the HEI may wish to pursue and is used exclusively for the benefit of the learners, a system in which the learners themselves have full control over what happens to their data, would be a desirable alternative to obtaining consent.
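The controller–processor relationship described above can be sketched in code. The following is a conceptual illustration only, under our own assumptions: the names `LearnerController`, `HEIProcessor` and `ProcessingInstruction` are hypothetical and do not come from the paper or any real LA system. The sketch merely encodes the rule that a processor may act only on documented instructions from the controller, and only for purposes the learner has approved.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingInstruction:
    """A documented instruction issued by the learner (the controller)."""
    purpose: str    # e.g. "dashboard", "self-assessment"
    operation: str  # e.g. "aggregate_weekly_activity"

@dataclass
class LearnerController:
    """The learner in the controller role (cf. Article 4 (7) GDPR)."""
    learner_id: str
    approved_purposes: set = field(default_factory=set)
    instructions: list = field(default_factory=list)

class HEIProcessor:
    """The HEI as a pure processor (cf. Article 4 (8) GDPR): it may only
    process on documented instructions from the controller."""

    def process(self, controller: LearnerController,
                instruction: ProcessingInstruction) -> str:
        if instruction not in controller.instructions:
            raise PermissionError("No documented instruction from the controller")
        if instruction.purpose not in controller.approved_purposes:
            # Processing for its own purposes would make the HEI the controller.
            raise PermissionError("Purpose not approved by the learner")
        return f"executed {instruction.operation} for {controller.learner_id}"
```

In this toy model, an HEI-initiated request for a purpose the learner never approved raises an error rather than silently repurposing the data, mirroring the role change the paragraph above warns about.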
4.2 The Free Flow of Data: Using Personal Data
Without Harming the Individual
Would this paradigm shift mean that the economic and social value of the data and the benefit to the general public is lost? Or are there ways within this approach in which this data can be used for other purposes? After all, there are legitimate concerns of HEI to use this data, for example, to improve curricula or to allow lecturers to improve their teaching. When considering this question, the second primary objective of the GDPR must not be forgotten. In addition to the protection of individuals concerning the processing of personal data, Article 1 (3) GDPR [ ] provides for the free movement of personal data within the Union. This second goal was included in the GDPR because the European Union has recognised the value of personal data and explicitly wants to allow its broad use without harming the individuals involved. Therefore, the GDPR should be interpreted not only as a protection law but also as a sort of economic law and as a tool to make personal data economically usable while respecting the protection of the data subjects.
The critical element that combines these two interests and objectives of the GDPR is, in our view, anonymisation. Anonymisation is the process by which personal data are rendered in such a manner that the data subject is not or no longer identifiable. Anonymisation enables data processing outside the provisions of data protection law. Only after the personal data from the LA system has been anonymised may it be used for any possible purpose, as stated in Recital 26 [ ]. There are significant demands on anonymisation: personal data shall only be considered anonymous or anonymised if, taking all means into account, it is reasonably unlikely that a person will be identified by the controller or by anyone else. As long as – and as soon as – the data exists in personal form, it is subject to the full protection of the GDPR [ ] and may not be shared without a specific lawful basis. In addition to anonymisation itself, an aggregation of the data, which can be a method of anonymisation, may also be considered an option to use data. Inevitably, not every aggregation leads to anonymisation, so it remains a question of the individual case whether data is still considered personal data [ ].
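The point that not every aggregation anonymises can be made concrete with a small sketch. This is illustrative only: the function name and record format are our own assumptions, and a minimum group size is a common heuristic for suppressing small, potentially re-identifying groups, not by itself a guarantee of anonymisation in the sense of the GDPR (that assessment follows the criteria in the Article 29 Working Party's Opinion 05/2014 [2]).

```python
from collections import Counter

def aggregate_with_threshold(records, key, min_group_size=5):
    """Aggregate per-learner records into group counts, suppressing any
    group smaller than min_group_size. Small groups (e.g. one learner in
    a course) may still allow re-identification, which is why aggregation
    alone does not automatically yield anonymous data."""
    counts = Counter(r[key] for r in records)
    return {k: v for k, v in counts.items() if v >= min_group_size}
```

For example, aggregating activity counts per course would report a course with six active learners but suppress a course with only two, since "two learners in course B" may be trivially linkable to known individuals.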
Figure 1: Shift of the learner from the data subject to the controller.
4.3 The Process of Anonymisation as a Processing Activity That Requires a Lawful Basis
Precisely at this point in the process, where the personal data from the LA service should be used for other purposes, it is necessary to look at this transformation: how to get from personal data to anonymous or aggregated data. Figure 2 depicts this process. The HEI, as a processor, would already have the stored data at its physical disposal. However, just because it has access to the data on the basis of the contractual relationship with the learners within the framework of the LA service does not mean that it can simply process the data for its own purposes without legal consequences. If the HEI intends to process the data for its own purposes, this would again lead to a change of role under data protection law. The HEI becomes the controller and the learners are now the data subjects, with all the implications of data protection law. Anonymised data falls outside the scope of the GDPR, but the anonymisation process itself is relevant to data protection law [2].
Anonymisation and de-identification can be technical and organisational measures to protect personal data, especially in LA [ ]. A processor may, for example, anonymise data within the processing activities carried out on behalf of the controller in order to ensure a higher level of protection when processing the data. In this case, anonymisation would be covered by the data processing agreement and does not require a separate legal basis. However, if a processor anonymises personal data in order to process them for its own purposes, anonymisation must be regarded as a processing activity that requires a different lawful basis under Art 6 GDPR [ ]. The purpose of anonymisation that is relevant under data protection law is not the removal of the personal reference itself, but the actual interest of the controller that lies behind it. In order to be able to decide whether anonymisation is legally admissible, it is necessary to know the purpose of the processing. Furthermore, the technical process behind anonymisation must be disclosed. Only then is it possible to legally and ethically evaluate not only the processing but also whether anonymisation in the sense of data protection law has been achieved.
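The gating logic of this section can be summarised in a minimal sketch: before anonymisation runs for the processor's own purposes, its actual purpose, a named Art 6 lawful basis, and a disclosed technique must all be on record. The names below (`AnonymisationRequest`, `authorise_anonymisation`) are hypothetical, and the check is of course no substitute for the legal assessment itself.

```python
from dataclasses import dataclass

@dataclass
class AnonymisationRequest:
    """What must be documented before anonymising for one's own purposes."""
    purpose: str       # the actual interest behind the anonymisation
    lawful_basis: str  # the claimed basis under Art 6 GDPR (to be assessed)
    technique: str     # the disclosed technical process, e.g. aggregation

def authorise_anonymisation(req: AnonymisationRequest) -> bool:
    """Allow the anonymisation step only if purpose, lawful basis, and
    technique are all documented; an empty field blocks the processing."""
    return all([req.purpose, req.lawful_basis, req.technique])
```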
Drawing upon the premise that genuine consent is questionable, we develop our approach to shift the role of the learner to that of the controller of her data. Thus, LA could be provided by the HEI as a service that serves learners in the first place. The process of anonymisation is required to address the desire of the HEI to be able to use the collected data for extended purposes. In our research, we want to explore whether and how this approach of empowering learners by giving them the opportunity to choose LA as a service can prove itself in practice. Problems could arise in this respect, especially concerning the broad interpretation of the joint controllership of the GDPR by the European Court of Justice, which could lead to learners being joint controllers together with the HEI [ ]. We will therefore investigate how privacy by design and default can be used to create a system that gives learners the decision power about the purposes and means of processing, and inspire others in the LA community to join us in thinking about this approach.

Figure 2: Process of using data from Learning Analytics as a Service.
Another main focus of our research will be to find out how the data from the LA service can be used for other legitimate purposes of the HEI in accordance with data protection law. In particular, we will address the question of how anonymisation and aggregation of personal data can be achieved. Furthermore, the proper lawful basis for the process of anonymisation, i.e. the transition from personal data to anonymised data, must be identified. This conceptual approach is intended to enable a product that, due to its privacy friendliness and trustworthiness, can be used by learners without any concern. However, LA as a service can only be successful if there is a strong emphasis on creating a LA culture in the institution. We intend to promote LA by, e.g., actively involving learners in the system design process and organising workshops and information events.
We use the European GDPR to exemplify our approach; however, we are convinced that the basic concept – building on genuine voluntariness – is transferable to other legal frameworks.
ACKNOWLEDGMENTS
The work presented here was co-funded by the Federal Ministry of Education, Science and Research, Austria, as part of the 2019 call for proposals for digital and social transformation in higher education, for the project "Learning Analytics" (2021–2024, partner organisations: University of Vienna, Graz University of Technology, University of Graz).
AI HLEG. 2019. Ethics Guidelines for Trustworthy AI. Retrieved October 25,
2020 from market/en/news/ethics-guidelines-
Article 29 Data Protection Working Party. 2014. Opinion 05/2014 on Anonymisa-
tion Techniques. WP 216, Adopted on 10 April 2014.
Jacqueline Bichsel. 2012. Analytics in Higher Education: Benets, Barriers,
Progress, and Recommendations.
Louise Bunce, Amy Baird, and Siân E. Jones. 2017. The student-as-consumer
approach in higher education and its eects on academic performance. Studies
in Higher Education 42, 11, 1958–1978.
Louise Bunce, Amy Baird, and Siân E. Jones. 2017. The student-as-consumer
approach in higher education and its eects on academic performance. Studies
in Higher Education 42, 11, 1958–1978.
Fred H. Cate and Viktor Mayer-Schönberger. 2013. Notice and consent in a world
of Big Data. International Data Privacy Law 3, 2, 67–73.
Andrew N. Cormack. 2016. A Data Protection Framework for Learning Analytics.
Journal of Learning Analytics 3, 1.
Hendrik Drachsler and Wolfgang Greller. 2016. Privacy and analytics. In Proceed-
ings of the Sixth International Conference on Learning Analytics & Knowledge -
LAK ’16. ACM Press, New York, New York, USA, 89–98.
[9] European Court of Justice. 2018. Facebook Fanpage. C-210/16.
[10] European Court of Justice. 2019. Fashion ID. C-40/17.
European Data Protection Board. 2020. Guidelines 05/2020 on consent under
Regulation 2016/679. Version 1.1 - Adopted on 4 May 2020.
European Data Protection Board. 2020. Guidelines 07/2020 on the concepts of
controller and processor in the GDPR. Version 1.0, Adopted on 02 September 2020.
Learning Analytics as a Service for Empowered Learners. LAK21, April 12–16, 2021, Irvine, CA, USA.
European Union. 2016. Consolidated text: Regulation (EU) 2016/679 of the European
Parliament and of the Council of 27 April 2016 on the protection of natural
persons with regard to the processing of personal data and on the free movement
of such data, and repealing Directive 95/46/EC (General Data Protection
Regulation) (Text with EEA relevance). RL 2016/679/EU.
Tore Hoel, Dai Griffiths, and Weiqin Chen. 2017. The influence of data protection and
privacy frameworks on the design of learning analytics systems. In Proceedings
of the Seventh International Learning Analytics & Knowledge Conference, 243–252.
Kyle M. L. Jones. 2019. Learning analytics and higher education: a proposed model
for establishing informed consent mechanisms to promote student privacy and
autonomy. Int J Educ Technol High Educ 16, 1.
Mohammad Khalil and Martin Ebner. 2016. De-Identification in Learning Analytics.
Journal of Learning Analytics 3, 1.
LACE. Learning Analytics Community Exchange. 2020. Retrieved October 20, 2020.
LEA’s Box. LEarning Analytics Toolbox. 2020. Retrieved October 20, 2020.
Philipp Leitner, Mohammad Khalil, and Martin Ebner. 2017. Learning Analytics
in Higher Education—A Literature Review. In Learning Analytics: Fundaments,
Applications, and Trends, A. Peña-Ayala, Ed. Studies in Systems, Decision and
Control. Springer International Publishing, Cham, 1–23.
Phillip D. Long, George Siemens, Grainne Conole, and Dragan Gašević. 2011.
Proceedings of the 1st International Conference on Learning Analytics and Knowledge.
Alistair McCulloch. 2009. The student as co-producer: learning from public
administration about the student–university relationship. Studies in Higher
Education 34, 2, 171–183.
Elizabeth Nixon, Mike Molesworth, and Richard Scullion. 2010. The marketisation
of higher education. The student as consumer. Routledge, Abingdon, Oxon, New
York, NY.
Abelardo Pardo and George Siemens. 2014. Ethical and privacy principles for
learning analytics. Br J Educ Technol 45, 3, 438–450.
Paul Prinsloo. 2018. Student Consent in Learning Analytics. In Learning Analytics
in Higher Education, J. Lester, Ed. Routledge, 118–139.
Paul Prinsloo and Sharon Slade. 2013. An evaluation of policy frameworks for
addressing ethical considerations in learning analytics. In Proceedings of the
Third International Conference on Learning Analytics and Knowledge - LAK
’13. ACM Press, New York, New York, USA, 240.
Lynne D. Roberts, Joel A. Howell, Kristen Seaman, and David C. Gibson. 2016.
Student Attitudes toward Learning Analytics in Higher Education: "The Fitbit
Version of the Learning World". Frontiers in psychology 7, 1959.
Carol Robinson. 2012. Student engagement. Jnl of Applied Research in HE 4, 2.
Niall Sclater. 2016. Developing a Code of Practice for Learning Analytics. Journal of
Learning Analytics 3, 1.
SHEILA Project. 2020. Retrieved October 20, 2020.
George Siemens. 2013. Learning Analytics. American Behavioral Scientist 57, 10.
George Siemens, Shane Dawson, and Grace Lynch. 2013. Improving the quality
and productivity of the higher education sector. Policy and Strategy for Systems-
Level Deployment of Learning Analytics. Society for Learning Analytics Research.
Sharon Slade and Paul Prinsloo. 2013. Learning Analytics. Ethical Issues and
Dilemmas. American Behavioral Scientist 57, 10, 1510–1529.
Sharon Slade and Paul Prinsloo. 2014. Student perspectives on the use of their
data: between intrusion, surveillance and care. Challenges for Research into Open
& Distance Learning: Doing Things Better – Doing Better Things, 291–300.
Sharon Slade, Paul Prinsloo, and Mohammad Khalil. 2019. Learning analytics at
the intersections of student trust, disclosure and benet. In Proceedings of the
9th International Conference on Learning Analytics & Knowledge. ACM, New
York, NY, USA, 235–244.
Sharon Slade and Alan Tait. 2019. Global guidelines: Ethics in learning analytics.
Retrieved October 25, 2020.
Christina M. Steiner, Michael D. Kickmeier-Rust, and Dietrich Albert. 2016. LEA
in Private: A Privacy and Data Protection Framework for a Learning Analytics
Toolbox. Journal of Learning Analytics 3, 1.
TRUESSEC.eu. 2018. Cybersecurity and privacy Criteria Catalogue for assurance and
certification. Retrieved October 25, 2020.
Yi-Shan Tsai, Pedro M. Moreno-Marcos, Ioana Jivet, Maren Scheffel, Kairit Tam-
mets, Kaire Kollom, and Dragan Gašević. 2018. The SHEILA Framework: Inform-
ing Institutional Strategies and Policy Processes of Learning Analytics. Journal of
Learning Analytics 5, 3.
George Siemens. 2012. Learning analytics: envisioning a research discipline
and a domain of practice. In Proceedings of the 2nd International Conference
on Learning Analytics and Knowledge (LAK ’12). Association for Computing
Machinery, New York, NY, USA, 4–8.
Charles Lang, Leah P. Macfadyen, Sharon Slade, Paul Prinsloo, and Niall Sclater.
2018. The complexities of developing a personal code of ethics for learning
analytics practitioners: implications for institutions and the eld. In Proceedings
of the 8th International Conference on Learning Analytics and Knowledge (LAK
’18). Association for Computing Machinery, New York, NY, USA, 436–440.
Leah Macfadyen. 2017. What Does a Learning Analytics Practitioner Need to Know?