
Learning Analytics Challenges to Overcome in Higher Education Institutions



Draft, originally published in: Leitner P., Ebner M., Ebner M. (2019) Learning Analytics Challenges to Overcome
in Higher Education Institutions. In: Ifenthaler D., Mah DK., Yau JK. (eds) Utilizing Learning Analytics to Support
Study Success. Springer, Cham
Philipp Leitner, Graz University of Technology, Graz, Austria
Markus Ebner, Graz University of Technology, Graz, Austria
Martin Ebner, Graz University of Technology, Graz, Austria
Abstract: While a large number of scientific publications explain the development of prototypes or the implementation of
case studies in detail, descriptions of the challenges and proper solutions when implementing learning analytics
initiatives are rare. In this chapter, we provide a practical tool that can be used to identify risks and challenges that
arise when implementing LA initiatives and discuss how to approach these to find acceptable solutions. In this
way, implementers are given the opportunity to handle challenges early on and avoid being surprised at a critical
moment in the project, which will save time, resources and effort. We are aware that all aspects needed to
successfully carry out learning analytics initiatives are co-dependent. Nonetheless, we identified and categorized
the criteria necessary for implementing successful LA initiatives. We conclude this chapter with an overview of
the challenges faced and possible approaches that can be taken to facilitate the successful implementation of
learning analytics.
Key words: learning analytics, implementation, challenge, higher education
1 Introduction

Over the past decade, Learning Analytics (LA) has received more and more attention as a rapidly growing and promising research field in the area of Technology-Enhanced Learning (TEL) (Ferguson, 2012; Khalil & Ebner, 2015). Since it was first mentioned in the Horizon Report of 2012 (Johnson et al., 2012), different tools have been used and initiatives carried out concerning different aspects of LA. As a result, LA is now reaching the point at which it will affect research and practice, as well as policy- and decision-making (Gašević, 2015).
Currently, many different definitions for the term Learning Analytics are accepted. Long & Siemens
(2011) defined it as “the measurement, collection, analysis, and reporting of data about learners and their
contexts, for purposes of understanding and optimizing learning and the environment in which it occurs.”
Duval (2012) summarized LA by saying “learning analytics is about collecting traces that learners leave
behind and using those traces to improve learning.” Despite the different approaches, all definitions of LA
indicate that it should provide actionable insights (Siemens et al., 2011).
Therefore, the purpose should remain in focus when implementing LA initiatives. Obviously, the
potential actions strongly depend on the utilization of data and the information contained. However, what
kind of data representation is necessary to implement LA in an institution, and what ethical and moral
aspects need to be considered? Currently, the members of the European Union are particularly strongly
affected by the enforcement of the EU General Data Protection Regulation (GDPR) (Leitner et al., 2018).
The issues of data ownership and privacy are becoming increasingly significant (Drachsler & Greller,
2016). Therefore, the location and accessibility of the data need to be kept in mind (Leitner et al., 2018).
For example, where is the data stored? On an internal or external server hosted by a service provider?
Additionally, many LA projects do not move past the prototype phase because of issues related to
transferability and scalability (Leitner et al., 2017). These aspects should already be considered at the
beginning of the development.
The goal of this study was to provide a practical tool that can be used to identify risks and challenges
that arise when implementing LA initiatives and how to approach these. This gives implementers the opportunity to deal with these problems at an early stage and thereby avoid losing time or investing effort needlessly later on, when the realization of the initiative becomes critical. In this study, we identified and
categorized seven criteria for implementing successful LA initiatives. Although we are aware that these
areas are co-dependent, we addressed them individually throughout this study.
In the remainder of this chapter, we review relevant related work, placing an emphasis on similar research questions, extract problems that typically emerge during the implementation of LA, and identify possible solutions. In Section 3, an overview is provided of the seven areas that are most significant when implementing LA projects, and the reasons behind choosing these areas are described in greater detail. In subsections 3.1 to 3.7, we describe what we consider to be part of these areas and what we explicitly exclude, which challenges exist, and which solution approaches appear promising. Finally, we conclude with a discussion and remarks about future work.
2 Related Work

In this chapter, we present the results of a survey of previous work on possible challenges and solutions when implementing LA initiatives. We reviewed the literature to find work on similar topics and to determine how the authors met these challenges and what kinds of solutions and/or frameworks they used or proposed. The literature review of Leitner et al. (2017) showed that, in recent years, various publications have appeared that describe parts of the challenges summarized in our seven main categories. In her paper, Ferguson (2012) documented the concerns about ethics and privacy
which began to surface once tools used to analyze student data became more powerful and readily
available. She additionally addressed four challenges, one of which was the development of a set of
ethical guidelines. Prior to this, Campbell (2007) had already defined a framework for locating potential
areas of misunderstanding in LA, which he based on definitions, values, principles and loyalties. Later, to
clearly differentiate between ethics and privacy, Drachsler & Greller (2016) defined ethics as “the
philosophy of moral that involves systematizing, defending, and recommending concepts of right and
wrong conduct. In that sense, ethics is rather different to privacy. In fact, privacy is a living concept made
out of personal boundary negotiations with the surrounding ethical environment.” Ferguson et al. (2016)
summarized the challenges presented by the special issue of ethics and privacy in LA in 21 points, as
shown in Table 1.
Table 1. Learning Analytics Challenges and Dimensions (Ferguson et al., 2016)
[Add table 1 here]
The first six challenges are related to helping learners achieve success during their studies. Therefore, the data should, or rather must, be complete, accurate, and up-to-date. It is the learner's responsibility to ensure this. On the other hand, the institutions also have a responsibility to ensure a state-of-the-art, valid, and reliable evaluation process, which is carried out in an understandable way. Challenge seven,
originally derived from the field of the medical sciences (Murray, 1990), however, relates to the issue that
informed consent is also needed today with regard to LA. Students should be involved as collaborators
and, therefore, give their informed consent to data access. The obtained analysis from the data is then used
to support learning and improve the learner’s chances of success. Challenges eight to ten are concerned
with the rights and interests of students and teachers, as well as the responsibility held by educational
institutions to safeguard and protect these. Providing access to data, as well as allowing the possibility
to make corrections and/or file a complaint, also play important roles (Rodríguez-Triana, et al., 2016). The
next two challenges are concerned with providing equal access to education for everyone (Challenge
eleven) and a fair and equally-applied legal system for all citizens (Challenge twelve). Challenges thirteen
to nineteen are related to data protection and place a focus on the legal responsibility for data security. The
harvested data are the property of another person, and the institution must assure data protection and
security. The last two challenges are concerned with the privacy of data and how data should be used and
treated (cf. Ferguson et al., 2016).
To meet these challenges, the scientific community has already taken a variety of approaches with regard to data protection and ethics in connection with LA (Ferguson et al., 2016): For example, a code of conduct
was developed that can be used as a taxonomy of ethical, legal and logistical issues for LA (Sclater, 2016).
Rodríguez-Triana et al. (2016) expanded the recommendations of Sclater's (2016) code and added consent,
transparency, access, accountability, data protection, validity and avoidance of adverse effects. A
framework for privacy and data protection has been proposed by Steiner et al. (2016). Cormack (2016) has
published a paper which deals with European data protection practices, and in particular with the
transparent communication of data usage. The codes of conduct and frameworks developed so far have
been supplemented by Berg et al. (2016) with tools and approaches that enable us to put them into
practice. Khalil & Ebner (2016) focused on the de-identification and anonymization of data for analysis
within LA. An examination of the study conducted by Hoel & Chen (2016) shows that the discussion on
data exchange and Big Data in education is still at an early stage. Prinsloo & Slade (2016) addressed the
rights and problems of students as well as the supervising institutions, arguing that the primary
responsibility for LA system providers is to promote individual autonomy and provide each individual
learner with enough information to make informed decisions (cf. Ferguson et al. 2016). To help
institutions enter the area of LA, Drachsler & Greller (2016) developed a checklist (the DELICATE
checklist), which helps users identify and examine possible problems and obstacles that could hinder the
introduction of LA in the education sector in advance. The term DELICATE stands for the eight points
that need to be considered if one wants to use LA (see Drachsler & Greller, 2016).
In the context of the SHEILA (Supporting Higher Education to Integrate Learning Analytics) project, a
team of research and institutional leaders in LA is currently developing a policy framework for formative
assessment and personalized learning. They have used the Rapid Outcome Mapping Approach (ROMA)
and validated their outputs through case studies. Their focus has been placed on the development of a
policy agenda for higher educational institutions by taking advantage of direct engagement with the
different stakeholders (Macfadyen, 2014). Tsai & Gašević (2017) identified several challenges related to
strategic planning and policy:
Challenge 1 - Shortage of Leadership: The leadership lacks the capabilities to guarantee the
implementation of LA in the environment of the institution. Therefore, different stakeholders and their interests must be taken into account to ensure their commitment to the topic. Otherwise, these stakeholders may block the initiative.
Challenge 2 - Shortage of Equal Engagement: There are gaps between the various stakeholders within institutions with regard to understanding LA. Teams that work in technical areas showed the highest
level of understanding, while other teams did not know much about LA. This can be seen as a barrier
for the institutional acceptance of LA.
Challenge 3 - Shortage of Pedagogy-Based Approaches: When designing LA tools, it is also important
to include pedagogical approaches in the LA process. Institutions tend to focus more on technical
aspects rather than pedagogical aspects.
Challenge 4 - Shortage of Sufficient Training: As highlighted in Challenge 2, there is a lack of
understanding of how LA can be beneficial to all stakeholders. A good staff training program, which
helps them improve their skill sets on this topic, is key to success.
Challenge 5 - Shortage of Studies Empirically Validating the Impact: A budget must be allocated to support LA; therefore, senior staff members need a sound basis for this decision. However, evaluating the success of LA remains a challenging task.
Challenge 6 - Shortage of Learning Analytics Specific Policies: Institutions have regulations regarding
data and ethics. However, few institutions have codes of practice for LA. This lack of clear guidance
regarding LA practice needs to be addressed.
Furthermore, Tsai & Gašević (2017) reviewed eight policies (Jisc, LACE, LEA's Box, NUS, NTU, OU, CSU, USyd) concerning their suitability based on the six abovementioned challenges. Although the policies partially lack pedagogical approaches, guidance for the development of data literacy, and evaluations of their effectiveness, they serve as valuable references for institutions interested in establishing LA in their field of work. In particular, institutions that wish to develop their own practice guidelines for LA can benefit from these findings (Tsai & Gašević, 2017).
In our research, we found that several publications have focused on different aspects of this topic.
Overall, it can be said that the creation of clear guidelines based on a code of practice is needed when
planning to introduce LA in an institution. Our knowledge and thoughts are summarized in seven main
categories and presented in the next section.
3 A Framework for LA Initiatives

Bearing in mind the related work, the issues identified during previous research, as well as our own experiences with implementing LA projects and initiatives in higher education (De Laet et al., 2018; Leitner & Ebner, 2017; Leitner et al., 2018), we developed a framework for LA implementations. Based on the results of the literature review and a workshop with LA specialists, stakeholders, and researchers, different issues were identified. These core issues were discussed, verified, and categorized into seven main areas (Figure 1).
[Add Fig 1. here]
Figure 1: Framework with seven main categories for LA initiatives
In the following subsections, we explain the seven categories in detail, pointing out the challenges they
present and providing possible solutions.
3.1 Purpose and gain
The expectations related to improving learning and teaching when talking about LA in higher education
are extremely high. However, at an institutional level, the line between LA and Academic Analytics is blurred. Therefore, it is advisable to distinguish between the various goals and perspectives of the different stakeholders, such as learners, educators, researchers, and administrators.
The goal of the learners is to improve their performance. LA supports this by providing adaptive feedback, recommendations, and individual responses to their learning performance (Romero & Ventura).
The educators are interested in understanding the students’ learning processes, understanding social,
cognitive and behavioral aspects, reflecting on their teaching methods and performance, as well as
optimizing their instructions to achieve a better learning outcome (Leitner et al., 2017). They want to be
able to assess the students' activities more effectively and draw conclusions to find out where they need to
take more action to improve the students’ learning performance.
Researchers use the data to develop theoretical models for new and improved teaching and learning
methods. This includes pursuing the goal to predict future learning paths and support the needs of learners
more appropriately. Educational technologists and researchers in the field of pedagogy review existing didactical models and develop new ones by carrying out field studies in classrooms. For this
reason, they conduct research continuously and adapt LA techniques based on the data collected to meet
the new expectations of the younger generation.
Administrators are interested in implementing their agendas in a more efficient environment. Their
aim is to offer students a more pleasant and efficient learning environment. Additional goals are to reduce
the failure rates and numbers of drop-outs, increase performance and, thus, optimize and improve the
curricula. The government is responsible for the enforcement of data privacy and data protection issues.
Challenges may occur when dealing with the different stakeholders. If stakeholders are confronted with hard facts without being asked for their thoughts and opinions first, they may rebel. Additionally, despite the generally positive intentions of those introducing LA into institutions, stakeholders often have their own ideas about LA. Students and teachers might be afraid that the results of the analytics, if made public, would put them in a bad position. Even worse, representatives of the different stakeholders may pursue their own agenda and expect to use the results to expose their counterparts. Therefore, it is necessary to make the goals of the LA initiative transparent, clarifying exactly what is going to happen with the information and, explicitly, what is not.
When the purpose of the LA initiative is very clear from the beginning, this does not seem to be a problem. However, if it is not, the situation might become complicated when attempting to explain the LA initiative to the stakeholders. Fuzziness not only puts you in a weak negotiating position, but can also become a major problem when stakeholders try to win you over to their side. Therefore, implementers need to specify and adhere to the ultimate objective of the LA initiative.
3.2 Representation and actions
The purpose of LA is to use the data collected to optimize the students' learning processes and improve
teaching. The aim is to make learning itself more predictable and visible. Actions derived from this can
serve as a basis for developing a catalogue of measures to support risk groups and provide them with
better assistance during their study. Based on this, recommendations are made to support learners and
encourage them to reflect on their behaviors. The information is provided within a suitable environment and clearly visualized as part of the student's personalized learning process. The personalization of the working environment and the associated advantages are placed in the foreground.
This should have the effect of motivating the learner in terms of improving their attitude. The feedback
received is intended to stimulate reflection and lead to a shift in goals and the associated improvement in
learning success.
Choosing the right environment for the learner’s feedback and the correct visualization technique can
present a large challenge for all parties involved. Due to the quantity of data harvested and the focus
placed on quantitative metrics, teachers sometimes consider LA to be antithetical to an educational sense
of teaching. Dashboards with performance metrics are becoming increasingly popular in these contexts
(Clow, 2013). The interpretation of this data can sometimes seem incredibly difficult if it has not been
properly prepared before it is presented to the student. Therefore, it can be better not to provide the student with all information related to the learning outcome directly; instead, a mentor can discuss the results with the student. However, university staff who act as mentors need specialized training in data interpretation, as well as pedagogical and psychological skills, to discuss the results with students and provide deeper insights into the data.
3.3 Data
Universities and schools are constantly analyzing data from their students for a variety of reasons. LA can,
therefore, be seen as an innovative continuation of this principle, applied to make use of the advantages of
modern technology and the various data sources available today. The data can be examined and analyzed
for their impact in the learning context to improve the quality of learning and teaching, as well as enhance
the chances of the students’ success. Of course, universities require the individual’s permission to collect
and evaluate sensitive data for the purpose of LA. Students must be made aware of the purpose of
collecting and the process of analyzing the data. Consent is mandatory for the use of these data, which
then can be used as a basis for strategic decisions by the various stakeholders. Teachers can monitor their students and analyze their behavior and actions during interactions with the learning management system, providing insights into their learning culture, for example, whether the students submitted all assignments or how actively they engage in their studies. Derived models can be used to provide better student support so that students can reach their goals more efficiently.
Students leave various data traces while using the university infrastructure. The data collected will be
used together with statistical models and methods for the purpose of LA when a benefit for student
learning is expected. Students may want to know why they have been categorized as potential risk candidates in specific courses. Therefore, the data and models used must be communicated and explained to them by trained staff in a comprehensible way to provide them with guidance. Access to these data must be secured, and only a small number of staff members should have access permissions to students' data. The institutions must enact policies that address data protection and access, and students must be informed of who has access to the data.
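The access rules described above can be made concrete with a small sketch. The roles, record fields, and logging scheme here are hypothetical illustrations, not part of any institutional policy cited in this chapter.

```python
# Hypothetical sketch: only a small set of roles may read student data,
# and every access is logged so students can be told who accessed what.

AUTHORIZED_ROLES = {"mentor", "data_steward"}

access_log = []  # (user, student_id) pairs, kept to inform students on request

def fetch_student_record(user, role, records, student_id):
    """Return a student's record only for authorized roles; log the access."""
    if role not in AUTHORIZED_ROLES:
        raise PermissionError(f"role '{role}' may not access student data")
    access_log.append((user, student_id))
    return records[student_id]
```

In this sketch an unauthorized role simply raises a `PermissionError`, and the access log provides the transparency toward students that the paragraph above calls for.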
The data used not only have an impact on the individual student, but also influence the practice of teaching at the university. Therefore, the data have to be re-evaluated over time and adjusted to meet
new demands. Furthermore, to ensure the best support and quality of the data, students need to keep their
data up-to-date. Giving them the (proactive) opportunity to check and update their data supports them and
the university during this process. Additionally, all of these points must comply with the GDPR and local
data protection acts.
A policy needs to be created for LA that aligns with the organization’s core principles. Transparent
communication about where the data are stored, what is being done to ensure data security and privacy
and how the data are evaluated and used (and by whom) is essential. Responsible handling of the students’
data by all stakeholders, especially university staff members, is important. Further training and skill-
building for responsible tutors/mentors in interpretation of the students’ data and action-taking in this
context are required. Interventions based on the collected data should be recommended to students and must be delivered in a transparent and comprehensible way (e.g., disclosing which methods and models have been used) to ensure broad student acceptance and engagement. Students need to clearly understand how the data are interpreted and manipulated and which techniques are used to ensure optimal handling and verifiable recommendations.
3.4 IT infrastructure
IT infrastructure refers to a set of information technology (IT) components, such as hardware, software,
network resources and services that are the foundation for the operation and management of an enterprise
IT environment (Laan, 2011). This infrastructure allows organizations in higher education to deliver IT services to their students, teachers, and administrative staff. This IT infrastructure is usually internal and
deployed with in-house facilities, but it is possible to commission an external provider. However, IT
infrastructure is the basis for any LA measurements and, therefore, has to be considered carefully.
Why is it important to think about the IT infrastructure? To understand its relevance, it is necessary to know where the data are located. Here, we can distinguish between two different scenarios. First, the
data are stored and processed in a university-owned service center. In this case, the responsibilities and liabilities lie with the university itself, and national and organizational rules must be obeyed. This
scenario has the advantage that access to the data and the data ownership are located at the university,
which makes it easier to work with the data. However, it also presents some disadvantages, such as the
fact that initiatives with special technology requirements need to comply with the standardized rules held
by the internal service provider. Also, the cost-benefit ratio should be kept in mind, because hosting and supporting services in-house might be more expensive than outsourcing them.
The second scenario concerns working with external service providers. In this scenario, individual
solutions can be applied, as many providers are available that might meet the specific needs. In contrast to
the internal service center of a university, external service providers can concentrate their efforts on their
smaller and highly specialized digital product. Furthermore, the costs that arise can easily be estimated
and should be much lower than providing a private, individual solution. The negative aspects of working
with an external service provider are related to issues of access and data ownership as well as meeting the
necessary security standards when working with sensitive data, such as student performance data.
Regardless of whether one works with an internal or external service provider, it takes time to establish
the appropriate basis. Therefore, efforts should be made from the beginning to search for possible
solutions to set up the necessary IT infrastructure and contact and establish connections with relevant
people (Leitner et al., 2018). This will save time and resources when the implementation of an LA
initiative becomes critical.
3.5 Development and operation
This category combines the process of developing and operating LA initiatives. It includes a wide range of
different developments, from designing a simple questionnaire to developing an enterprise software
solution. Additionally, the activities cover research on and the development, prototyping, modification,
reuse, re-engineering, monitoring and maintenance of LA initiatives or projects.
Once the first prototype has been produced, implemented in a real-life context and evaluated, the
discussion can proceed to the next step. How can the prototype be realized? How can it move from the
prototype phase to the production phase? These are quite critical questions because new tasks and
challenges will appear. For example, the scalability of the implementation has to be taken into account.
The number of learners may vary widely, which can demand a completely new concept for the existing IT infrastructure. Furthermore, processes which were first created manually must be redefined so that they can be performed at least semi-automatically or fully automatically.
Even when student data are stored, this is typically done via different information systems. Normally, several information systems are responsible for performing different tasks and, therefore, store the data in different formats, on different servers, and with different data owners. The effort required to retrieve and consolidate all data can be stressful and tedious. Additionally, converting raw data into a useful format can be another big challenge. This is a highly complicated process, which needs thorough planning and a consistent final implementation. The implementation should work in different layers and should preferably be structured in a modular manner, so that changes in the circumstances of the associated information systems can easily be accommodated.
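A modular, layered implementation of the kind suggested above can be sketched as one small adapter per information system plus a merge step. All system names and field names below are invented for illustration.

```python
# Hypothetical sketch: one adapter per source system converts its export
# format into a common record; a merge step combines records by student.

def from_lms(row):
    # e.g. an LMS export row: {"user": "s001", "logins": "17"}
    return {"student_id": row["user"], "activity": int(row["logins"])}

def from_registry(row):
    # e.g. a registry export row: {"matric_no": "s001", "ects": 24.0}
    return {"student_id": row["matric_no"], "credits": row["ects"]}

def merge_by_student(*record_lists):
    """Combine normalized records from all sources, keyed by student_id."""
    merged = {}
    for records in record_lists:
        for rec in records:
            merged.setdefault(rec["student_id"], {}).update(rec)
    return merged

lms = [from_lms({"user": "s001", "logins": "17"})]
reg = [from_registry({"matric_no": "s001", "ects": 24.0})]
combined = merge_by_student(lms, reg)
```

If one source system changes its export format, only that system's adapter layer needs to be touched, which is the benefit of the modular design argued for above.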
From the first stages of any learning measurement, we suggest that the scope be specified in detail. Will LA be established merely for testing and to obtain initial impressions, or will it be implemented at a university-wide level? Scalability is perhaps one of the most frequently underestimated problems in today's IT industry. Furthermore, we strongly recommend planning the LA implementation beforehand, so that the costs can be estimated as precisely as possible. A distinction must be made as to whether processes have to be carried out manually, semi-automatically, or fully automatically.
3.6 Privacy
The term privacy refers to an intrinsic part of a person's identity and integrity and constitutes one of the basic human rights in developed countries, where it is an established element of the legal system (Drachsler & Greller, 2016). All LA implementations have to ensure the privacy of the involved parties.
Learners must trust the final systems and, therefore, keeping information private is of the utmost
importance. Additionally, depending on the country where the higher education institution is situated,
different regulations in addition to the General Data Protection Regulation (GDPR), which is applicable in
Europe, are enforced. Organizations have to deal with different tasks while finding a suitable legal framework that covers the GDPR. Alternatively, they can start by minimizing the data harvested and/or by taking actions to anonymize or pseudonymize their data.
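Data minimization and pseudonymization, as mentioned above, can be sketched in a few lines. The salt handling and field whitelist are simplified assumptions for illustration; true anonymization (cf. Khalil & Ebner, 2016) requires considerably more care.

```python
import hashlib

# Data minimization: keep only the fields strictly needed for the analysis.
ALLOWED_FIELDS = {"activity", "credits"}

def pseudonymize(student_id: str, secret_salt: str) -> str:
    """Replace a student ID with a stable pseudonym; without the secret
    salt (stored separately and access-restricted) it cannot be reversed."""
    return hashlib.sha256((secret_salt + student_id).encode()).hexdigest()[:12]

def minimize(record: dict) -> dict:
    """Drop all fields that are not strictly needed for the analysis."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

The pseudonym is stable, so longitudinal analyses remain possible, while re-identification requires the separately guarded salt; combining this with `minimize` reduces what can leak if the analytical data set is exposed.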
Nevertheless, even when keeping privacy in mind when handling LA initiatives from the beginning, the
situation can become highly complex. For example, by merging different data sources, new and surprising
results can be visualized and, therefore, new insights which were never intended can be provided. Because universities are large institutions, there is a high risk that unauthorized people could gain access to these data interpretations.
Another large problem that is closely related to privacy is the fact that a person/learner is reduced to
their stored data. Society is made up of individuals, so every situation has to be considered in a
differentiated way. For example, an activity profile can be created, but we will never know exactly how
those activities actually took place and to what extent. The reduction of people to categories and profiles can be particularly dangerous, because a learning individual could be reduced to a mere handful of parameters.
Since society seems to like to fall back on so-called facts, the derivation of causal connections on the basis
of learning algorithms always needs to be critically questioned. This also means that gaps in data need to
be analyzed and handled.
Finally, the general lifetime of personal data is a topic that requires further discussion. The data may be interesting at the time the activities and learning outcomes are relevant, but may no longer be relevant in the future. Keeping the data can be justified for the purposes of training algorithms and machine learning, and a larger data resource can also enable improvements. However, these steps should only be carried out with absolute anonymization of the data; Khalil and Ebner (2016) have shown how this can be done.
First, privacy is a fundamental right of every person and must be respected. This means that any LA implementation must take it into account from the very beginning. However, this is often difficult, or perhaps not clarified at all, because complex situations can arise as a result of data mergers. Therefore, we suggest working with the highest possible level of transparency, because this builds confidence: learners know what happens to their data and what statements can be made from them. At the same time, unauthorized people must not be allowed to access the data, and the personnel need to be well trained in data interpretation as well as in dealing with questions about privacy. If doubts with regard to privacy arise, the LA measure in question must be omitted.
Finally, we would like to point out once again that the mere use of data, i.e., "facts", will not be sufficient to adequately represent a situation as complex as a learning process. LA is only an auxiliary tool that can be used to gain a better understanding of this process.
3.7 Ethics
Ethics is defined as a moral code of norms and conventions that involves systematizing, defending, and recommending concepts of right and wrong conduct; it exists externally to a person, in society (Drachsler & Greller, 2016). In the context of LA, various ethical and practical concerns arise because of the potential to harvest personalized data and intervene at an individual level (Prinsloo & Slade, 2015). These concerns pose a major challenge for the implementation of LA initiatives.
Additionally, working with sensitive data presents a particular challenge. Sensitive data includes information on medical conditions, financial circumstances, religious beliefs, or sexual orientation, but also information about student performance. If made public, such information could result in harm to the person concerned. Therefore, it is necessary to restrict who has access to the information and for which purpose(s) it may be used.
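Such restrictions can be enforced technically with a deny-by-default, purpose-based access policy: each role may read a data category only for purposes explicitly whitelisted for it. The roles, categories, and purposes below are hypothetical examples, not a prescribed scheme:

```python
# Hypothetical policy: (role, data category) -> allowed purposes.
# Anything not listed is denied.
ACCESS_POLICY = {
    ("course_instructor", "performance"): {"feedback", "grading"},
    ("study_advisor", "performance"): {"counselling"},
    ("researcher", "performance"): {"research_anonymized"},
}

def may_access(role: str, category: str, purpose: str) -> bool:
    """Deny by default; allow only role/category/purpose triples
    that are explicitly listed in the policy."""
    return purpose in ACCESS_POLICY.get((role, category), set())

allowed = may_access("course_instructor", "performance", "feedback")
denied = may_access("researcher", "performance", "grading")
```

Keeping the policy explicit and auditable also makes it easier to show stakeholders exactly who may see which data and why.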
Several questions arise when looking at data from an ethical point of view. First, which data about a person, whether student or educator, may be harvested, used, and processed? Second, which information can be communicated to whom, and what may be the resulting consequences? These concerns are growing because LA improves the accuracy of predictions for different learning profiles by combining different data sources. LA implementers must find a suitable way to meet high ethical standards and ensure a beneficial outcome for all stakeholders.
Another important point is the option for participants to opt in to, and opt out of, the harvesting, storing, and processing of their individual data. However, how should institutions relying on LA deal with students who exercise the right to opt out? When implementing LA in an institution, it is advisable to involve all stakeholders at an early stage in the process of creating the rules and the legal framework for the use of data. Transparency is key, as is understanding the different needs of the interest groups involved in the process. All intentions, goals, and benefits of harvesting and using the data have to be explained in a clear and comprehensible way to all stakeholders. Consent for using the data begins with the login into the system that tracks data from its users. During this process, the consent of all parties involved must be obtained, and the areas in which the data will be used must be clearly communicated. In these discussions, the possible interpretations of the provided information need to be described in order to prevent misunderstandings and incorrect decisions. As a precautionary measure, institutions can introduce codes of conduct and procedures that provide initial support on this subject.
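A minimal technical consequence of the opt-in/opt-out requirement is that every analytics pipeline must filter events against the current consent state before any processing. The sketch below assumes a hypothetical consent record per user; users without a record are treated as opted out, since opt-in by default would conflict with the GDPR's consent rules:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    user_id: str
    analytics_opt_in: bool

def filter_by_consent(events, consents):
    """Keep only events from users who have actively opted in.
    Users without any consent record are treated as opted out."""
    opted_in = {c.user_id for c in consents if c.analytics_opt_in}
    return [e for e in events if e["user_id"] in opted_in]

consents = [ConsentRecord("u1", True), ConsentRecord("u2", False)]
events = [{"user_id": "u1", "action": "login"},
          {"user_id": "u2", "action": "login"},
          {"user_id": "u3", "action": "login"}]  # u3: no consent record

tracked = filter_by_consent(events, consents)  # only u1's event remains
```

In a real system the filter would run at ingestion time, and a later opt-out would additionally trigger deletion of already-stored events for that user.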
At the Learning Analytics and Knowledge conference 2018 (LAK18) in Sydney, a draft code of ethics v1.0 was presented (Lang et al., 2018). This document may be considered a foundation for ethical matters when implementing LA initiatives. Additionally, legal counsel can offer advice when the interpretation of a topic or situation seems unclear. The European Learning Analytics Exchange (LACE) project offers workshops on ethics and privacy in LA (EP4LA) and also plays a key role in advancing the discussion of the ethical dilemmas of using LA.
4 Conclusion
Within higher education institutions, researchers are still full of enthusiasm and excitement about LA and its potential. Furthermore, LA is now at the point at which it affects research, practice, policy-making, and decision-making equally (Gašević et al., 2015).
However, to facilitate successful LA initiatives, a few things have to be kept in mind. In this chapter, we presented seven main criteria which can be used for initial orientation when implementing LA. The order of appearance was intentionally chosen, although the order of application depends on the specific situation of the respective institution.
We hope that the classification of the seven main criteria, the presented challenges and the approaches
that can be taken to overcome them will be helpful to implementers of LA initiatives. We are aware that
the presented examples cover only a small range of the challenges an implementer might encounter, but
we hope the results of this study can help researchers and educators understand the bigger picture and
become aware of other potential issues.
In future research, we plan to investigate the seven categories in more detail to identify further examples and to validate our framework in order to foster future LA measures.
References
Berg, A. M., Mol, S. T., Kismihók, G., & Sclater, N. (2016). The Role of a Reference Synthetic Data Generator within the Field
of Learning Analytics. Journal of Learning Analytics, 3(1), 107-128.
Campbell, J.P. (2007). Utilizing Student Data within the Course Management System to Determine Undergraduate Student
Academic Success: An Exploratory Study, PhD, Purdue University.
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.
Cormack, A. N. (2016). A data protection framework for learning analytics. Journal of Learning Analytics, 3(1), 91-106.
De Laet, T., Broos, T., Verbert, K., van Staalduinen, J-P., Ebner, M., & Leitner, P. (2018). Proceedings of the 8th International Conference on Learning Analytics & Knowledge. Sydney, pp. 602-606.
Drachsler, H., & Greller, W. (2016). Privacy and analytics - it’s a DELICATE issue: A checklist to establish trusted learning
analytics. Proceedings of the 6th International Conference on Learning Analytics and Knowledge, 89-96.
Duval, E. (2012). Learning Analytics and Educational Data Mining. In: Erik Duval's Weblog, 30 January 2012. Retrieved 16 March 2018.
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), 304-317.
Ferguson, R., Hoel, T., Scheffel, M., & Drachsler, H. (2016). Guest editorial: Ethics and privacy in learning analytics. Journal of
learning analytics, 3(1), 5-15.
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.
Hoel, T., & Chen, W. (2016). Privacy-driven design of learning analytics applications - exploring the design space of solutions for
data sharing and interoperability. Journal of Learning Analytics, 3(1), 139-158.
Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2012). The NMC Horizon Report: 2012 Higher
Education Edition. The New Media Consortium.
Khalil, M., & Ebner, M. (2015). Learning analytics: Principles and constraints. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications (pp. 1326-1336).
Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics, 3(1), 129-138.
Laan, S. (2011). IT Infrastructure Architecture: Infrastructure Building Blocks and Concepts. Lulu Press.
Lang, C., Macfadyen, L. P., Slade, S., Prinsloo, P., & Sclater, N. (2018, March). The complexities of developing a personal code
of ethics for learning analytics practitioners: implications for institutions and the field. In Proceedings of the 8th International
Conference on Learning Analytics and Knowledge (pp. 436-440). ACM.
Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education - a literature review. In Learning Analytics:
Fundaments, Applications, and Trends (pp. 1-23). Springer, Cham.
Leitner, P., & Ebner, M. (2017). In Zaphiris, P., & Ioannou, A. (Eds.), Learning and Collaboration Technologies. Technology in Education: 4th International Conference, LCT 2017, Held as Part of HCI International 2017, Vancouver, BC, Canada, July 9-14, 2017, Proceedings, Part II (pp. 293-301). Cham: Springer International Publishing.
Leitner, P., Broos, T., & Ebner, M. (2018). Lessons Learned when transferring Learning Analytics Interventions across
Institutions. In Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (pp. 621-629).
Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30. Retrieved 16 March 2018.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Embracing big data in complex educational systems: The
learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9, 17-28.
Murray, P. M. (1990). The history of informed consent. The Iowa Orthopaedic Journal, 10, 104-109.
Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. In Proceedings of the fifth
international conference on learning analytics and knowledge (pp. 83-92). ACM.
Prinsloo, P., & Slade, S. (2016). Student vulnerability, agency and learning analytics: an exploration. Journal of Learning
Analytics, 3(1), 159-182.
Rodríguez-Triana, M. J., Martínez-Monés, A., & Villagrá-Sobrino, S. (2016). Learning analytics in small-scale teacher-led
innovations: ethical and data privacy issues. Journal of Learning Analytics, 3(1), 43-65.
Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1), 12-27.
Sclater, N. (2016). Developing a Code of Practice for learning analytics. Journal of Learning Analytics, 3(1), 16-42.
Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., Duval, E., Verbert, K., & Baker, R. S. J. d. (2011). Open learning analytics: An integrated & modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques.
Steiner, C. M., Kickmeier-Rust, M. D., & Albert, D. (2016). LEA in private: a privacy and data protection framework for a
learning analytics toolbox. Journal of Learning Analytics, 3(1), 66-90.
Tsai, Y. S., & Gasevic, D. (2017). Learning analytics in higher education - challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
... Зазначено, що для успішності проєктів навчальної аналітики студенти повинні дати інформовану згоду на доступ до даних про їх дії в електронних середовищах навчання [20]. Важливим моментом є надання можливості вибору та відмови учасників від збирання, зберігання та оброблення індивідуальних даних стосовно їх особи. ...
Full-text available
Стаття присвячена дослідженню проблем впровадження Learning Analytics – навчальної аналітики у сферу вищої освіти. Розкрито зміст поняття «Learning Analytics», проаналізовано досвід її впровадження у діяльність вищих навчальних закладів країн світу. Установлено, що навчальна аналітика як галузь наукового дослідження є поєднанням інформаційних технологій, цифрового викладання і навчання та методів інтелектуального аналізу даних, що обумовлює специфіку її формування та проблематику. Виявлено задачі, які вона дозволяє розв’язувати стосовно різних аспектів електронного навчання: прогнозування, виявлення структури, виявлення зв’язків та асоціацій на основі аналізу цифрових слідів студентів у освітніх електронних середовищах. З’ясовано перспективні напрями досліджень на сучасному етапі. Установлено, що впровадження у діяльність закладів вищої освіти основних типів навчальної аналітики: описової, прогностичної та пропонуючої, дає можливість отримувати інформацію про поточний стан електронного навчання та оперативно приймати рішення стосовно його корекції й оптимізації. Сформульовано перелік проблем, пов’язаних зі стратегічним плануванням і політикою впровадження навчальної аналітики у діяльність вищих навчальних закладів: недосконалість керівництва в реалізації проектів; нерівномірне залучення різних зацікавлених сторін; недостатній рівень педагогічних підходів при інтерпретації отримуваних даних; недостатній рівень підготовки персоналу; недостатня кількість досліджень, емпірично підтверджуючих вплив на ефективність навчального процесу; недосконалість нормативного регулювання. Показано, що ці проблеми є міждисциплінарними, а їх вирішення потребує тісної співпраці та узгоджених дій адміністраторів, ІТ-фахівців, викладачів та педагогів-дослідників упродовж усіх етапів реалізації проекту. 
Теоретично обґрунтовано пропозиції щодо заходів, спрямованих на подолання міждисциплінарного бар’єру у процесі розробки та експлуатації проектів навчальної аналітики: чіткість та прозорість цілей і ініціатив; задоволення потреб усіх зацікавлених сторін; забезпечення необхідної ІТ-інфраструктури; підготовка співробітників, які будуть надавати допомогу в інтерпретації отриманих результатів; забезпечення безпеки конфіденційних даних; розробка нормативних положень стосовно функціонування та використання навчальної аналітики.
The institutional adoption of learning analytics in the Netherlands is still low. This chapter presents a study on the challenges that Dutch higher educational institutions encounter when adopting learning analytics. The literature describes possible challenges regarding assets, data governance, data literacy, data quality, organizational culture, pedagogical grounding, privacy and ethics, and technical issues. Eight interviews with practitioners from four universities verified that all these challenges are causing problems for Dutch institutions as well. The practitioners provided recommendations on how to overcome these adoption challenges. Higher educational institutions need to demonstrate the value of learning analytics, provide users with training, clearly identify users' needs, and establish a ‘one-stop-shop' that acts as a single contact point within the organization. Combined with recommendations already present in the literature, this helps accelerate the successful adoption of learning analytics by Dutch higher educational institutions.
Full-text available
Learning analytics (LA) is an emerging area that has had extensive development in higher education in recent years, focused both on the learning process of students within subjects and on monitoring their trajectories in training programmes. However, most of the developments remain in the pilot phase without reaching institutional adoption. This paper reports the results of a systematic review of the literature carried out with the aim of identifying the factors that influence the adoption of LA, as well as the existing strategies that facilitate such adoption in higher education institutions. The results show that factors for LA adoption are situated in multiple dimensions, including stakeholders at different levels, institutional and pedagogical processes, technical limitations and ethical considerations. This work contributes with a consolidated list of 14 critical factors to be considered to effectively adopt LA tools, addressing planning, strong leadership, collaboration and prioritizing senior management commitment, goal setting, cross‐organizational design, educational process redesign, system integration and legacy system linkage. On the other hand, a variety of frameworks, models and approaches are distinguished in the literature to help the adoption of LA, however, none of them cover all the factors involved in such adoption. Therefore, we provide a compilation of strategies that have been used in the literature to reduce the gaps associated with the different factors described. Practitioner notes What is already known about this topic LA is a field of research that has had a lot of interest and a wide variety of tools have been developed. Still, it is criticized that they originate more from data availability than from the needs of students, teachers and decision makers. LA tools are mostly evaluated on their usability aspects and to a lesser extent on their usefulness to impact learning processes. 
The adoption processes of LA in higher education are oriented to specific aspects such as the relevance of the visualizations in the different contexts of application, but they do not address extensively other important aspects involved in such adoption. What this paper adds This article seeks to synthesize what has been investigated in the literature regarding the factors that affect the adoption of LA by higher education institutions. Describe the 14 critical factors identified from the literature to address adoption of multiple dimensions that include stakeholders at different levels, institutional contexts and ethical considerations. A compilation of strategies that have been used in the literature to reduce the gaps associated with the different factors are described, and the aspects or dimensions that are not addressed are highlighted. Implications for practice and/or policy The synthesis of LA adoption factors and the approach strategies present in the literature allow practitioners to focus their efforts on the key aspects to consider in their LA adoption processes. Future work could shift from a pure design perspective to strengthen an engineering perspective in the adoption practices. From the point of view of the policies associated with the adoption of LA in higher education institutions, it is concluded that it is necessary to approach the process from a comprehensive perspective considering all the dimensions involved.
Purpose This study aims to focus on providing a computerized classification testing (CCT) system that can easily be embedded as a self-assessment feature into the existing legacy environment of a higher education institution, empowering students with self-assessments to monitor their learning progress and following strict data protection regulations. The purpose of this study is to investigate the use of two different versions (without dashboard vs with dashboard) of the CCT system during the course of a semester; to examine changes in the intended use and perceived usefulness of two different versions (without dashboard vs with dashboard) of the CCT system; and to compare the self-reported confidence levels of two different versions (without dashboard vs with dashboard) of the CCT system. Design/methodology/approach A total of N = 194 students from a higher education institution in the area of economic and business education participated in the study. The participants were provided access to the CCT system as an opportunity to self-assess their domain knowledge in five areas throughout the semester. An algorithm was implemented to classify learners into master and nonmaster. A total of nine metrics were implemented for classifying the performance of learners. Instruments for collecting co-variates included the study interest questionnaire (Cronbach’s a = 0. 90), the achievement motivation inventory (Cronbach’s a = 0. 94), measures focusing on perceived usefulness and demographic data. Findings The findings indicate that the students used the CCT system intensively throughout the semester. Students in a cohort with a dashboard available interacted more with the CCT system than students in a cohort without a dashboard. Further, findings showed that students with a dashboard available reported significantly higher confidence levels in the CCT system than participants without a dashboard. 
Originality/value The design of digitally supported learning environments requires valid formative (self-)assessment data to better support the current needs of the learner. While the findings of the current study are limited concerning one study cohort and a limited number of self-assessment areas, the CCT system is being further developed for seamless integration of self-assessment and related feedback to further reveal unforeseen opportunities for future student cohorts.
Full-text available
Human-centred design is a well-established approach within research fields such as human-computer interaction, ergonomics, and human factors. Recently Learning Analytics (LA) researchers and practitioners have manifested great interest in exploring methods and techniques associated with this approach to manage the design process in ways that can enhance human interaction with LA technology. The project “Learning Analytics – Students in Focus” aims to use student-related data to support the learning and teaching process in a higher educational context. Our interdisciplinary team investigates LA tools that leverage students’ academic success by acquiring or developing self-regulated learning skills. We adopted a Human-Centred Learning Analytics (HCLA) approach involving students, teachers, and other educational stakeholders in the iterative design of our LA tools. This article contributes to the discussion on how to design LA tools using a human-centred approach. We describe the analysis, design, implementation, and evaluation process of three LA tools comprised in our students’ dashboard, i.e., the planner, the activity graph, and the learning diary. In addition, we present key results gained in several empirical studies which had an implication on the tools’ design. Finally, we provide insights about our experience with the HCLA approach, pointing out benefits and limitations in practice.KeywordsHuman-centred learning analyticsSelf-regulated learningLearning analytics dashboard
Full-text available
Background Learning Analytics (LA) is an emerging field concerned with measuring, collecting, and analysing data about learners and their contexts to gain insights into learning processes. As the technology of Learning Analytics is evolving, many systems are being implemented. In this context, it is essential to understand stakeholders' expectations of LA across Higher Education Institutions (HEIs) for large‐scale implementations that take their needs into account. Objectives This study aims to contribute to knowledge about individual LA expectations of European higher education students. It may facilitate the strategy of stakeholder buy‐in, the transfer of LA insights across HEIs, and the development of international best practices and guidelines. Methods To this end, the study employs a ‘Student Expectations of Learning Analytics Questionnaire’ (SELAQ) survey of 417 students at the Goethe University Frankfurt (Germany) Based on this data, Multiple Linear Regressions are applied to determine how these students position themselves compared to students from Madrid (Spain), Edinburgh (United Kingdom) and the Netherlands, where SELAQ had already been implemented at HEIs. Results and Conclusions The results show that students' expectations at Goethe University Frankfurt are rather homogeneous regarding ‘LA Ethics and Privacy’ and ‘LA Service Features’. Furthermore, we found that European students generally show a consistent pattern of expectations of LA with a high degree of similarity across the HEIs examined. European HEIs face challenges more similar than anticipated. The HEI experience with implementing LA can be more easily transferred to other HEIs, suggesting standardized LA rather than tailor‐made solutions designed from scratch.
Full-text available
Schön, Sandra; Leitner, Philipp; Lindner, Jakob & Ebner, Martin (2023). Learning Analytics in Hochschulen und Künstliche Intelligenz. Eine Übersicht über Einsatzmöglichkeiten, erste Erfahrungen und Entwicklungen von KI-Anwendungen zur Unterstützung des Lernens und Lehrens. In: Tobias Schmohl, Alice Watanabe, Kathrin Schelling (Hg.), Künstliche Intelligenz in der Hochschulbildung, Bielefeld: transkript, S. 27-49. Online zugänglich unter: Erschienen unter der Lizenz CC BY SA 4.0 International ( nses/by-sa/4.0/
This article aims at offering an overview of educational data mining and learning analytics and highlighting their essential role in improving 21st century education. Hence, it analyzes their concepts, goes over the recent literature, and extracts invaluable information according to the results and outcomes of related studies. Furthermore, it discusses their use in educational settings, examines their benefits, and suggests ways to address some of the current open challenges, issues, and limitations. In addition, it presents the summary of the main findings, draws conclusions, and provides directions for future research. Based on the results, educational data mining and learning analytics fields can significantly influence the current educational system and provide opportunities for new learner-centered tools and smart learning environments that provide customized experiences and meet students' specific needs to be developed. Finally, it was evident that for the complete adoption of these fields, a data-driven culture should be followed by the educational institutions.
In der Wissenschaft sind Erkenntnisziele, aber auch ein spezieller Weltaufschluss angelegt. Diesen zu vermitteln, ist Aufgabe der Wissenschaftsdidaktik. Was aber bedeutet es, Wissenschaft institutionell zu einem Gegenstand des Lehrens und Lernens zu machen? Die Beitragenden des Bandes liefern eine disziplinenübergreifende Einführung in die Wissenschaftsdidaktik, die sich mit grundlegenden konzeptionellen Fragen sowie Einordnungs- und Deutungsversuchen aus verschiedenen Perspektiven befasst. Hochschullehrende sowie praktisch und forschend tätige Personen in der Bildungswissenschaft finden hier leichten Zugang zur Wissenschaftsdidaktik und ihren innovativen Erkenntnispotenzialen.
Conference Paper
Full-text available
Learning Analytics is a promising research field, which is advancing quickly. Therefore, it finally impacts research, practice, policy, and decision making in the field of education. Nonetheless, there are still influencing obstacles when establishing Learning Analytics initiatives on higher education level. Besides the much discussed ethical and moral concerns, there is also the matter of data privacy. In 2015, the European collaboration project STELA started with the main goal to enhance the Successful Transition from secondary to higher Education by means of Learning Analytics. Together, the partner universities develop, test, and assess Learning Analytics approaches that focus on providing feedback to students. Some promising approaches are then shared between the partner universities. Therefore, the transferability of the Learning Analytics initiatives is of great significance. During the duration of our project, we found a variety of difficulties, we had to overcome to transfer one of those Learning Analytics initiatives, the Learning Tracker from one partner to the other. Despite, some of the difficulties can be categorized as small, all of them needed our attention and were time consuming. In this paper, we present the lessons learned while solving these obstacles.
Conference Paper
Full-text available
This article introduces the goal and activities of the LAK 2018 half-day workshop on the involvement of stakeholders for achieving learning analytics at scale. The goal of the half-day workshop is to gather different stakeholders to discuss at-scale learning analytics interventions. In particular the workshop focuses on learning analytics applications and learning dashboards that go beyond the implementation in a single course or context, but that have at least the potential for scaling across different courses, programs, and institutes. The main theme of the workshop is to explore how the involvement of different stakeholders can strengthen or hinder learning analytics at scale. The key findings, recommendations, and conclusions of the workshop will be presented in a summarizing report, which will be shaped as a SWOT analysis for stakeholder involvement for achieving learning analytics at scale.
Conference Paper
Full-text available
In this paper, we discuss the design, development, and implementation of a Learning Analytics (LA) dashboard in the area of Higher Education (HE). The dashboard meets the demands of the different stakeholders, maximizes the mainstreaming potential and transferability to other contexts, and is developed in the path of Open Source. The research concentrates on developing an appropriate concept to fulfil its objectives and finding a suitable technology stack. Therefore, we determine the capabilities and functionalities of the dashboard for the different stakeholders. This is of significant importance as it identifies which data can be collected, which feedback can be given, and which functionalities are provided. A key approach in the development of the dashboard is the modularity. This leads us to a design with three modules: the data collection, the search and information processing, and the data presentation. Based on these modules, we present the steps of finding a fitting Open Source technology stack for our concept and discuss pros and cons trough out the process.
Conference Paper
Full-text available
This paper presents the results of a review of eight policies for learning analytics of relevance for higher education, and discusses how these policies have tried to address prominent challenges in the adoption of learning analytics, as identified in the literature. The results show that more considerations need to be given to establishing communication channels among stakeholders and adopting pedagogy-based approaches to learning analytics. It also reveals the shortage of guidance for developing data literacy among end-users and evaluating the progress and impact of learning analytics. Moreover, the review highlights the need to establish formalised guidelines to monitor the soundness, effectiveness, and legitimacy of learning analytics. As interest in learning analytics among higher education institutions continues to grow, this review will provide insights into policy and strategic planning for the adoption of learning analytics.
Full-text available
This chapter looks into examining research studies of the last five years and presents the state of the art of Learning Analytics (LA) in the Higher Education (HE) arena. Therefore, we used mixed-method analysis and searched through three popular libraries, including the Learning Analytics and Knowledge (LAK) conference, the SpringerLink, and the Web of Science (WOS) databases. We deeply examined a total of 101 papers during our study. Thereby, we are able to present an overview of the different techniques used by the studies and their associated projects. To gain insights into the trend direction of the different projects, we clustered the publications into their stakeholders. Finally, we tackled the limitations of those studies and discussed the most promising future lines and challenges. We believe the results of this review may assist universities to launch their own LA projects or improve existing ones.
Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organisation which champions the use of digital technologies in UK education and research, has attempted to address this with the development of a Code of Practice for Learning Analytics. The Code covers the main issues institutions need to address in order to progress ethically and in compliance with the law. This paper outlines the extensive research and consultation activities which have been carried out to produce a document which covers the concerns of institutions and, critically, the students they serve. The resulting model for developing a code of practice includes a literature review, setting up appropriate governance structures, developing a taxonomy of the issues, drafting the code, consulting widely with stakeholders, publication, dissemination, and embedding it in institutions.
This paper details the anticipated impact of synthetic 'big' data on learning analytics (LA) infrastructures, with a particular focus on data governance, the acceleration of service development, and the benchmarking of predictive models. By reviewing two cases, one at the sector-wide level and the other at the institutional level - the Jisc learning analytics architecture and the UvAInform learning analytics project running at the University of Amsterdam - we explore the need for an on-demand tool for generating a wide range of synthetic data. We argue that the application of synthetic data will not only accelerate the creation of complex and layered learning analytics infrastructure but also help to address the ethical and privacy risks involved during service development.
As a further step towards maturity, the field of learning analytics (LA) is working on the definition of frameworks that structure the legal and ethical issues that scholars and practitioners must take into account when planning and applying LA solutions to their learning contexts. However, current efforts in this direction tend to be focused on institutional higher education approaches. This paper reflects on the need to extend these ethical frameworks to cover other approaches to LA; more concretely, small-scale classroom-oriented approaches that aim to support teachers in their practice. This reflection is based on three studies where we applied our teacher-led learning analytics approach in higher education and primary school contexts. We describe the ethical issues that emerged in these learning scenarios, and discuss them according to three dimensions: the overall learning analytics approach, the particular solution to learning analytics adopted, and the educational contexts where the analytics are applied. We see this effort as a first step towards the wider objective of providing a more comprehensive and adapted ethical framework to learning analytics that is able to address the needs of different learning analytics approaches and educational contexts.
Studies have shown that issues of privacy, control of data, and trust are essential to the implementation of learning analytics systems. If these issues are not addressed appropriately, systems will tend to collapse due to a legitimacy crisis, or they will not be implemented in the first place due to resistance from learners, their parents, or their teachers. This paper asks what it means to give priority to privacy in terms of data exchange and application design, and offers a conceptual tool, a Learning Analytics Design Space model, to ease requirement solicitation and design for new learning analytics solutions. The paper argues the case for privacy-driven design as an essential part of learning analytics systems development. A simple model defining a solution as the intersection of an approach, a barrier, and a concern is extended with a process focussing on design justifications to allow for incremental development of solutions. This research is exploratory in nature, and further validation is needed to prove the usefulness of the Learning Analytics Design Space model.
Conference Paper
In this paper, we explore the potential role, value, and utility of a personal code of ethics (COE) for learning analytics practitioners, and in particular we consider whether such a COE might usefully mediate individual actions and choices in relation to a more abstract institutional COE. While several institutional COEs now exist, little attention has been paid to detailing the ethical responsibilities of individual practitioners. To investigate the problems associated with developing and implementing a personal COE, we drafted an LA Practitioner COE based on other professional codes and invited feedback from a range of learning analytics stakeholders and practitioners: ethicists, students, researchers, and technology executives. Three main themes emerged from their reflections: 1. the need to balance real-world demands with abstract principles, 2. the limits of individual accountability within the learning analytics space, and 3. the continuing value of debate around an aspirational code of ethics within the field of learning analytics.