Draft, originally published in: Leitner P., Ebner M., Ebner M. (2019) Learning Analytics Challenges to Overcome
in Higher Education Institutions. In: Ifenthaler D., Mah DK., Yau JK. (eds) Utilizing Learning Analytics to Support
Study Success. Springer, Cham
LEARNING ANALYTICS CHALLENGES TO OVERCOME IN
HIGHER EDUCATION INSTITUTIONS
Philipp Leitner, Graz University of Technology, Graz, Austria, e-mail: philipp.leitner@tugraz.at
Markus Ebner, Graz University of Technology, Graz, Austria
Martin Ebner, Graz University of Technology, Graz, Austria
Abstract: While a large number of scientific publications explain the development of prototypes or the implementation of
case studies in detail, descriptions of the challenges and proper solutions when implementing learning analytics
initiatives are rare. In this chapter, we provide a practical tool that can be used to identify risks and challenges that
arise when implementing LA initiatives and discuss how to approach these to find acceptable solutions. In this
way, implementers are given the opportunity to handle challenges early on and avoid being surprised at a critical
moment in the project, which will save time, resources and effort. We are aware that all aspects needed to
successfully carry out learning analytics initiatives are co-dependent. Nonetheless, we identified and categorized
the criteria necessary for implementing successful LA initiatives. We conclude this chapter with an overview of
the challenges faced and possible approaches that can be taken to facilitate the successful implementation of
Learning Analytics.
Key words: learning analytics, implementation, challenge, higher education
1. INTRODUCTION
Over the past decade, Learning Analytics (LA) has received more and more attention as a rapidly growing and promising research field in the area of Technology-Enhanced Learning (TEL) (Ferguson, 2012; Khalil & Ebner, 2015). Since its first mention in the Horizon Report of 2012 (Johnson et al., 2012), different tools have been used and initiatives carried out concerning different aspects of LA. LA is now finally reaching the point at which it will affect research and practice, as well as policy- and decision-making (Gašević, 2015).
Currently, many different definitions of the term Learning Analytics are in use. Long & Siemens
(2011) defined it as “the measurement, collection, analysis, and reporting of data about learners and their
contexts, for purposes of understanding and optimizing learning and the environment in which it occurs.”
Duval (2012) summarized LA by saying “learning analytics is about collecting traces that learners leave
behind and using those traces to improve learning.” Despite the different approaches, all definitions of LA
indicate that it should provide actionable insights (Siemens et al., 2011).
Therefore, the purpose should remain in focus when implementing LA initiatives. Obviously, the
potential actions strongly depend on the utilization of data and the information contained. However, what
kind of data representation is necessary to implement LA in an institution, and what ethical and moral
aspects need to be considered? Currently, the member states of the European Union are particularly affected by the enforcement of the EU General Data Protection Regulation (GDPR) (Leitner et al., 2018).
The issues of data ownership and privacy are becoming increasingly significant (Drachsler & Greller,
2016). Therefore, the location and accessibility of the data needs to be kept in mind (Leitner et al., 2018).
For example, where is the data stored? On an internal or external server hosted by a service provider?
Additionally, many LA projects do not move past the prototype phase because of issues related to
transferability and scalability (Leitner et al., 2017). These aspects should already be considered at the
beginning of the development.
The goal of this study was to provide a practical tool that can be used to identify risks and challenges
that arise when implementing LA initiatives and how to approach these. This gives implementers the
opportunity to deal with these problems at an early stage and, thereby, not lose time or invest effort
needlessly later on when the realization of the initiative becomes critical. In this study, we identified and
categorized seven criteria for implementing successful LA initiatives. Although we are aware that these
areas are co-dependent, we addressed them individually throughout this study.
In the remainder of this chapter, we showcase relevant and related work, placing an emphasis on similar
research questions, and extract relevant problems that generally emerge during the implementation of LA
and identify possible solutions. In Section 3, an overview is provided of the seven areas that are most
significant when implementing LA projects. The reason behind choosing these areas is described in
greater detail. In subsections 3.1 to 3.7, we describe in more detail what we consider to be part of these areas and what we explicitly exclude, which challenges exist and which approaches to solving them appear promising. Finally, we conclude with a discussion and remarks about future work.
2. RELATED WORK
In this section, we present the results of a survey of previous work on possible challenges and solutions when implementing LA initiatives. We surveyed the literature for work on similar topics to determine how the authors met these challenges and what kinds of solutions and/or frameworks they used or proposed. The literature review of Leitner et al. (2017) showed that, in the last few years, numerous publications have described parts of the challenges summarized in our seven main categories. In her paper, Ferguson (2012) documented the concerns about ethics and privacy
which began to surface once tools used to analyze student data became more powerful and readily
available. She additionally addressed four challenges, one of which was the development of a set of
ethical guidelines. Prior to this, Campbell (2007) had already defined a framework for locating potential
areas of misunderstanding in LA, which he based on definitions, values, principles and loyalties. Later, to
clearly differentiate between ethics and privacy, Drachsler & Greller (2016) defined ethics as “the
philosophy of moral that involves systematizing, defending, and recommending concepts of right and
wrong conduct. In that sense, ethics is rather different to privacy. In fact, privacy is a living concept made
out of personal boundary negotiations with the surrounding ethical environment.” Ferguson et al. (2016)
summarized the challenges presented by the special issue of ethics and privacy in LA in 21 points, as
shown in Table 1.
Table 1. Learning Analytics Challenges and Dimensions (Ferguson et al., 2016)
[Add table 1 here]
The first six challenges are related to helping learners achieve success during their studies. Therefore, the
data should or – even better – must be complete, accurate and up-to-date. It is the learner’s responsibility
to ensure this. On the other hand, the institutions also have a responsibility to ensure a state-of-the-art,
valid and reliable evaluation process, which is carried out in an understandable way. Challenge seven, originally derived from the medical sciences (Murray, 1990), relates to the fact that informed consent is also needed today with regard to LA. Students should be involved as collaborators and, therefore, give their informed consent to data access. The analysis obtained from the data is then used
to support learning and improve the learner’s chances of success. Challenges eight to ten are concerned
with the rights and interests of students and teachers, as well as the responsibility held by educational
institutions to safeguard and protect these. Providing access to data, as well as allowing the possibility
to make corrections and/or file a complaint, also play important roles (Rodríguez-Triana, et al., 2016). The
next two challenges are concerned with providing equal access to education for everyone (Challenge
eleven) and a fair and equally-applied legal system for all citizens (Challenge twelve). Challenges thirteen
to nineteen are related to data protection and place a focus on the legal responsibility for data security. The
harvested data are the property of another person, and the institution must assure data protection and
security. The last two challenges are concerned with the privacy of data and how data should be used and
treated (cf. Ferguson et al., 2016).
To meet these challenges, the scientific community has already taken a variety of approaches with regard to data protection and ethics in connection with LA (Ferguson et al. 2016): For example, a code of conduct
was developed that can be used as a taxonomy of ethical, legal and logistical issues for LA (Sclater, 2016).
Rodríguez-Triana et al. (2016) expanded the recommendations of Sclater's (2016) code and added consent,
transparency, access, accountability, data protection, validity and avoidance of adverse effects. A
framework for privacy and data protection has been proposed by Steiner et al. (2016). Cormack (2016) has
published a paper which deals with European data protection practices, and in particular with the
transparent communication of data usage. The codes of conduct and frameworks developed so far have
been supplemented by Berg et al. (2016) with tools and approaches that enable us to put them into
practice. Khalil & Ebner (2016) focused on the de-identification and anonymization of data for analysis
within LA. An examination of the study conducted by Hoel & Chen (2016) shows that the discussion on
data exchange and Big Data in education is still at an early stage. Prinsloo & Slade (2016) addressed the
rights and problems of students as well as the supervising institutions, arguing that the primary
responsibility for LA system providers is to promote individual autonomy and provide each individual
learner with enough information to make informed decisions (cf. Ferguson et al. 2016). To help
institutions enter the area of LA, Drachsler & Greller (2016) developed a checklist (the DELICATE
checklist), which helps users identify and examine possible problems and obstacles that could hinder the
introduction of LA in the education sector in advance. The term DELICATE stands for the eight points
that need to be considered if one wants to use LA (see Drachsler & Greller, 2016).
In the context of the SHEILA (Supporting Higher Education to Integrate Learning Analytics) project, a
team of research and institutional leaders in LA is currently developing a policy framework for formative
assessment and personalized learning. They have used the Rapid Outcome Mapping Approach (ROMA)
and validated their outputs through case studies. Their focus has been placed on the development of a
policy agenda for higher educational institutions by taking advantage of direct engagement with the
different stakeholders (Macfadyen, 2014). Tsai & Gasevic (2017) identified several challenges related to
strategic planning and policy:
• Challenge 1 - Shortage of Leadership: The leadership lacks the capabilities to guarantee the
implementation of LA in the environment of the institution. Therefore, different stakeholders and their
interests must be taken into account to ensure their commitment to the topic. Otherwise, these stakeholders may block the initiative.
• Challenge 2 - Shortage of Equal Engagement: There are gaps between the various stakeholders within
institutions with regards to understanding LA. Teams who work in technical areas showed the highest
level of understanding, while other teams did not know much about LA. This can be seen as a barrier
for the institutional acceptance of LA.
• Challenge 3 - Shortage of Pedagogy-Based Approaches: When designing LA tools, it is also important
to include pedagogical approaches in the LA process. Institutions tend to focus more on technical
aspects rather than pedagogical aspects.
• Challenge 4 - Shortage of Sufficient Training: As highlighted in Challenge 2, there is a lack of
understanding of how LA can be beneficial to all stakeholders. A good staff training program, which
helps them improve their skill sets on this topic, is key to success.
• Challenge 5 - Shortage of Studies Empirically Validating the Impact: A budget must be allocated to
support LA. Therefore, senior staff members need a basis for making that decision. However,
the evaluation of the success of LA seems to be a challenging task.
• Challenge 6 - Shortage of Learning Analytics Specific Policies: Institutions have regulations regarding
data and ethics. However, few institutions have codes of practice for LA. This lack of clear guidance
regarding LA practice needs to be addressed.
Furthermore, Tsai & Gasevic (2017) reviewed eight policies (Jisc, LACE, LEA’s Box, NUS, NTU, OU,
CSU, USyd) concerning their suitability based on the six abovementioned challenges. Although the
policies partially lack pedagogical approaches, guidance for the development of data literacy and
evaluations of the effectiveness, they serve as valuable references for institutions interested in establishing
LA in their field of work. In particular, institutions interested in developing their own practice guidelines for LA can benefit from these findings (Tsai & Gasevic, 2017).
In our research, we found that several publications have focused on different aspects of this topic.
Overall, it can be said that the creation of clear guidelines based on a code of practice is needed when
planning to introduce LA in an institution. Our knowledge and thoughts are summarized in seven main
categories and presented in the next section.
3. SEVEN MAIN CATEGORIES FOR LA-IMPLEMENTATIONS
Bearing in mind the related work, the issues identified during previous research, as well as our own
experiences with implementing LA projects and initiatives in higher education (De Laet et al., 2018;
Leitner & Ebner, 2017; Leitner et al, 2018), we developed a framework for LA implementations. Based on
the results of the literature review and a workshop with LA specialists, stakeholders and researchers,
different issues were identified. These core issues were discussed, verified and categorized into seven main areas (Figure 1).
[Add Fig 1. here]
Figure 1: Framework with seven main categories for LA initiatives
In the following subsections, we explain the seven categories in detail, pointing out the challenges they
present and providing possible solutions.
3.1 Purpose and gain
The expectations related to improving learning and teaching when talking about LA in higher education
are extremely high. However, at an institutional level, the line between LA and Academic Analytics is
blurred. Therefore, it is advisable to distinguish between the various goals and perspectives of the different stakeholders, such as learners, educators, researchers and administrators.
The goal of the learners is to improve their performance. LA supports this by providing adaptive
feedback, recommendations and individual responses on their learning performance (Romero & Ventura,
2013).
The educators are interested in understanding the students’ learning processes, understanding social,
cognitive and behavioral aspects, reflecting on their teaching methods and performance, as well as
optimizing their instructions to achieve a better learning outcome (Leitner et al., 2017). They want to be
able to assess the students' activities more effectively and draw conclusions to find out where they need to
take more action to improve the students’ learning performance.
Researchers use the data to develop theoretical models for new and improved teaching and learning
methods. This includes pursuing the goal of predicting future learning paths and supporting the needs of learners more appropriately. Educational technologists and researchers in the field of pedagogy review existing didactical models and develop new ones by carrying out field studies in classrooms. For this
reason, they conduct research continuously and adapt LA techniques based on the data collected to meet
the new expectations of the younger generation.
Administrators are interested in implementing their agendas in a more efficient environment. Their
aim is to offer students a more pleasant and efficient learning environment. Additional goals are to reduce
the failure rates and numbers of drop-outs, increase performance and, thus, optimize and improve the
curricula. The government is responsible for the enforcement of data privacy and data protection issues.
Challenges may occur when dealing with the different stakeholders. If stakeholders are confronted with
hard facts without being asked for their thoughts and opinions first, they may rebel. Additionally, despite
the generally positive intentions of those introducing LA into institutions, stakeholders often have their
own thoughts about LA. Students and teachers might be afraid that the results of the analytics, if made
public, would put them in bad positions. Or even worse, the representatives of the different stakeholders
have their own hidden agendas and expect to use the results to expose their counterparts. Therefore, it is
necessary to make the goals of the LA initiative transparent, clarifying exactly what is going to happen
with the information and explicitly what is not.
When the purpose of the LA initiative is very clear from the beginning, this does not seem to be a
problem. However, if it is not, the situation might become complicated when attempting to explain the LA
initiative to the stakeholders. Fuzziness not only puts you in a weak negotiating position, but can also
become a major problem when the stakeholders try to bring you over onto their side. Therefore,
implementers need to specify and adhere to the ultimate objective of the LA initiative.
3.2 Representation and actions
The purpose of LA is to use the data collected to optimize the students' learning processes and improve
teaching. The aim is to make learning itself more predictable and visible. Actions derived from this can
serve as a basis for developing a catalogue of measures to support risk groups and provide them with
better assistance during their study. Based on this, recommendations are made to support learners and
encourage them to reflect on their behaviors. The information is provided within a suitable environment
and clearly visualized as being included in the student’s personalized learning process. The
personalization of the working environment and the associated advantages are placed in the foreground.
This should have the effect of motivating the learner in terms of improving their attitude. The feedback
received is intended to stimulate reflection and lead to a shift in goals and the associated improvement in
learning success.
Choosing the right environment for the learner’s feedback and the correct visualization technique can
present a large challenge for all parties involved. Due to the quantity of data harvested and the focus
placed on quantitative metrics, teachers sometimes consider LA to be antithetical to an educational sense
of teaching. Dashboards with performance metrics are becoming increasingly popular in these contexts
(Clow, 2013). The interpretation of this data can sometimes seem incredibly difficult if it has not been
properly prepared before it is presented to the student. Therefore, it can be better not to provide the student
with all information related to the learning outcome. A mentor can discuss the results with the student.
However, university staff who act as mentors need specialized training in interpreting the data, as well as the pedagogical and psychological skills to discuss the results with the student and provide deeper insights into the data.
3.3 Data
Universities and schools are constantly analyzing data from their students for a variety of reasons. LA can,
therefore, be seen as an innovative continuation of this principle, applied to make use of the advantages of
modern technology and the various data sources available today. The data can be examined and analyzed
for their impact in the learning context to improve the quality of learning and teaching, as well as enhance
the chances of the students’ success. Of course, universities require the individual’s permission to collect
and evaluate sensitive data for the purpose of LA. Students must be made aware of the purpose of
collecting and the process of analyzing the data. Consent is mandatory for the use of these data, which
then can be used as a basis for strategic decisions by the various stakeholders. Teachers can monitor their students and analyze their behavior and actions during interactions with the learning management system, gaining insights into their learning culture: for example, whether the students submitted all assignments or how actively they engage in their studies. Derived models can be
used to provide better student support so that they can reach their goals more efficiently.
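As a simple illustration of the kind of indicator mentioned above, the following sketch computes an assignment-completion rate per student from hypothetical submission records (all identifiers and data are invented for illustration, not drawn from a real system):

```python
# Hypothetical records: (student pseudonym, assignment, submitted?)
submissions = [
    ("s-01", "assignment-1", True),
    ("s-01", "assignment-2", True),
    ("s-02", "assignment-1", True),
    ("s-02", "assignment-2", False),
]

def completion_rate(rows, student):
    """Fraction of a student's recorded assignments that were submitted."""
    flags = [done for who, _, done in rows if who == student]
    return sum(flags) / len(flags) if flags else 0.0

print(completion_rate(submissions, "s-01"))  # 1.0
print(completion_rate(submissions, "s-02"))  # 0.5
```

Such a metric is deliberately simple; in practice it would only be one signal among many, and its interpretation would be left to a trained mentor rather than shown to the student without context.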
Students leave various data traces while using the university infrastructure. The data collected will be
used together with statistical models and methods for the purpose of LA when a benefit for student
learning is expected. Students may want to know why they have been categorized as potential risk
candidates in specific courses. Therefore, the data and models used must be communicated and explained
to them by trained staff in a comprehensible way to provide them with guidance. Access to the data must be secured, and only a small number of staff members should be granted access permissions to students’ data. The
institutions must enact policies that address data protection and access. Students must be informed of who
has access to the data.
The data used will not only have an impact on the individual student, but also influence the practice of
teaching at the university. Therefore, the data have to be re-evaluated over time and adjusted to meet the
new demands. Furthermore, to ensure the best support and quality of the data, students need to keep their
data up-to-date. Giving them the (proactive) opportunity to check and update their data supports them and
the university during this process. Additionally, all of these points must comply with the GDPR and local
data protection acts.
A policy needs to be created for LA that aligns with the organization’s core principles. Transparent
communication about where the data are stored, what is being done to ensure data security and privacy
and how the data are evaluated and used (and by whom) is essential. Responsible handling of the students’
data by all stakeholders, especially university staff members, is important. Further training and skill-
building for responsible tutors/mentors in interpretation of the students’ data and action-taking in this
context are required. Interventions recommended to the student, which are based on the collected data, must be delivered in a transparent and comprehensible way (e.g., stating which methods and models have been used) to ensure broad student acceptance and engagement. Students need to clearly
understand how the data are interpreted and manipulated and which techniques are used to ensure optimal
handling and verifiable recommendations.
3.4 IT infrastructure
IT infrastructure refers to a set of information technology (IT) components, such as hardware, software,
network resources and services that are the foundation for the operation and management of an enterprise
IT environment (Laan, 2011). This infrastructure allows organizations in higher education to deliver IT services to their students, teachers and administrative staff. The IT infrastructure is usually internal and
deployed with in-house facilities, but it is possible to commission an external provider. In any case, the IT infrastructure is the basis for any LA measurements and, therefore, has to be considered carefully.
Why is it important to think about the IT infrastructure? To understand its relevance, it is necessary to
know where the data is located. Therefore, we can distinguish between two different scenarios. First, the
data are stored and processed in a university-owned service center. Thereby, the responsibilities and
liabilities are located at the university itself, and national and organizational rules must be obeyed. This
scenario has the advantage that access to the data and the data ownership are located at the university,
which makes it easier to work with the data. However, it also presents some disadvantages, such as the
fact that initiatives with special technology requirements need to comply with the standardized rules held
by the internal service provider. Also, the cost-benefit ratio should be kept in mind because hosting and supporting services in-house might be more expensive than outsourcing.
The second scenario concerns working with external service providers. In this scenario, individual
solutions can be applied, as many providers are available that might meet the specific needs. In contrast to
the internal service center of a university, external service providers can concentrate their efforts on their
smaller and highly specialized digital product. Furthermore, the costs that arise can easily be estimated
and should be much lower than providing a private, individual solution. The negative aspects of working
with an external service provider are related to issues of access and data ownership as well as meeting the
necessary security standards when working with sensitive data, such as student performance data.
Regardless of whether one works with an internal or external service provider, it takes time to establish
the appropriate basis. Therefore, efforts should be made from the beginning to search for possible
solutions to set up the necessary IT infrastructure and contact and establish connections with relevant
people (Leitner et al., 2018). This will save time and resources when the implementation of an LA
initiative becomes critical.
3.5 Development and operation
This category combines the process of developing and operating LA initiatives. It includes a wide range of
different developments, from designing a simple questionnaire to developing an enterprise software
solution. Additionally, the activities cover research on and the development, prototyping, modification,
reuse, re-engineering, monitoring and maintenance of LA initiatives or projects.
Once the first prototype has been produced, implemented in a real-life context and evaluated, the
discussion can proceed to the next step. How can the prototype be realized? How can it move from the
prototype phase to the production phase? These are quite critical questions because new tasks and
challenges will appear. For example, the scalability of the implementation has to be taken into account.
The number of learners may differ arbitrarily, and this can lead to a completely new concept for the existing
IT infrastructure. Furthermore, processes which were first created manually must be redefined so that they
can be performed at least semi-automatically or completely automatically.
Student data are typically stored across different information systems. Normally, several information systems are responsible for performing different tasks and, therefore, store the data in different formats, on different servers and with different data owners. The effort required to retrieve and manage all data can be stressful and tedious. Additionally, converting raw data into a useful
format can be another big challenge. This is a highly complicated process, which needs thorough planning
and a consistent final implementation. Additionally, the implementation should include working in
different layers and should probably be implemented in a modular manner. In doing so, any changes in the associated information systems can easily be accommodated.
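To illustrate the kind of modular conversion layer described above, the sketch below maps records from two hypothetical source systems (an LMS export with Unix timestamps and a submission system with ISO-8601 dates) onto one common schema. All field names are invented assumptions, not a real API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ActivityRecord:
    """Common schema shared by all downstream analytics."""
    student: str
    course: str
    action: str
    timestamp: datetime

def from_lms(row: dict) -> ActivityRecord:
    # Hypothetical LMS export: 'user', 'course_id', 'event', Unix 'ts'
    return ActivityRecord(
        student=row["user"],
        course=row["course_id"],
        action=row["event"],
        timestamp=datetime.fromtimestamp(row["ts"], tz=timezone.utc),
    )

def from_submissions(row: dict) -> ActivityRecord:
    # Hypothetical submission system: different field names, ISO dates
    return ActivityRecord(
        student=row["student"],
        course=row["course"],
        action="submission",
        timestamp=datetime.fromisoformat(row["submitted_at"]),
    )

# One converter per source system keeps changes localized to one module
records = [
    from_lms({"user": "s-01", "course_id": "LA101",
              "event": "login", "ts": 1_700_000_000}),
    from_submissions({"student": "s-01", "course": "LA101",
                      "submitted_at": "2023-11-15T10:00:00+00:00"}),
]
```

With one converter per source system, a format change in a single information system only touches its own converter, which is the practical benefit of the layered, modular design suggested above.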
From the first stages of any learning measurement, we suggest that the scope should be specified in
detail. Will the LA be established merely for testing and to obtain initial impressions or will it be
implemented at a university-wide level? Scalability is perhaps one of the most frequently underestimated problems in today’s IT industry. Furthermore, we strongly recommend planning the LA implementation
beforehand, so that the costs can be estimated as exactly as possible. A distinction must be made as to
whether processes have to be carried out manually, semi-automatically or fully automatically.
3.6 Privacy
The term privacy refers to an intrinsic part of a person’s identity and integrity and constitutes one of the basic human rights in developed countries, where it is an established element of the legal systems (Drachsler & Greller, 2016). All LA implementations have to ensure the privacy of the involved parties.
Learners must trust the final systems and, therefore, keeping information private is of the utmost
importance. Additionally, depending on the country where the higher education institution is situated,
different regulations in addition to the General Data Protection Regulation (GDPR), which is applicable in
Europe, are enforced. Organizations have to deal with different tasks while finding a suitable legal framework that covers the GDPR. Alternatively, they could minimize the data harvested and/or take actions to anonymize or pseudonymize their data.
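A minimal sketch of such pseudonymization, assuming a keyed hash over an invented student identifier (the secret salt is an assumption of the sketch and would need to be stored securely, separately from the data):

```python
import hashlib
import hmac

def pseudonymize(student_id: str, secret_salt: bytes) -> str:
    """Replace an identifier with a stable keyed hash (pseudonym).

    The same input always yields the same pseudonym, so records stay
    linkable for analysis, but the original identifier cannot be
    recovered from the pseudonym without the secret salt.
    """
    return hmac.new(secret_salt, student_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

salt = b"keep-me-secret"  # hypothetical secret, never stored with the data
p1 = pseudonymize("student-4711", salt)
p2 = pseudonymize("student-4711", salt)

assert p1 == p2               # deterministic: records remain linkable
assert p1 != "student-4711"   # identifier itself is no longer present
```

Note that pseudonymization of this kind is weaker than full anonymization: whoever holds the salt can re-identify students, which is why the GDPR still treats pseudonymized data as personal data.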
Nevertheless, even when keeping privacy in mind when handling LA initiatives from the beginning, the
situation can become highly complex. For example, by merging different data sources, new and surprising
results can be visualized, and, therefore, insights which were never intended can emerge. Because universities are large institutions, there is a high risk that unauthorized people gain access to these data interpretations.
Another large problem that is closely related to privacy is the fact that a person/learner is reduced to
their stored data. Society is made up of individuals, so every situation has to be considered in a
differentiated way. For example, an activity profile can be created, but we will never know exactly how
those activities actually took place and to which extent. The reduction of people to categories and profiles
can be particularly dangerous, because a learning individual could be reduced to merely a few parameters.
Since society seems to like to fall back on so-called facts, the derivation of causal connections on the basis
of learning algorithms always needs to be critically questioned. This also means that gaps in data need to
be analyzed and handled.
Finally, the general lifetime of personal data is a topic that requires further discussion. The data may be
interesting at a time when the activities and learning outcomes are relevant, but the data may no longer be
relevant in the future. Arguments for keeping data could be presented for the purposes of training
algorithms and machine learning. Improvements could also be made by providing a larger data resource.
However, these steps should only be carried out with complete anonymization of the data; Khalil & Ebner (2016) have shown how this can be done.
Privacy is a fundamental right of every person and must be respected. This means that any LA
implementation must take it into account from the very beginning. However, this is often difficult, or
perhaps not clarified at all, because complex situations can arise when data sources are merged. We
therefore suggest working with the highest possible level of transparency, because this builds confidence:
the learners know what happens to their data and what statements can be made from them. At the same
time, unauthorized people must not be allowed to access the data, and personnel need to be well trained in
data interpretation as well as in dealing with questions about privacy. If doubts with regard to privacy
arise, the LA measure must always be omitted.
Finally, we would like to point out once again that the mere use of data - i.e., "facts" - will not be
sufficient to adequately represent a process as complex as learning. LA is only an auxiliary tool that can
help gain a better understanding of this process.
3.7 Ethics
Ethics is defined as a moral code of norms and conventions that involves systematizing, defending, and
recommending concepts of right and wrong conduct; it exists external to a person in society (Drachsler &
Greller, 2016). In the context of LA, various ethical and practical concerns arise because of the potential
to harvest personalized data and intervene at an individual level (Prinsloo & Slade, 2015). Ethical
considerations therefore pose a major challenge for the implementation of LA initiatives.
Additionally, working with sensitive data presents a particular challenge. Sensitive data includes
information on medical conditions, finances, religious beliefs, or sexual orientation, but also on student
performance. If made public, such information could harm the person concerned. It is therefore necessary
to restrict who has access to the information and for which purpose(s) it is used.
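Such access restrictions can be made explicit in code. The sketch below shows a deliberately minimal role-based policy check; the roles and data categories are hypothetical examples, not a prescribed scheme, and a production system would of course back this with the institution's actual authorization infrastructure.

```python
# Hypothetical mapping of roles to the data categories they may read.
# Both the role names and the categories are illustrative assumptions.
ACCESS_POLICY = {
    "student": {"own_performance"},
    "tutor":   {"own_performance", "course_performance"},
    "admin":   {"course_performance", "aggregate_statistics"},
}

def may_access(role, category):
    """Return True only if the given role is permitted to read the category.

    Unknown roles get an empty permission set, i.e., deny by default.
    """
    return category in ACCESS_POLICY.get(role, set())

print(may_access("tutor", "course_performance"))     # True
print(may_access("student", "aggregate_statistics")) # False
```

The deny-by-default behavior for unknown roles reflects the precautionary stance argued above: when access rights are unclear, access should be refused.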
Several questions arise when looking at data from an ethical point of view. First, which data about a
person, whether student or educator, may be harvested, used, and processed? Second, which information
can be communicated to whom, and what may the consequences be? These concerns are growing because
LA can improve the accuracy of predictions for different learner profiles by combining different data
sources. LA implementers must find a suitable way to meet high ethical standards and ensure a beneficial
outcome for all stakeholders.
Another important point is the option for participants to opt in to, or out of, the harvesting, storing, and
processing of their individual data. However, how should institutions that rely on LA deal with students
who exercise the right to opt out? When implementing LA in an institution, it is advisable to involve all
stakeholders at an early stage in creating the rules and the legal framework for the use of data.
Transparency is key, as is understanding the different needs of the interest groups involved in the process.
All intentions, goals, and benefits of harvesting and using the data have to be explained in a clear and
comprehensible way to all stakeholders. Consent to data use begins with the login to the system that
tracks its users; during this process, the consent of all parties involved must be obtained, and the areas in
which the data will be used must be clearly communicated. In discussions, the possible interpretations of
the provided information need to be described to prevent misunderstandings and incorrect decisions. As a
precautionary measure, institutions can introduce codes of conduct and procedures that provide initial
support on this subject.
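Purpose-bound, revocable consent of the kind described above can be sketched as a small registry: each user records the specific purposes they agree to, and opting out removes the record entirely, so subsequent processing checks fail. The class, its method names, and the purpose labels are illustrative assumptions, not an existing API or standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRegistry:
    """Minimal sketch of per-user, per-purpose consent tracking."""
    records: dict = field(default_factory=dict)

    def opt_in(self, user_id, purposes):
        # Store which purposes the user consented to, and when.
        self.records[user_id] = {
            "purposes": set(purposes),
            "timestamp": datetime.now(timezone.utc),
        }

    def opt_out(self, user_id):
        # Revoking consent removes the record; future checks deny.
        self.records.pop(user_id, None)

    def allows(self, user_id, purpose):
        rec = self.records.get(user_id)
        return bool(rec) and purpose in rec["purposes"]

registry = ConsentRegistry()
registry.opt_in("u1", {"dashboard", "early_warning"})
print(registry.allows("u1", "dashboard"))  # True
registry.opt_out("u1")
print(registry.allows("u1", "dashboard"))  # False
```

Checking consent per purpose, rather than as a single global flag, mirrors the requirement that the areas in which data will be used are communicated and agreed to individually.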
At the Learning Analytics and Knowledge conference 2018 (LAK18) in Sydney, a draft code of ethics
v1.0 was presented (Lang et al., 2018). This document may serve as a foundation for ethical matters when
implementing LA initiatives. Additionally, legal counsel can offer advice when the interpretation of a
topic or situation seems unclear. The European Learning Analytics Exchange (LACE) project offers
workshops on ethics and privacy in LA (EP4LA) and plays a key role in advancing the discussion of the
ethical dilemmas of using LA.
4. CONCLUSION
Within higher education institutions, researchers are still full of enthusiasm and excitement about LA and
its potential. Furthermore, LA has now reached a point at which it affects research, practice, policy-making,
and decision-making equally (Gašević et al., 2015).
However, to facilitate successful LA initiatives, a few things have to be kept in mind. In this chapter, we
presented seven main criteria that can be used for initial orientation when implementing LA. Their order
of appearance was chosen intentionally, although the order of application depends on the implementer.
We hope that the classification into seven main criteria, the presented challenges, and the approaches that
can be taken to overcome them will be helpful to implementers of LA initiatives. We are aware that the
presented examples cover only a small range of the challenges an implementer might encounter, but we
hope the results of this study help researchers and educators understand the bigger picture and become
aware of other potential issues.
In future research, we plan to investigate the seven categories in more detail to identify further examples
and to validate our framework in order to support future LA measures.
REFERENCES
Berg, A. M., Mol, S. T., Kismihók, G., & Sclater, N. (2016). The Role of a Reference Synthetic Data Generator within the Field
of Learning Analytics. Journal of Learning Analytics, 3(1), 107-128.
Campbell, J.P. (2007). Utilizing Student Data within the Course Management System to Determine Undergraduate Student
Academic Success: An Exploratory Study, PhD, Purdue University.
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.
Cormack, A. N. (2016). A data protection framework for learning analytics. Journal of Learning Analytics, 3(1), 91-106.
De Laet, T., Broos, T., Verbert, K., van Staalduinen, J.-P., Ebner, M., & Leitner, P. (2018). Proceedings of the 8th International
Conference on Learning Analytics & Knowledge (pp. 602-606). Sydney.
Drachsler, H., & Greller, W. (2016). Privacy and analytics - it’s a DELICATE issue: A checklist to establish trusted learning
analytics. Proceedings of the 6th International Conference on Learning Analytics and Knowledge, 89-96.
http://dx.doi.org/10.1145/2883851.2883893
Duval, E. (2012). Learning Analytics and Educational Data Mining, In: Erik Duval’s Weblog, 30 January 2012, Retrieved from
https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/ accessed on 16 March 2018.
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced
Learning (IJTEL), 4(5/6), 304–317. http://dx.doi.org/10.1504/IJTEL.2012.051816
Ferguson, R., Hoel, T., Scheffel, M., & Drachsler, H. (2016). Guest editorial: Ethics and privacy in learning analytics. Journal of
learning analytics, 3(1), 5-15. http://dx.doi.org/10.18608/jla.2016.31.2
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.
Hoel, T., & Chen, W. (2016). Privacy-driven design of learning analytics applications - exploring the design space of solutions for
data sharing and interoperability. Journal of Learning Analytics, 3(1), 139-158.
Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2012). The NMC Horizon Report: 2012 Higher
Education Edition. The New Media Consortium.
Khalil, M., & Ebner, M. (2015). Learning analytics: principles and constraints. In Proceedings of world conference on
educational multimedia, hypermedia and telecommunications, 1326-1336.
Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics, 3(1), 129-138.
Laan, S. (2011). IT Infrastructure Architecture: Infrastructure Building Blocks and Concepts. Lulu Press.
Lang, C., Macfadyen, L. P., Slade, S., Prinsloo, P., & Sclater, N. (2018, March). The complexities of developing a personal code
of ethics for learning analytics practitioners: implications for institutions and the field. In Proceedings of the 8th International
Conference on Learning Analytics and Knowledge (pp. 436-440). ACM.
Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education - a literature review. In Learning Analytics:
Fundaments, Applications, and Trends (pp. 1-23). Springer, Cham.
Leitner, P., & Ebner, M. (2017). In P. Zaphiris & A. Ioannou (Eds.), Learning and Collaboration Technologies. Technology in
Education: 4th International Conference, LCT 2017, Held as Part of HCI International 2017, Vancouver, BC, Canada, July 9-14,
2017, Proceedings, Part II (pp. 293-301). Cham: Springer International Publishing.
Leitner, P., Broos, T., & Ebner, M. (2018). Lessons Learned when transferring Learning Analytics Interventions across
Institutions. In Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (pp. 621-629).
Sydney.
Long, P. & Siemens, G., (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE review, 46(5), 30.
Retrieved from https://er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education accessed on 16
March 2018.
Macfadyen, L. P., Dawson, S., Pardo, A., & Gasevic, D. (2014). Embracing big data in complex educational systems: The
learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9. pp. 17-28.
Murray, P. M. (1990). The history of informed consent. The Iowa Orthopaedic Journal, 10, 104–109.
Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. In Proceedings of the fifth
international conference on learning analytics and knowledge (pp. 83-92). ACM.
Prinsloo, P., & Slade, S. (2016). Student vulnerability, agency and learning analytics: an exploration. Journal of Learning
Analytics, 3(1), 159-182.
Rodríguez-Triana, M. J., Martínez-Monés, A., & Villagrá-Sobrino, S. (2016). Learning analytics in small-scale teacher-led
innovations: ethical and data privacy issues. Journal of Learning Analytics, 3(1), 43-65.
Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge
Discovery, 3(1), 12-27.
Sclater, N. (2016). Developing a Code of Practice for learning analytics. Journal of Learning Analytics, 3(1), 16-42.
Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., Duval, E., Verbert, K., Baker,
RSJD (2011). Open learning analytics: an integrated & modularized platform. Proposal to design, implement and evaluate an
open platform to integrate heterogeneous learning analytics techniques.
Steiner, C. M., Kickmeier-Rust, M. D., & Albert, D. (2016). LEA in private: a privacy and data protection framework for a
learning analytics toolbox. Journal of Learning Analytics, 3(1), 66-90.
Tsai, Y.-S., & Gasevic, D. (2017). Learning analytics in higher education - challenges and policies: A review of eight learning
analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
ACM.