Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (LAK18)
Creative Commons License, Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
Lessons Learned when transferring Learning Analytics Interventions
across Institutions
Philipp Leitner
TU Graz
philipp.leitner@tugraz.at
Tom Broos
KU Leuven
tom.broos@kuleuven.be
Martin Ebner
TU Graz
martin.ebner@tugraz.at
ABSTRACT: Learning Analytics is a promising research field which is advancing quickly. It is finally having an impact on research, practice, policy, and decision making [7] in the field of education. Nonetheless, there are still considerable obstacles when establishing Learning Analytics initiatives at the higher education level. Besides the much discussed ethical and moral concerns, there is also the matter of data privacy.
In 2015, the European collaboration project STELA started with the main goal of enhancing the Successful Transition from secondary to higher Education by means of Learning Analytics [1]. Together, the partner universities develop, test, and assess Learning Analytics approaches that focus on providing feedback to students. Promising approaches are then shared between the partner universities. Therefore, the transferability of the Learning Analytics initiatives is of great significance.
Over the course of the project, we encountered a variety of difficulties that we had to overcome to transfer one of these Learning Analytics initiatives, the Learning Tracker, from one partner to another. Although some of the difficulties can be considered small, all of them needed our attention and were time consuming. In this paper, we present the lessons learned while resolving these obstacles.
Keywords: Learning Analytics, scalability, cooperation, lessons learned
Originally published in: Leitner, P., Broos, T. & Ebner, M. (2018) Lessons Learned when transferring Learning Analytics Interventions across Institutions. In: Companion Proceedings 8th International Conference on Learning Analytics & Knowledge. Sydney. pp. 621-629
1 INTRODUCTION
Learning Analytics has emerged in the last decade as a fast-growing and promising research field in Technology-Enhanced Learning (TEL), providing tools and platforms that influence researchers [10, 6]. Siemens and Long defined Learning Analytics as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environment in which it occurs” [14]. Since it was first mentioned in the Horizon Report 2012 [9], various projects and initiatives have been carried out around Learning Analytics, which is finally entering the next phase and having an impact on research, practice, policy, and decision making [7].
Nonetheless, there are many obstacles when establishing Learning Analytics initiatives, especially in higher education. Besides ethical and moral issues, the matter of data ownership and data privacy is becoming more and more important [5]. Particularly affected are the member states of the EU, as the new EU General Data Protection Regulation (GDPR)^1 is going to be enforced soon. Thereby, the users, i.e. lecturers and students, have to be informed in advance of what is going to happen with their personal data, and they have to give their consent. Unfortunately, anonymizing personal data to circumvent this issue makes Learning Analytics more difficult and is not trivial [11]. Further, many Learning Analytics projects are still in the prototype phase because of issues with transferability and scalability [13].
Within the scope of the European collaboration project STELA, the Learning Tracker [8] was proposed for giving students feedback in a Small Private Online Course (SPOC) deployed at KU Leuven. In this publication, we present the issues encountered and the lessons learned in the process of deployment. We summarize them through two research questions:
RQ1: What should be kept in mind when working with external providers?
RQ2: What should be kept in mind when working across higher education institutions?
In the next section, we start by explaining the case study and its circumstances. Section 3 explores issues
when working with external providers and the lessons learned. In Section 4, we discuss obstacles when
working across institutions and how to overcome them. Conclusion and remarks on future work are
presented in Section 5.
2 CASE STUDY
The Erasmus+ STELA project [1] is a European collaboration project with the primary partners Catholic
University of Leuven (KU Leuven, Belgium), Delft University of Technology (TU Delft, Netherlands), Graz
University of Technology (TU Graz, Austria), and as secondary partner the Nottingham Trent University
(NTU, England). The main goal is to enhance the successful transition from secondary to higher education
by means of learning analytics. Together, the partner universities develop, test, and assess Learning
^1 https://www.eugdpr.org/ – Last accessed January 30th, 2018
Analytics approaches that focus on providing formative and summative feedback to students in the transition. In a first step, promising approaches are shared between the partners to evaluate them under different circumstances. Therefore, transferability, scalability, and modularity of the approaches are of high interest.
One promising initiative of Delft University of Technology is the so-called “Learning Tracker” [8], which is made available by TU Delft as open source^2 and whose typical user interface is displayed in Figure 1. The
Learning Tracker itself tracks the behavior of all current participants in the MOOC and displays it against the aggregated activities of previous participants who successfully completed the course. Thereby, the Learning Tracker supports learners in Massive Open Online Courses (MOOCs) in becoming more efficient and encourages them to develop their self-regulated learning skills by reflecting on their own learning activities [8]. This approach follows Baker’s alternate paradigm for online learning by using the information to empower human decision making rather than feeding it to an intelligent learning system [2].
Figure 1: Visual design of the Learning Tracker. It provides several metrics in a small space and offers a simple overall evaluation [8].
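The comparison logic behind such a dashboard can be sketched as follows. This is a minimal illustration, not the published implementation: the metric names and the choice of the median of successful completers as the benchmark are our own assumptions.

```python
from statistics import median

# Hypothetical engagement metrics per learner (names are illustrative).
METRICS = ["time_on_platform_h", "videos_watched", "quizzes_submitted"]

def benchmark(successful_learners):
    """Aggregate each metric over learners who completed a previous run."""
    return {m: median(l[m] for l in successful_learners) for m in METRICS}

def compare(current, successful_learners):
    """Place the current learner's value next to the benchmark per metric."""
    bench = benchmark(successful_learners)
    return {m: {"you": current[m], "graduates": bench[m]} for m in METRICS}

# Toy data: three successful participants from a previous run, one current learner.
previous = [
    {"time_on_platform_h": 12, "videos_watched": 30, "quizzes_submitted": 9},
    {"time_on_platform_h": 8, "videos_watched": 25, "quizzes_submitted": 7},
    {"time_on_platform_h": 15, "videos_watched": 40, "quizzes_submitted": 10},
]
me = {"time_on_platform_h": 5, "videos_watched": 12, "quizzes_submitted": 4}

print(compare(me, previous))
```

The resulting pairs are what a chart like Figure 1 would render side by side for self-reflection.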
The Learning Tracker has already been deployed within different MOOCs and has been shown to be easily transferable to different MOOCs on the same platform within the same university [4]. Its impact on student engagement in relation to the completion rate of the MOOC was evaluated, and the results have shown that the Learning Tracker improves the achievement of already highly educated learners but is less effective for less educated ones [4]. Further, it has been shown that the cultural context of the learners impacts engagement and completion rate [4].
^2 https://github.com/ioanajivet/LearningTracker – Last accessed January 30th, 2018
Our goal was to deploy the Learning Tracker to the Chemistry SPOC of KU Leuven, which is based on the edX system. Further, we wanted to make the Learning Tracker more dynamic. Therefore, we used the open-source technology stack developed at TU Graz within the STELA project [12]. Figure 2 illustrates a flow diagram of responsibilities and relations throughout the case study.
Figure 2: Illustration of responsibilities and relations
3 WORKING WITH EXTERNAL SERVICE PROVIDERS
This section deals with obstacles when working with external service providers (RQ1). We start by
explaining issues with data ownership when using an external service provider. Then, we discuss what
should be kept in mind when exchanging data with external service providers.
3.1 Data ownership issues
An essential question when working with external service providers is “who owns the data?”. Here, we do not consider matters related to copyright of the material provided on the platform. We also set aside the more fundamental idea that the final ownership of student-produced data, whether it concerns learner-created content or simply digital activity traces, should always belong to the students themselves.
When the external party functions as a contractor for the institution, it is reasonable to assume that the latter preserves full ownership. But what if the platform of the service provider is independent and subsequently used by the institution to deliver its content to learners? To draw a parallel: when a company uses a popular social media platform like LinkedIn to disseminate its message, would one not assume that the platform provider retains the ownership of data related to its own user base, even if it was in fact the company that pushed these users to the platform in the first place? And yet, it may come as a surprise to institutions that they do not automatically acquire ownership of, or even access to, student data within the external educational platforms they use.
KU Leuven has invested extensively in its Learning Management System “Toledo”, which is predominantly based on the Blackboard product line. The system is maintained by an internal team and embedded in a broader software architecture, fully hosted in the university’s own data center. Only in recent years did KU Leuven start to invest in MOOCs and SPOCs. Due to the limited in-house experience with MOOCs and the common practice, shared by many institutions, of using an existing platform, edX was
selected as KU Leuven’s MOOC platform of choice. However, while the issue of ownership of “Toledo” data had not arisen before, it suddenly became relevant in the new context of the edX platform.
3.2 Exchanging data with external providers
Once an agreement with the external service provider is established, the problem of data access arises. For information systems managed by the institution itself, there is usually an option to extend or modify the software to export the data required for the Learning Analytics application. In many cases, the data may also be fetched from a database management system directly by setting up an ETL process (extract, transform, load), as is common in the domain of Business Intelligence (BI). Institutional IT services are often familiar with these practices, which are also used to enable reporting on data captured in financial, administrative, Human Resources (HR), and other information systems.
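An ETL process of the kind described above can be sketched in a few lines. This is a toy illustration only: SQLite stands in for the institutional source and reporting databases, and the table and column names are our own assumptions.

```python
import sqlite3

def etl(source_conn, target_conn):
    """Extract activity events, transform them into per-student counts,
    and load the result into a reporting table."""
    # Extract: pull raw events from the (hypothetical) source system.
    rows = source_conn.execute(
        "SELECT student_id, event_type FROM activity_log").fetchall()
    # Transform: aggregate into one count per student.
    counts = {}
    for student_id, _event in rows:
        counts[student_id] = counts.get(student_id, 0) + 1
    # Load: write the aggregates into the reporting database.
    target_conn.execute(
        "CREATE TABLE IF NOT EXISTS activity_counts (student_id TEXT, n INT)")
    target_conn.executemany(
        "INSERT INTO activity_counts VALUES (?, ?)", counts.items())
    target_conn.commit()

# Demonstration with in-memory databases and three toy events.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE activity_log (student_id TEXT, event_type TEXT)")
source.executemany("INSERT INTO activity_log VALUES (?, ?)",
                   [("s1", "video"), ("s1", "quiz"), ("s2", "video")])
target = sqlite3.connect(":memory:")
etl(source, target)
print(target.execute(
    "SELECT * FROM activity_counts ORDER BY student_id").fetchall())
```

In a production setting the same three steps would run on a schedule against the real systems, typically with incremental extraction rather than a full table scan.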
Yet when working with an external service provider, data is not directly accessible by the internal services.
As the software is primarily designed to serve multiple tenants, it may not be straightforward to adapt it
to meet the data needs of a single institution – especially in the context of an ongoing research project,
when requirements are still unstable.
In some cases, the service provider offers a set of application programming interfaces (APIs) to facilitate communication with the on-premises software of the institutions. However, these APIs are likely to be limited to the use cases anticipated beforehand, if not constrained by internal planning and priorities. Innovative and experimental use of data, as is to be expected within a research context, is not always compatible with this approach. The resulting requirement is to dig deeper into the data that is being captured by the external system, if possible by accessing it directly, circumventing the limited scope of the APIs. After all, this would also be a common approach for internal systems, as explained above.
Apart from the requirement to create clarity about data ownership and sharing, our case study also involved finding a technical process to get the data from the service provider. edX indeed offers an API for accessing student activity data. However, the provided methods are limited to the data perspectives as imagined by the edX developers and are incompatible with the requirements of the TU Delft Learning Tracker. On request, edX offered the option of direct access to extract the underlying log data through an FTP server. This manual way of working is poorly suited to continuous, preferably real-time data extraction, but it allowed the case study implementation to be initiated. On the KU Leuven side, the process of collecting data from edX needs to be further automated. An open question is how to anticipate data structure changes on the edX side, as the data format is meant for internal use and might be reorganized in the future.
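Automating the manual FTP pull could look roughly like the following sketch. The host, credentials, and remote file layout are placeholders, not the actual edX arrangement; only files not yet seen are fetched, so the script can run repeatedly on a schedule.

```python
from ftplib import FTP
from pathlib import Path

def pull_new_logs(ftp, local_dir: Path, seen: set) -> list:
    """Download log files from the server that we have not fetched before."""
    local_dir.mkdir(parents=True, exist_ok=True)
    fetched = []
    for name in ftp.nlst():
        if name in seen:
            continue  # already processed in an earlier run
        with open(local_dir / name, "wb") as fh:
            ftp.retrbinary(f"RETR {name}", fh.write)
        seen.add(name)
        fetched.append(name)
    return fetched

if __name__ == "__main__":
    ftp = FTP("ftp.example.org")        # placeholder host
    ftp.login("institution", "secret")  # placeholder credentials
    print(pull_new_logs(ftp, Path("logs"), seen=set()))
```

Persisting the `seen` set (e.g. in a small state file or database) is what turns this into an incremental, cron-driven extraction; it does not address the open question of upstream format changes.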
A related issue concerns the reverse flow: once the exported data has been transformed into information
that may be offered to students, how can this information be fed back to them? edX supports the Learning
Tools Interoperability (LTI) standard created by the IMS Global Learning Consortium. This standard was
designed to enable the sharing of tools across different learning systems. In our setup, the edX
environment is the LTI Tool Consumer and our Learning Tracker system is the LTI Tool Provider. When the
Learning Tracker is shown to the student, edX (the trusted consumer) passes a user identification string, which makes an extra authentication step on the provider side unnecessary.
4 WORKING ACROSS INSTITUTIONS
In this section, we discuss obstacles when working across higher education institutions and how to
overcome them (RQ2). First, we explain what you need to keep in mind when facilitating cross-border
European initiatives. Second, we point out how to handle external data subjects.
4.1 Facilitating cross-border European Initiatives
Research cooperation is common among European universities. Students, lecturers, and researchers are increasingly roaming from one institution to another, increasing the opportunities for teaming up. But when a research project directly involves the daily practice of the involved institutions, practical incompatibilities may start to surface.
While working together with institutions within a single region may already be complicated, working across (European) borders is unlikely to make matters easier. Despite the unification efforts of the Bologna Process, Higher Education Institutions (HEIs) from different European countries operate in dissimilar contexts. Education and general laws, culture and societal views on the role of education, organization of the institutions, and the role of the government are just a few examples of contextual areas that are likely to differ from one country to another, not least because education today is often shaped by local tradition.
While preparing the case study implementation, it became clear that the Austrian view on data privacy is stricter than the Belgian interpretation; privacy awareness is more strongly developed in Austrian and German culture. Notwithstanding the General Data Protection Regulation (GDPR), which will soon be in effect throughout the entire European Union, the interpretation of what is allowed and what is not turned out to be rather different. The Austrian reading, as translated into TU Graz internal policy, for instance, calls for avoiding the integration of data from separate source systems.
The concept of processing data about Belgian students on its Austrian servers provoked resistance on the side of TU Graz, as it would put the internal IT department in a challenging position with respect to its policy. Consequently, the alternative of moving the case study implementation to the KU Leuven infrastructure was considered. However, this would have required a TU Graz project member to access the server infrastructure of KU Leuven remotely. While there was no objection to this in principle, it turned out to be practically impossible to arrange without an existing employment relationship: the procedure to do so was nonexistent.
4.2 Handling external Data Subjects
The debate about ethics and privacy in Learning Analytics is growing. Skeptics question to what extent providers of education are entitled to study the learning behavior of their students. LA proponents, on the other hand, argue that it is the duty of educators to improve learning and that not using data to do so may be unethical. In most cases, however, the (implicit) focus of such debate is on data that institutions collect and process about their own students, that is, students with whom the institution has some kind of formal engagement. It is not uncommon for students to sign a contract at registration that already contains certain agreements about if and how the institution may use learning traces for educational research or to improve its internal educational processes.
However, as in our case study, it is also not uncommon for higher education institutions to interact with prospective students prior to official registration. This complicates matters of privacy and ethics: in the absence of an agreement, it is less clear what data institutions can use to extend their mission of improving the quality of education to the orienting and transitional process. We therefore prefer to extract as little data as possible (data minimization) to enable the selected features of the Learning Tracker tool. This, for instance, does not require knowledge of the student’s name or any other characteristics, besides some kind of user id or pseudonym, which is also required to feed the resulting charts back into the SPOC user interface.
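One way to realize such data minimization is to derive a stable pseudonym from the platform user id with a keyed hash, so that activity records and feedback charts can be joined without ever handling the real identity. The sketch below is illustrative and not the project's implementation; the key would be a secret held by the institution, and its value here is a placeholder.

```python
import hashlib
import hmac

PSEUDONYM_KEY = b"institution-held-secret"  # placeholder for the real secret

def pseudonym(user_id: str) -> str:
    """Stable, non-reversible identifier for one user id under one key."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym, so per-user charts can be
# fed back to the right SPOC user while the processing side stores no names,
# email addresses, or other attributes.
```

Rotating or destroying the key after the project ends severs the link between pseudonyms and real identities entirely.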
The external data subject issue is discussed in detail in [3], applied there to a shared positioning test for engineering students, co-organized by several universities. The proposed solution uses an anonymous feedback code that is provided to students. In this approach, data subjects retain a large part of the data ownership and freely decide whether to transfer data across service providers or institutions.
5 CONCLUSION
The intention of this paper was to formulate lessons learned that the authors consider important for the future development and implementation of Learning Analytics initiatives. We have outlined obstacles when working with external providers (RQ1) or across institutions (RQ2) and proposed partial solutions to overcome them. We hope that implementers of Learning Analytics initiatives can benefit from these findings, adjust accordingly, and thereby save time and effort. Table 1 provides a summary of the questions that surfaced during our case study.
Table 1: Summary of surfacing questions.

Working with an external provider
  Data ownership
    • Who owns the data? The institution or the service provider?
  Data access
    • How to get data out of the external platform? Are APIs available and sufficient? Is direct data access possible?
    • How to get information back into the systems? How to reach the end user? Is a standard (e.g. LTI) supported?

Working across institutions
  Working cross-border
    • How does the educational context differ from one partner to the other? In case of shared legislation, does the interpretation differ?
    • What procedures are available to host another partner’s data or to provide access to a researcher employed by another partner?
  External data subjects
    • To what extent can data from unregistered/prospective students be used to improve education and to feed information back to these students?
    • If anonymous data is insufficient, is the use of pseudonymization tokens (e.g. feedback codes [3]) an alternative?
ACKNOWLEDGMENTS
This research project is co-funded by the European Commission Erasmus+ program, in the context of the project 562167EPP-1-2015-1-BE-EPPKA3-PI-FORWARD. The European Commission’s support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.
Please visit our website http://stela-project.eu.
REFERENCES
[1] STELA project. http://stela-project.eu/ – Last accessed January 30th, 2018.
[2] Baker, R. S. (2016). Stupid tutoring systems, intelligent humans. International Journal of Artificial
Intelligence in Education, 26(2), 600-614.
[3] Broos, T., Verbert, K., Langie, G., Van Soom, C., & De Laet, T. (2018, March). Multi-institutional
positioning test feedback dashboard for aspiring students: lessons learnt from a case study in
Flanders. In Proceedings of the Eighth International Learning Analytics & Knowledge Conference (pp.
1-6). ACM.
[4] Davis, D., Jivet, I., Kizilcec, R. F., Chen, G., Hauff, C., & Houben, G. J. (2017, March). Follow the
successful crowd: raising MOOC completion rates through social comparison at scale. In
Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 454-463).
ACM.
[5] Drachsler, H., & Greller, W. (2016, April). Privacy and analytics: it's a DELICATE issue a checklist for
trusted learning analytics. In Proceedings of the sixth international conference on learning analytics
& knowledge (pp. 89-98). ACM.
[6] Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal
of Technology Enhanced Learning, 4(5-6), 304-317.
[7] Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning.
TechTrends, 59(1), 64-71.
[8] Jivet, I. (2016). The Learning tracker: a learner dashboard that encourages self-regulation in MOOC
learners.
[9] Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2012). The NMC
Horizon Report: 2012 Higher Education Edition. The New Media Consortium.
[10] Khalil, M., & Ebner, M. (2015, June). Learning analytics: principles and constraints. In EdMedia:
World Conference on Educational Media and Technology (pp. 1789-1799). Association for the
Advancement of Computing in Education (AACE).
[11] Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics,
3(1), 129-138.
[12] Leitner, P., & Ebner, M. (2017, July). Development of a Dashboard for Learning Analytics in Higher
Education. In International Conference on Learning and Collaboration Technologies (pp. 293-301).
Springer, Cham.
[13] Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education—a literature
review. In Learning Analytics: Fundaments, Applications, and Trends (pp. 1-23). Springer, Cham.
[14] Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE
review, 46(5), 30.