Lessons Learned when transferring Learning Analytics Interventions
across Institutions
Philipp Leitner
TU Graz
philipp.leitner@tugraz.at
Tom Broos
KU Leuven
tom.broos@kuleuven.be
Martin Ebner
TU Graz
martin.ebner@tugraz.at
ABSTRACT: Learning Analytics is a promising and quickly advancing research field that is beginning to impact research, practice, policy, and decision making [7] in the field of education. Nonetheless, there are still considerable obstacles to establishing Learning Analytics initiatives at the higher education level. Besides the much discussed ethical and moral concerns, there is also the matter of data privacy.
In 2015, the European collaboration project STELA started with the main goal of enhancing the Successful Transition from secondary to higher Education by means of Learning Analytics [1]. Together, the partner universities develop, test, and assess Learning Analytics approaches that focus on providing feedback to students. Promising approaches are then shared between the partner universities. Therefore, the transferability of the Learning Analytics initiatives is of great significance.
Over the course of our project, we encountered a variety of difficulties that we had to overcome to transfer one of those Learning Analytics initiatives, the Learning Tracker, from one partner to the other. Although some of the difficulties may seem small, all of them needed our attention and were time consuming. In this paper, we present the lessons learned while solving these obstacles.
Keywords: Learning Analytics, scalability, cooperation, lessons learned
Originally published in: Leitner, P., Broos, T. & Ebner, M. (2018) Lessons Learned when transferring Learning Analytics Interventions across Institutions. In: Companion Proceedings 8th International Conference on Learning Analytics & Knowledge. Sydney. pp. 621-629
1 INTRODUCTION
Learning Analytics has emerged in the last decade as a fast-growing and promising research field in Technology-Enhanced Learning (TEL), providing tools and platforms that influence researchers [10, 6]. Siemens and Long defined Learning Analytics as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environment in which it occurs" [14]. Since its first mention in the Horizon Report 2012 [9], various projects and initiatives have been carried out around Learning Analytics, which is now entering the next phase and has an impact on research, practice, policy, and decision making [7].
Nonetheless, there are many obstacles when establishing Learning Analytics initiatives, especially in higher education. Besides ethical and moral issues, the matter of data ownership and data privacy is becoming more and more important [5]. Particularly affected are the member states of the EU, as the new EU General Data Protection Regulation (GDPR; https://www.eugdpr.org/, last accessed January 30th, 2018) is going to be enforced soon. The users, lecturers and students, have to be informed in advance about what will happen with their personal data, and they have to give their consent. Unfortunately, anonymizing personal data to circumvent this issue is not trivial and makes Learning Analytics more difficult [11]. Further, many Learning Analytics projects are still in the prototype phase because of issues with transferability and scalability [13].
Within the scope of the European collaboration project STELA, the Learning Tracker [8] was proposed for giving students feedback in a Small Private Online Course (SPOC) deployed at KU Leuven. In this publication, we present the issues and lessons learned in the process of deployment. We summarize them through two research questions:
RQ1: What should be kept in mind when working with external providers?
RQ2: What should be kept in mind when working across higher education institutions?
In the next section, we start by explaining the case study and its circumstances. Section 3 explores issues
when working with external providers and the lessons learned. In Section 4, we discuss obstacles when
working across institutions and how to overcome them. Conclusion and remarks on future work are
presented in Section 5.
2 CASE STUDY
The Erasmus+ STELA project [1] is a European collaboration project with the primary partners Catholic University of Leuven (KU Leuven, Belgium), Delft University of Technology (TU Delft, Netherlands), and Graz University of Technology (TU Graz, Austria), and, as a secondary partner, Nottingham Trent University (NTU, England). The main goal is to enhance the successful transition from secondary to higher education by means of learning analytics. Together, the partner universities develop, test, and assess Learning
Analytics approaches that focus on providing formative and summative feedback to students in the transition. In the first step, promising approaches are shared between the partners to evaluate them under different circumstances. Therefore, transferability, scalability, and modularity of the approaches are of high interest.
One promising initiative of TU Delft is the so-called "Learning Tracker" [8], which is made available by TU Delft as open source (https://github.com/ioanajivet/LearningTracker, last accessed January 30th, 2018); a typical user interface is displayed in Figure 1. The Learning Tracker tracks the behavior of all current participants in a Massive Open Online Course (MOOC) and displays it against the aggregated activities of previous participants who completed the course successfully. Thereby, the Learning Tracker supports MOOC learners in becoming more efficient and encourages them to develop their self-regulated learning skills by reflecting on their own learning activities [8]. This approach follows Baker's alternate paradigm for online learning, using the information to empower human decision making rather than feeding it to an intelligent learning system [2].
Figure 1: Visual design of the Learning Tracker. It provides several metrics in a small space and offers a
simple overall evaluation [8]
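To make the underlying comparison concrete, the sketch below shows how such a dashboard metric could be computed. This is our own minimal illustration with invented metric names and a toy data layout; it is not taken from the actual Learning Tracker code, which is available in the TU Delft repository linked above.

```python
from statistics import mean

# Illustrative sketch only: metric names and data layout are hypothetical
# and do not come from the actual Learning Tracker implementation.
previous_cohort = [
    {"videos_watched": 24, "quiz_attempts": 40, "forum_posts": 5, "completed": True},
    {"videos_watched": 3,  "quiz_attempts": 2,  "forum_posts": 0, "completed": False},
    {"videos_watched": 30, "quiz_attempts": 55, "forum_posts": 9, "completed": True},
]
METRICS = ("videos_watched", "quiz_attempts", "forum_posts")

def successful_baseline(cohort):
    """Average each metric over previous participants who completed the course."""
    completers = [p for p in cohort if p["completed"]]
    return {m: mean(p[m] for p in completers) for m in METRICS}

def compare(current, cohort):
    """Set the current learner's activity against the successful-completer average."""
    baseline = successful_baseline(cohort)
    return {m: (current[m], round(baseline[m], 1)) for m in METRICS}

current_learner = {"videos_watched": 12, "quiz_attempts": 18, "forum_posts": 1}
for metric, (you, peers) in compare(current_learner, previous_cohort).items():
    print(f"{metric}: you={you}, successful peers={peers}")
```

The essential design choice is that the baseline aggregates only those previous participants who completed successfully, so each learner is compared against behavior that demonstrably led to completion.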
The Learning Tracker has already been deployed within different MOOCs and has been shown to be easily transferable to different MOOCs on the same platform within the same university [4]. Its impact on student engagement, measured against the completion rate of the MOOC, was evaluated; the results show that the Learning Tracker improves the achievement of already highly educated learners but is less effective for less educated ones [4]. Further, it has been shown that the cultural context of the learners impacts engagement and completion rates [4].
Our goal was to deploy the Learning Tracker to the Chemistry SPOC of KU Leuven, which is based on the edX system. Further, we wanted to make the Learning Tracker more dynamic. Therefore, we used the open-source technology stack developed at TU Graz within the STELA project [12]. Figure 2 illustrates a flow diagram of responsibilities and relations throughout the case study.
Figure 2: Illustration of responsibilities and relations
3 WORKING WITH EXTERNAL SERVICE PROVIDERS
This section deals with obstacles when working with external service providers (RQ1). We start by
explaining issues with data ownership when using an external service provider. Then, we discuss what
should be kept in mind when exchanging data with external service providers.
3.1 Data ownership issues
Essential when working with external service providers, is the question ”who owns the data?”. Here we
don’t consider matters related to copyright of the material provided on the platform. We also make
abstraction of the more fundamental idea that the final ownership of student produced data, whether it
concerns learner created content or simply digital activity traces, should always belong to the students
themselves.
When the external party functions as a contractor for the institution, it is reasonable to assume that the latter preserves full ownership. But what if the platform of the service provider is independent and subsequently used by the institution to deliver its content to learners? To draw a parallel: when a company uses a popular social media platform like LinkedIn to disseminate its message, would one not assume that the platform provider retains the ownership of data related to its own user base, even if it was in fact the company that pushed these users to the platform in the first place? And yet, it may come as a surprise to institutions that they do not automatically acquire ownership of, or even access to, student data within the external educational platforms they use.
KU Leuven invested extensively in its Learning Management System "Toledo", which is predominantly based on the Blackboard product line. The system is maintained by an internal team and embedded in a broader software architecture, fully hosted in the university's own data center. Only in recent years did KU Leuven start to invest in MOOCs and SPOCs. Due to the limited in-house experience with MOOCs and the common practice, shared by many institutions, of using an existing platform, edX was
selected as KU Leuven's MOOC platform of choice. However, while the issue of ownership of "Toledo" data had not arisen before, it suddenly became relevant in the new context of the edX platform.
3.2 Exchanging data with external providers
Once an agreement with the external service provider is established, the problem of data access arises.
For information systems managed by the institution itself, there is usually an option to extend or modify
the software to export data required for the Learning Analytics application. In many cases, the data may
also be fetched from a database management system directly, by setting up an ETL-process (extract,
transform, load) as is common in the domain of Business Intelligence (BI). Institutional IT services are often
familiar with these practices, also used to enable reporting on data captured in financial, administrative,
Human Resources (HR), and other information systems.
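As a minimal illustration of such an ETL step, the sketch below aggregates raw click events into a per-user daily activity table. All table and column names are invented, and in-memory SQLite databases stand in for the LMS database and the analytics store.

```python
import sqlite3

# Minimal ETL sketch (extract, transform, load). Table and column names are
# invented for illustration; a real pipeline would point at the LMS database.
source = sqlite3.connect(":memory:")  # stands in for the LMS database
source.execute("CREATE TABLE click_events (user_id TEXT, timestamp TEXT)")
source.executemany("INSERT INTO click_events VALUES (?, ?)",
                   [("s1", "2018-01-29 10:00"), ("s1", "2018-01-29 11:30"),
                    ("s2", "2018-01-30 09:15")])

target = sqlite3.connect(":memory:")  # stands in for the analytics store
target.execute("CREATE TABLE daily_activity (user_id TEXT, day TEXT, events INTEGER)")

# Extract + transform: raw click events, aggregated per user and day.
rows = source.execute("""SELECT user_id, date(timestamp) AS day, COUNT(*)
                         FROM click_events GROUP BY user_id, day""").fetchall()

# Load: store the aggregate where reporting and analytics tools can reach it.
target.executemany("INSERT INTO daily_activity VALUES (?, ?, ?)", rows)
target.commit()
print(target.execute("SELECT * FROM daily_activity").fetchall())
```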
Yet when working with an external service provider, data is not directly accessible to the internal services. As the software is primarily designed to serve multiple tenants, it may not be straightforward to adapt it to meet the data needs of a single institution, especially in the context of an ongoing research project, when requirements are still unstable.
In some cases, the service provider offers a set of application programming interfaces (APIs) to facilitate the communication with on-premises software of the institutions. However, these APIs are likely to be limited to the use cases anticipated beforehand, if not constrained by internal planning and priorities. Innovative and experimental use of data, as is to be expected within a research context, is not always compatible with this approach. The resulting requirement is to dig deeper into the data being captured by the external system, if possible by accessing it directly, circumventing the limited scope of the APIs. After all, this would also be a common approach for internal systems, as explained above.
Apart from the requirement to create clarity about data ownership and sharing, our case study also involved finding a technical process to get the data from the service provider. edX indeed offers an API for accessing student activity data. However, the provided methods are limited to the data perspectives imagined by the edX developers and are incompatible with the requirements of the TU Delft Learning Tracker. On request, edX offered the option of direct access to extract the underlying log data through an FTP server. This manual way of working is hardly optimized for continuous, preferably real-time data extraction, but it allowed us to initiate the case study implementation (a sketch of this extraction path follows below). On the KU Leuven side, the process of collecting data from edX needs to be further automated. An open question is how to anticipate data structure changes on the edX side, as the data format is meant for internal use and might be reorganized in the future.
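A rough sketch of this manual extraction path might look as follows. Host, credentials, and file paths are placeholders; we assume gzip-compressed JSON-lines tracking logs, and a production setup would prefer an encrypted transfer channel and scheduled runs.

```python
# Sketch of the manual extraction path: download a log dump from the
# provider's FTP server and parse it. Host, path, and file layout are
# placeholders; edX tracking logs are JSON lines, typically gzip-compressed.
import ftplib, gzip, io, json

def fetch_log(host, user, password, remote_path):
    """Download one remote log file into memory."""
    buf = io.BytesIO()
    with ftplib.FTP(host) as ftp:  # a real setup would prefer FTPS/SFTP
        ftp.login(user, password)
        ftp.retrbinary(f"RETR {remote_path}", buf.write)
    buf.seek(0)
    return buf

def parse_events(raw):
    """Yield one event dict per JSON line, skipping malformed lines."""
    with gzip.open(raw, mode="rt", encoding="utf-8") as fh:
        for line in fh:
            try:
                yield json.loads(line)
            except json.JSONDecodeError:
                continue  # log dumps occasionally contain truncated lines

# Usage (placeholder credentials and paths):
# raw = fetch_log("ftp.example.org", "stela", "secret", "/exports/tracking.log.gz")
# for event in parse_events(raw):
#     print(event.get("username"), event.get("event_type"))
```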
A related issue concerns the reverse flow: once the exported data has been transformed into information
that may be offered to students, how can this information be fed back to them? edX supports the Learning
Tools Interoperability (LTI) standard created by the IMS Global Learning Consortium. This standard was
designed to enable the sharing of tools across different learning systems. In our setup, the edX
environment is the LTI Tool Consumer and our Learning Tracker system is the LTI Tool Provider. When the
Learning Tracker is shown to the student, edX (the trusted consumer) passes a user identification string, which makes an extra authentication step on the provider side unnecessary.
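Conceptually, the provider side of such an LTI 1.x launch verifies the OAuth 1.0a signature computed with the shared consumer secret and then trusts the user identifier in the launch payload. The sketch below is a simplified, hand-rolled illustration (placeholder secret and URL, no nonce or timestamp checks); real deployments would normally rely on a vetted LTI library.

```python
# Conceptual sketch of an LTI 1.x launch on the Tool Provider side: verify
# the OAuth 1.0a HMAC-SHA1 signature sent by the consumer (edX), then trust
# the user_id it passes. Parameter names follow the LTI 1.x spec; secret and
# URL are placeholders. Production code should also check nonce/timestamp.
import base64, hashlib, hmac
from urllib.parse import quote

def oauth_signature(method, url, params, consumer_secret):
    enc = lambda s: quote(str(s), safe="~")  # RFC 3986 percent-encoding
    pairs = "&".join(f"{enc(k)}={enc(v)}"
                     for k, v in sorted(params.items()) if k != "oauth_signature")
    base = "&".join([method.upper(), enc(url), enc(pairs)])
    key = enc(consumer_secret) + "&"  # no token secret in an LTI launch
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

def handle_launch(url, form, consumer_secret):
    """Validate the signed launch POST and return the consumer-scoped user id."""
    expected = oauth_signature("POST", url, form, consumer_secret)
    if not hmac.compare_digest(expected, form.get("oauth_signature", "")):
        raise PermissionError("invalid LTI launch signature")
    # The trusted user_id replaces a separate login step on the provider side.
    return form["user_id"]
```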
4 WORKING ACROSS INSTITUTIONS
In this section, we discuss obstacles when working across higher education institutions and how to
overcome them (RQ2). First, we explain what you need to keep in mind when facilitating cross-border
European initiatives. Second, we point out how to handle external data subjects.
4.1 Facilitating cross-border European Initiatives
Research cooperation is common among European universities. Students, lecturers, and researchers are increasingly roaming from one institution to another, increasing the opportunities for teaming up. But when a research project directly involves the daily practice of the institutions involved, practical incompatibilities may start to surface.
If working together with institutions within a single region may already be complicated, working across (European) borders is unlikely to make matters easier. Despite the unification efforts of the Bologna Process, Higher Education Institutions (HEIs) from different European countries operate in dissimilar contexts. Education and general laws, culture, societal views on the role of education, organization of the institutions, and the role of the government are just a few examples of contextual areas that are likely to differ from one country to another, not least because education today is often influenced by local tradition.
While preparing the case study implementation, it became clear that the Austrian view on data privacy is stricter than the Belgian interpretation; privacy awareness is more strongly developed in Austrian and German culture. Notwithstanding the General Data Protection Regulation (GDPR), which will soon be in effect throughout the entire European Union, the interpretation of what is allowed and what is not turned out to be rather different. The Austrian reading, as translated into TU Graz internal policy, for instance, prescribes avoiding the integration of data from separate source systems.
The concept of processing data about Belgian students on its Austrian servers provoked resistance on the side of TU Graz, as it would put the internal IT department in a challenging position with respect to its policy. Consequently, the alternative of moving the case study implementation to the KU Leuven infrastructure was considered. However, this would require a TU Graz project member to access the server infrastructure of KU Leuven remotely. While there was no objection to this in principle, it turned out to be practically impossible to arrange without an existing employment relationship: the procedure to do so was nonexistent.
4.2 Handling external Data Subjects
The debate about ethics and privacy in Learning Analytics is growing. Skeptics question to what extent providers of education are entitled to study the learning behavior of their students. LA proponents, on the other hand, argue that it is the duty of educators to improve learning and that not using data to do so may be unethical. In most cases, however, the (implicit) focus of this debate is on data institutions collect and process about their own students, that is, students with whom the institution has some kind of formal engagement. It is not uncommon for students to sign a contract at registration that already contains certain agreements about if and how the institution may use learning traces for educational research or to improve its internal educational processes.
However, as is the situation in our case study, it is also not uncommon for higher education institutions to interact with prospective students prior to official registration. This complicates matters of privacy and ethics: in the absence of an agreement, it is less clear what data institutions can use to extend their mission of improving the quality of education to the orienting and transitional process. We therefore prefer to extract as little data as possible (data minimization) to enable the selected features of the Learning Tracker tool. This, for instance, does not require knowledge of the student's name or any other characteristics, besides some kind of user id or pseudonym, which is also required to feed the resulting charts back into the SPOC user interface; a sketch of this approach follows below.
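A minimal sketch of this data-minimization step is shown below. The field names and the keyed-hash pseudonym scheme are our own illustration, not a prescribed design; the point is that only a stable pseudonym and the activity counts needed for the charts leave the extraction step.

```python
# Data-minimization sketch: keep only a pseudonym plus the activity counts
# the feedback charts need, dropping names and other attributes. Field names
# and the keyed-hash scheme are illustrative assumptions.
import hashlib, hmac

SECRET = b"institution-held-key"  # placeholder; store outside the code base

def pseudonym(platform_user_id: str) -> str:
    """Stable pseudonym: the same input always maps to the same token, but
    the platform id cannot be recovered without the institution's key."""
    return hmac.new(SECRET, platform_user_id.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(record: dict) -> dict:
    """Retain only what the feedback charts need; discard everything else."""
    return {
        "pid": pseudonym(record["user_id"]),
        "videos_watched": record["videos_watched"],
        "quiz_attempts": record["quiz_attempts"],
    }

raw = {"user_id": "edx-4711", "name": "Jane Doe", "email": "jane@example.org",
       "videos_watched": 12, "quiz_attempts": 18}
print(minimize(raw))  # name and e-mail never leave the extraction step
```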
The external data subject issue is discussed in detail by [3], applied there to a shared positioning test for engineering students, co-organized by several universities. The proposed solution uses an anonymous feedback code that is provided to the students. In this approach, data subjects retain a large part of the data ownership and freely decide whether to transfer data across service providers or institutions.
5 CONCLUSION
The intention of this paper was to formulate lessons learned that the authors consider important for the future development and implementation of Learning Analytics initiatives. We have outlined obstacles when working with external providers (RQ1) and across institutions (RQ2), and proposed partial solutions to overcome them. Our aim is that implementers of Learning Analytics initiatives can benefit from these findings, adjust accordingly, and thereby save time and effort. Table 1 provides a summary of the questions that surfaced during our case study.
Table 1: Summary of surfacing questions.

Working with an external provider
- Data ownership: Who owns the data? The institution or the service provider?
- Data access: How to get data out of the external platform? Are APIs available and sufficient? Is direct data access possible? How to get information back into the systems? How to reach the end user? Is a standard (e.g. LTI) supported?

Working across institutions
- Working cross-border: How does the educational context differ from one partner to the other? In case of shared legislation, does the interpretation differ? What procedures are available to host another partner's data or to provide access to a researcher employed by another partner?
- External data subjects: To what extent can data from unregistered/prospective students be used to improve education and to feed information back to these students? If anonymous data is insufficient, is the use of pseudonymization tokens (e.g. feedback codes [3]) an alternative?
ACKNOWLEDGMENTS
This research project is co-funded by the European Commission's Erasmus+ program, in the context of the project 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD. The European Commission's support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors; the Commission cannot be held responsible for any use which may be made of the information contained therein.
Please visit our website http://stela-project.eu.
REFERENCES
[1] STELA project. http://stela-project.eu/ - last accessed January 30th, 2018.
[2] Baker, R. S. (2016). Stupid tutoring systems, intelligent humans. International Journal of Artificial
Intelligence in Education, 26(2), 600-614.
[3] Broos, T., Verbert, K., Langie, G., Van Soom, C., & De Laet, T. (2018, March). Multi-institutional
positioning test feedback dashboard for aspiring students: lessons learnt from a case study in
Flanders. In Proceedings of the Eighth International Learning Analytics & Knowledge Conference (pp.
1-6). ACM.
[4] Davis, D., Jivet, I., Kizilcec, R. F., Chen, G., Hauff, C., & Houben, G. J. (2017, March). Follow the
successful crowd: raising MOOC completion rates through social comparison at scale. In
Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 454-463).
ACM.
[5] Drachsler, H., & Greller, W. (2016, April). Privacy and analytics: it's a DELICATE issue. A checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 89-98). ACM.
[6] Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal
of Technology Enhanced Learning, 4(5-6), 304-317.
[7] Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning.
TechTrends, 59(1), 64-71.
[8] Jivet, I. (2016). The Learning tracker: a learner dashboard that encourages self-regulation in MOOC
learners.
[9] Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2012). The NMC
Horizon Report: 2012 Higher Education Edition. The New Media Consortium.
[10] Khalil, M., & Ebner, M. (2015, June). Learning analytics: principles and constraints. In EdMedia:
World Conference on Educational Media and Technology (pp. 1789-1799). Association for the
Advancement of Computing in Education (AACE).
[11] Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics,
3(1), 129-138.
[12] Leitner, P., & Ebner, M. (2017, July). Development of a Dashboard for Learning Analytics in Higher
Education. In International Conference on Learning and Collaboration Technologies (pp. 293-301).
Springer, Cham.
[13] Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education: a literature review. In Learning Analytics: Fundaments, Applications, and Trends (pp. 1-23). Springer, Cham.
[14] Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE
review, 46(5), 30.