“We’re Seeking Relevance”: Qualitative
Perspectives on the Impact of Learning
Analytics on Teaching and Learning
Tracie Farrell, Alexander Mikroyannidis, and Harith Alani
Open University UK
{tracie.farrell-frey, alexander.mikroyannidis, h.alani}
Abstract. Whilst a significant body of learning analytics research tends to focus on impact from the perspective of usability or improved learning outcomes, this paper proposes an approach based on Affordance Theory to describe awareness and intention as a bridge between usability and impact. Ten educators at three European institutions participated in detailed interviews on the affordances they perceive in using learning analytics to support practice in education. The evidence illuminates connections between an educator’s epistemic beliefs about learning and the purpose of education, their perception of threats or resources in delivering a successful learning experience, and the types of data they would consider as evidence in recognizing or regulating learning. This evidence can support the learning analytics community in considering the proximity to the student, the role of the educator, and their personal belief structure in developing robust analytics tools that educators may be more likely to utilize.
1 Introduction and Motivation
Learning analytics intends to leverage the “collection, measurement, analysis and reporting of data” to “understand and optimize learning” [7]. However, the real impact of learning analytics has been difficult to determine, in particular with respect to the effects of personal agency and a lack of standardization in how tools are used [5].
The study presented in this paper adopted a qualitative approach to this problem, based on Affordances, the “actionable properties” that an individual can perceive about a given object [9]. Educators’ perceptions of the “actionable properties” of learning analytics were derived from how they spoke about using them, now or in the future. The aim of this study was to probe the ideological and practical assumptions of educators, to determine how these relate to their understanding of, and intention to use, learning analytics to support practice (personal agency). This knowledge can assist the learning analytics research community
and other key stakeholders in making more accurate estimations of software en-
gineering requirements, more effective measurements and evaluations of impact,
and targeted approaches for deploying learning analytics tools.
2 Related Work
2.1 The Problem of Relevance
Institutions and educators are currently burdened with an abundance of data
about their educational contexts [5]. For example, technology is used to gather
and present trace data about learners’ activities on the web [15] or within virtual
learning environments (VLE) [1][8], to collect data about learners’ physiological
responses [2], and even to highlight social interactions in learning processes [13].
However, researchers have illustrated that educators are likely to be most inter-
ested in using analytics data to interrogate the efficacy of specific interventions
that they implement in their classrooms, whereas most of the tools with which
they are presented are complex and overshoot their requirements [6]. These re-
sults indicate a necessity for deeper investigation into what kinds of data matter,
to whom they matter and why they matter to support the search for relevance
in this vast landscape of information.
2.2 Evaluating Impact
Challenges of relevance are also manifested in how real impact on practice is
evaluated. If the data is overwhelming, evaluations are likely to be either too
broad or too narrow to get an accurate picture of an educator’s intentions to
use a given tool, their understanding of its utility and their actual use of the tool
in an authentic environment. For example, research on disparities in how Learn-
ing Management System (LMS) tools are used showed that most disparities can
be related to specific tool, task and interface combinations [12]. At the other
end of the spectrum, a 2013 survey of 15 learning analytics dashboard applica-
tions for educators and learners found that evaluations of tools were primarily
organized around usability studies and efficacy in controlled environments [14].
In a usability study, the perceived utility of the object at the time of evaluation
is already provided to the user. This makes it difficult to ascertain how likely
an educator is to incorporate the tool into their practice, even if the educator
expresses confidence in the tool’s utility. The knock-on effect of this tendency
is that the research community knows much more about how tools could and
should work, than how they do work [11].
3 Research Design
To prompt educators to articulate affordances, they were asked to reflect on their perceptions of challenges unique to their practice, their understanding of the “desired state” of successful learning and the steps they believe are necessary to achieve it. The study addressed two research questions:
1. To what extent are educators able to perceive specific affordances of learning analytics? Will those affordances be linked to the educator’s domain?
2. What recommendations can be made to learning analytics researchers and developers?
To probe these questions, we deployed a multi-stage, purposive sampling strategy to gain access to educators from various types of institutions (formal and
non-formal), who embodied different roles within the institution (staff tutors,
associate lecturers, facilitators, module chairs, tutors, etc.). The term “educa-
tor” was defined as any individual involved directly in the process of working
with learners or developing their curriculum. We conducted ten 60-minute interviews, concluding sampling through saturation and constant comparison among
the transcripts. An inductive, qualitative analysis exposed and connected the
research participants’ perspectives [3]. We coded 1225 participant statements.
A second rater coded a random subset of 150 participant statements from 6 of
the 10 interviews. We calculated inter-rater reliability (IRR) using Cohen’s kappa [4]. For the first coding procedure, kappa was .76, which rose to .87 after the two raters negotiated some of the wording for descriptions of general themes. While this study cannot generalize across
a large number of educators and institutions, it did provide a rich description of
educators’ perceptions and intentions with regard to using learning analytics.
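The inter-rater check described above can be reproduced with a short script. This is a minimal sketch, not the authors' analysis code: the two raters' code sequences below are hypothetical stand-ins for the 150 double-coded statements, and the theme labels are illustrative.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal codes to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal code frequencies.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters coding ten statements into three themes.
a = ["engagement", "performance", "engagement", "design", "design",
     "engagement", "performance", "design", "engagement", "performance"]
b = ["engagement", "performance", "engagement", "design", "engagement",
     "engagement", "performance", "design", "engagement", "design"]
print(round(cohens_kappa(a, b), 2))
```

Kappa corrects raw percent agreement for the agreement expected by chance given each rater's code frequencies, which is why a negotiated codebook (as in the study) can raise it substantially.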
4 Findings
Participants consistently framed their arguments about the challenges they per-
ceive, their ideas of the “desired state”, and the ways in which they monitor
their progress in terms of their personal beliefs about learning and the goal of
education. Goals tended to cluster around one of three general categories: to
develop strong minds, to prepare learners for practice and to satisfy the learner.
Domain differences were noted in that the goal to develop strong minds was
exclusively found among educators working in the social sciences, arts and humanities. STEM (Science, Technology, Engineering and Mathematics) educators primarily described preparing learners for practice.
Educators with the goal of satisfying learners all had class sizes of 1000+ stu-
dents (regardless of platform or domain). The domain differences prompted us
to conduct an analysis of the modules in which the educators were involved,
using the learning design taxonomy provided by the Open University Learn-
ing Design Initiative [10] and comparing this to how educators described the
classroom experience in the interview evidence. There was consistency between goals and activities across all of the educators’ interviews, indicating a conscious learning design on the part of the educator and an expression of their educational epistemology. Interview data suggested that educators with different educational epistemologies have markedly different priorities and viewpoints on challenges
and success in education. To triangulate these findings, we conducted a frequency
analysis of the open codes and discovered that educators with a shared episte-
mology also tend to share a similar perspective on challenges and desired states.
For example, learner background and agency are of particular concern to educators preparing learners for practice, whereas communication and interaction
are consistently mentioned as challenges by educators from the social sciences,
arts and humanities, who aim to develop strong minds. Analysis of educators’
statements of the “desired state” also mirrored educators’ goals. For example,
educators who are preparing students for practice tended to connect performance
with having a strong motivation for learning and identification with a specific
career objective. Educators who felt they were responsible for developing strong
minds tended to determine their success through energy and euphoria in the
classroom, particularly in the presence of lively, rich discussion.
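The triangulating frequency analysis of open codes can be sketched as follows. The data here is hypothetical (the study's coded transcripts are not public), and the goal and code labels are illustrative stand-ins; the point is only the mechanic of tallying open codes per educational goal and inspecting the most frequent ones.

```python
from collections import Counter

# Hypothetical (goal, open_code) pairs from coded interview statements.
coded_statements = [
    ("prepare_for_practice", "learner_background"),
    ("prepare_for_practice", "learner_agency"),
    ("prepare_for_practice", "learner_background"),
    ("develop_strong_minds", "communication"),
    ("develop_strong_minds", "interaction"),
    ("develop_strong_minds", "communication"),
    ("satisfy_learner", "resource_use"),
]

def code_frequencies(statements):
    """Tally open-code frequencies separately for each educational goal."""
    freqs = {}
    for goal, code in statements:
        freqs.setdefault(goal, Counter())[code] += 1
    return freqs

freqs = code_frequencies(coded_statements)
for goal, counter in freqs.items():
    # The top codes per goal indicate the challenges that group emphasizes.
    print(goal, counter.most_common(2))
```

A shared top code across educators with the same goal, as in the study's finding that communication and interaction dominate for the "develop strong minds" group, is what the frequency analysis surfaces.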
4.1 Sources of Data and Affordances of Learning Analytics
An analysis of the kinds of data educators use or need also showed continu-
ity from personal belief structure, through to the affordances that educators
perceived in learning analytics. For example, educators preparing their learners
for practice appear to focus on the hard evidence that they can see, e.g. if the
learner is able to demonstrate skill, if the learner is active in the VLE. While
they did show interest in the personal lives of learners, in terms of stress and
time management, educators did not see many opportunities for gathering data
about learner emotions, unless the student provided it directly. Thus, educators
preparing for practice relied more on institutional analytics that predict learner
performance or activity. Educators that wished to develop strong minds focused
much more on their intuitions about learners and what they can observe in the
class. Educators in this category had sincere and significant reservations about
how their learners are assessed and whether or not it is a meaningful measure of
what they have learned. For this reason, educators with this goal wondered if in-
stitutional analytics could collect enough relevant data to support their practice.
Figure 1 shows the breakdown of mentions of learning analytics by educational
goal. Seven major themes were identified in the transcripts regarding how educators
use learning analytics: to understand learner engagement, learner performance,
learner motivation and use of resources, to uncover more about the social in-
teractions between learners, to interrogate and modify learning design, and to
predict performance.
One unexpected finding was that only educators in senior roles or with class sizes of 1000+ provided unprompted affordances of learning analytics for understanding or improving their practice. This included predictive analytics, which was surprising because tutors and associate lecturers are responsible for making interventions on the basis of the predictions. Instead, these participants described having access to such data as overwhelming and were unsure how to interpret it or develop a response. A preference for having the management of this data lie outside their own remit was evident.
Fig. 1: Affordances by Educational Goal
5 Recommendations for Learning Analytics Research
The tension between having too much and too little data, as described in the
previous section on related work, was reflected in our findings. Educators are
looking for relevant data that is appropriate for their role within the institution,
makes sense within the context of their domain and their learning design, and
meets their specific needs with regard to those contexts. To help educators reduce
cognitive load in dealing with analytics data, we recommend that developers and
institutions begin to filter educator requirements according to epistemology and
learning design. With a clear line from goal to outcome, the path to understand-
ing the impact of learning analytics tools (as a source of actionable information)
would be much clearer. It would also provide a mechanism for refining specific
analytics that interrogate certain types of learning designs and classroom or-
chestrations. Finally, it would also make it easier for institutions and developers
to build stakeholder buy-in for learning analytics initiatives, by targeting tools
toward the most appropriate academic communities.
At the time of writing, the authors are concluding a more in-depth case study
of the Open University UK (OU), in which several learning analytics initiatives
have already been launched and evaluated. This case study involves both educators and students, connecting affordances of learning analytics with personal
educational goals to gather more information for specific software engineering
requirements within the OU.
6 Conclusion
The research study described in this paper was designed to explore connections
between educators’ beliefs about their work and how they perceive and utilize
learning analytics. Applying Affordance Theory to the evidence highlighted how
the participants in the study currently use learning analytics and their spe-
cific reasons for doing so. The findings indicate that an educator’s personal,
background and belief structure, professional domain, and role within an insti-
tution all play a part in their willingness and ability to use learning analytics
as a resource for understanding and optimizing learning. The learning analytics
community can use this research to help filter requirements and provide more
targeted tools that assist educators in fulfilling the responsibilities of their role.
References

1. Arnold, K.E., Pistilli, M.D.: Course Signals at Purdue: Using learning analytics to increase student success. In: Proceedings of the 2nd international conference on learning analytics and knowledge. pp. 267–270. ACM (2012)
2. Blikstein, P.: Multimodal learning analytics. In: Proceedings of the third interna-
tional conference on learning analytics and knowledge. pp. 102–106. ACM (2013)
3. Charmaz, K.: Grounded theory methods in social justice research. The Sage hand-
book of qualitative research 4, 359–380 (2011)
4. Cohen, J.: A coefficient of agreement for nominal scales. Educational and psycho-
logical measurement 20(1), 37–46 (1960)
5. Dawson, S., Gašević, D., Siemens, G., Joksimovic, S.: Current state and future
trends: A citation network analysis of the learning analytics field. In: Proceedings
of the fourth international conference on learning analytics and knowledge. pp.
231–240. ACM (2014)
6. Dyckhoff, A.L., Zielke, D., Bültmann, M., Chatti, M.A., Schroeder, U.: Design and
implementation of a learning analytics toolkit for teachers. Educational Technology
& Society 15(3), 58–76 (2012)
7. Ferguson, R.: Learning analytics: drivers, developments and challenges. Interna-
tional Journal of Technology Enhanced Learning 4(5-6), 304–317 (2012)
8. Kuzilek, J., Hlosta, M., Herrmannova, D., Zdrahal, Z., Wolff, A.: OU Analyse: analysing at-risk students at the Open University. Learning Analytics Review pp. 1–16 (2015)
9. Norman, D.A.: Affordance, conventions, and design. Interactions 6(3), 38–43 (1999)
10. Rienties, B., Toetenel, L., Bryan, A.: Scaling up learning design: impact of learning design activities on LMS behavior and performance. In: Proceedings of the Fifth International Conference on Learning Analytics And Knowledge. pp. 315–319. ACM (2015)
11. Rodríguez-Triana, M.J., Prieto Santos, L.P., Vozniuk, A., Shirvani Boroujeni, M.,
Schwendimann, B.A., Holzer, A.C., Gillet, D.: Monitoring, awareness and reflection
in blended technology enhanced learning: a systematic review. Tech. rep. (2016)
12. Schoonenboom, J.: Using an adapted, task-level technology acceptance model to
explain why instructors in higher education intend to use some learning manage-
ment system tools more than others. Computers & Education 71, 247–256 (2014)
13. Shum, S.B., Ferguson, R.: Social learning analytics. Educational technology &
society 15(3), 3–26 (2012)
14. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics
dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
15. Winne, P.H., Hadwin, A.F.: nStudy: Tracing and supporting self-regulated learning in the internet. In: International handbook of metacognition and learning technologies, pp. 293–308. Springer (2013)