“We’re Seeking Relevance”: Qualitative
Perspectives on the Impact of Learning
Analytics on Teaching and Learning
Tracie Farrell, Alexander Mikroyannidis, and Harith Alani
The Open University, UK
{tracie.farrell-frey, alexander.mikroyannidis, h.alani}@open.ac.uk
Abstract. Whilst a significant body of learning analytics research focuses on impact from the perspective of usability or improved learning outcomes, this paper proposes an approach based on Affordance Theory to describe awareness and intention as a bridge between usability and impact. Ten educators at three European institutions participated in detailed interviews on the affordances they perceive in using learning analytics to support practice in education. The evidence illuminates connections between an educator's epistemic beliefs about learning and the purpose of education, their perception of threats or resources in delivering a successful learning experience, and the types of data they would consider as evidence in recognizing or regulating learning. This evidence can support the learning analytics community in considering proximity to the student, the role of the educator, and their personal belief structure when developing robust analytics tools that educators may be more likely to utilize.
1 Introduction and Motivation
Learning analytics aims to leverage the "collection, measurement, analysis and reporting of data" to "understand and optimize learning" [7]. However, the
real impact of learning analytics has been difficult to determine, in particular
with respect to the effects of personal agency and a lack of standardization in
how tools are used [5].
The study presented in this paper adopted a qualitative approach to this
problem, based on Affordances, the "actionable properties" that an individual can perceive about a given object [9]. Educators' perceptions of the "actionable properties" of learning analytics were derived from how they spoke about using them, now or in the future. The aim of this study was to probe the ideological and practical assumptions of educators, to determine how these relate to their understanding of, and intention to use, learning analytics to support practice (personal
agency). This knowledge can assist the learning analytics research community
and other key stakeholders in making more accurate estimations of software en-
gineering requirements, more effective measurements and evaluations of impact,
and targeted approaches for deploying learning analytics tools.
2 Related Work
2.1 The Problem of Relevance
Institutions and educators are currently burdened with an abundance of data
about their educational contexts [5]. For example, technology is used to gather
and present trace data about learners’ activities on the web [15] or within virtual
learning environments (VLE) [1][8], to collect data about learners’ physiological
responses [2], and even to highlight social interactions in learning processes [13].
However, researchers have illustrated that educators are likely to be most inter-
ested in using analytics data to interrogate the efficacy of specific interventions
that they implement in their classrooms, whereas most of the tools with which
they are presented are complex and overshoot their requirements [6]. These re-
sults indicate a need for deeper investigation into what kinds of data matter, to whom they matter and why they matter, in order to support the search for relevance in this vast landscape of information.
2.2 Evaluating Impact
Challenges of relevance are also manifested in how real impact on practice is
evaluated. If the data is overwhelming, evaluations are likely to be either too
broad or too narrow to get an accurate picture of an educator’s intentions to
use a given tool, their understanding of its utility and their actual use of the tool
in an authentic environment. For example, research on disparities in how Learning Management System (LMS) tools are used showed that most of these disparities can be traced to specific tool, task and interface combinations [12]. At the other
end of the spectrum, a 2013 survey of 15 learning analytics dashboard applica-
tions for educators and learners found that evaluations of tools were primarily
organized around usability studies and efficacy in controlled environments [14].
In a usability study, the perceived utility of the object at the time of evaluation
is already provided to the user. This makes it difficult to ascertain how likely
an educator is to incorporate the tool into their practice, even if the educator
expresses confidence in the tool's utility. The knock-on effect of this tendency is that the research community knows much more about how tools could and should work than about how they actually do work [11].
3 Research Design
To prompt educators to articulate affordances, they were asked to reflect on their perceptions of challenges unique to their practice, their understanding of the "desired state" of successful learning, and the steps they believe are necessary to achieve it. The study addressed two research questions:
1. To what extent are educators able to perceive specific affordances of learning analytics? Are those affordances linked to the educator's domain?
2. What recommendations can be made to learning analytics researchers and
developers?
To probe these questions, we deployed a multi-stage, purposive sampling strat-
egy to gain access to educators from various types of institutions (formal and
non-formal), who embodied different roles within the institution (staff tutors,
associate lecturers, facilitators, module chairs, tutors, etc.). The term "educator" was defined as any individual involved directly in the process of working with learners or developing their curriculum. We conducted ten 60-minute interviews, concluding sampling through saturation and constant comparison among the transcripts. An inductive, qualitative analysis exposed and connected the research participants' perspectives [3]. We coded 1225 participant statements. A second rater coded a random subset of 150 participant statements from 6 of the 10 interviews. We calculated interrater reliability (IRR) using Cohen's kappa [4]. For the first coding procedure, kappa was .76, which rose to .87 after the two coders negotiated some of the wording for descriptions of general themes. While this study cannot generalize across
a large number of educators and institutions, it did provide a rich description of
educators’ perceptions and intentions with regard to using learning analytics.
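As an illustration of the reliability calculation described above, the following minimal sketch shows how Cohen's kappa can be computed over two raters' codes for the same statements. The theme labels and the use of Python with scikit-learn are assumptions for illustration only; they are not the study's actual codebook or tooling.

from sklearn.metrics import cohen_kappa_score

# Hypothetical theme labels assigned by each rater to the same subset
# of participant statements (the study's codebook is not reproduced here).
rater_1 = ["engagement", "performance", "emotion", "learning_design", "engagement"]
rater_2 = ["engagement", "performance", "engagement", "learning_design", "engagement"]

# Cohen's kappa corrects raw agreement for agreement expected by chance;
# values in the .76-.87 range, as reported above, indicate substantial agreement.
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.2f}")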
4 Findings
Participants consistently framed their arguments about the challenges they per-
ceive, their ideas of the "desired state", and the ways in which they monitor
their progress in terms of their personal beliefs about learning and the goal of
education. Goals tended to cluster around one of three general categories: to
develop strong minds, to prepare learners for practice and to satisfy the learner.
Domain differences were noted: the goal to develop strong minds was found exclusively among educators working in the social sciences, arts and humanities, whereas STEM (Science, Technology, Engineering and Mathematics) educators primarily described preparing learners for practice.
Educators with the goal of satisfying learners all had class sizes of 1000+ stu-
dents (regardless of platform or domain). The domain differences prompted us
to conduct an analysis of the modules in which the educators were involved,
using the learning design taxonomy provided by the Open University Learn-
ing Design Initiative [10] and comparing this to how educators described the
classroom experience in the interview evidence. There was consistency between goals and activities across all of the educators' interviews, indicating a conscious learning design on the part of the educator and an expression of their educational epistemology. Interview data suggested that educators with different educational epistemologies have significantly different priorities and viewpoints on challenges
and success in education. To triangulate these findings, we conducted a frequency
analysis of the open codes and discovered that educators with a shared episte-
mology also tend to share a similar perspective on challenges and desired states.
For example, learner background and agency are of particular concern to educators preparing learners for practice, whereas communication and interaction
are consistently mentioned as challenges by educators from the social sciences,
arts and humanities, who aim to develop strong minds. Analysis of educators’
statements of the “desired state” also mirrored educators’ goals. For example,
educators who are preparing students for practice tended to connect performance
with having a strong motivation for learning and identification with a specific
career objective. Educators who felt they were responsible for developing strong
minds tended to determine their success through energy and euphoria in the
classroom, particularly in the presence of lively, rich discussion.
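For illustration only, a minimal sketch of the kind of frequency analysis described above, cross-tabulating open codes by educational goal. The data and code labels are hypothetical, and the use of pandas is an assumption rather than the study's actual tooling.

import pandas as pd

# Hypothetical coded statements: each row is one participant statement
# with its open code and the educator's goal category.
statements = pd.DataFrame({
    "goal": ["preparing_for_practice", "preparing_for_practice",
             "developing_strong_minds", "developing_strong_minds",
             "satisfying_the_learner"],
    "code": ["learner_background", "learner_agency",
             "communication", "interaction", "use_of_resources"],
})

# How often each open code appears within each goal group,
# normalised per goal so groups of different sizes can be compared.
frequencies = pd.crosstab(statements["code"], statements["goal"], normalize="columns")
print(frequencies)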
4.1 Sources of Data and Affordances of Learning Analytics
An analysis of the kinds of data educators use or need also showed continuity from personal belief structure through to the affordances that educators perceived in learning analytics. For example, educators preparing their learners for practice appear to focus on the hard evidence that they can see, e.g. whether the learner is able to demonstrate a skill or is active in the VLE. While
they did show interest in the personal lives of learners, in terms of stress and
time management, educators did not see many opportunities for gathering data
about learner emotions, unless the student provided it directly. Thus, educators
preparing for practice relied more on institutional analytics that predict learner
performance or activity. Educators who wished to develop strong minds focused
much more on their intuitions about learners and what they can observe in the
class. Educators in this category had sincere and significant reservations about
how their learners are assessed and whether or not it is a meaningful measure of
what they have learned. For this reason, educators with this goal wondered if in-
stitutional analytics could collect enough relevant data to support their practice.
Figure 1 shows the breakdown of mentions of learning analytics by educational
goal. Seven major themes were identified in the transcripts regarding how educators
use learning analytics: to understand learner engagement, learner performance,
learner motivation and use of resources, to uncover more about the social in-
teractions between learners, to interrogate and modify learning design, and to
predict performance.
One unexpected finding was that only educators in senior roles or with class
sizes of 1000+ provided unprompted affordances of learning analytics for under-
standing or improving their practice. This included predictive analytics, which
was surprising because tutors and assistant lecturers are responsible for making
interventions on the basis of the predictions. Instead, these participants described having access to such data as overwhelming and were unsure how to interpret it or develop a response. A preference for having the management of this data lie outside their own remit was evident.
[Figure 1: bar chart of the percentage of mentions (approximately 0-30%) of learning analytics themes, such as learner engagement, prediction, use of resources, social connections, learning design, and emotion, broken down by educational goal: Developing Strong Minds, Learner Satisfaction, and Preparing for Practice.]

Fig. 1: Affordances by Educational Goal
5 Recommendations for Learning Analytics Research
The tension between having too much and too little data, as described in the
previous section on related work, was reflected in our findings. Educators are
looking for relevant data that is appropriate for their role within the institution,
makes sense within the context of their domain and their learning design, and
meets their specific needs with regard to those contexts. To help educators reduce
cognitive load in dealing with analytics data, we recommend that developers and
institutions begin to filter educator requirements according to epistemology and
learning design. With a clear line from goal to outcome, the impact of learning analytics tools (as a source of actionable information) would be much easier to understand and evaluate. It would also provide a mechanism for refining specific analytics that interrogate certain types of learning designs and classroom orchestrations. Finally, it would make it easier for institutions and developers to build stakeholder buy-in for learning analytics initiatives, by targeting tools
toward the most appropriate academic communities.
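As a purely illustrative sketch of this recommendation, the following shows how a dashboard could filter its components according to an educator's stated goal. The goal categories echo our findings, but the mapping to analytics themes, the widget names, and the interface are hypothetical rather than an implemented system.

# Hypothetical mapping from an educator's stated goal (epistemology)
# to the analytics themes participants associated with that goal.
GOAL_TO_THEMES = {
    "preparing_for_practice": {"learner_performance", "prediction", "use_of_resources"},
    "developing_strong_minds": {"social_connections", "learner_engagement", "emotion"},
    "satisfying_the_learner": {"learner_engagement", "learning_design"},
}

def filter_dashboard(widgets, educator_goal):
    # widgets is a list of (widget_name, theme) pairs; both the names
    # and this filtering interface are illustrative assumptions.
    relevant = GOAL_TO_THEMES.get(educator_goal, set())
    return [name for name, theme in widgets if theme in relevant]

widgets = [
    ("at-risk predictions", "prediction"),
    ("forum interaction graph", "social_connections"),
    ("VLE activity over time", "learner_engagement"),
]
print(filter_dashboard(widgets, "developing_strong_minds"))
# -> ['forum interaction graph', 'VLE activity over time']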
At the time of writing, the authors are concluding a more in-depth case study
of the Open University UK (OU), in which several learning analytics initiatives
have already been launched and evaluated (see http://www.open.ac.uk/iet/main/research-innovation/learning-analytics). This case study involves both educators and students, connecting affordances of learning analytics with personal
educational goals to gather more information for specific software engineering
requirements within the OU.
6 Conclusion
The research study described in this paper was designed to explore connections
between educators’ beliefs about their work and how they perceive and utilize
learning analytics. Applying Affordance Theory to the evidence highlighted how
the participants in the study currently use learning analytics and their spe-
cific reasons for doing so. The findings indicate that an educator's personal background and belief structure, professional domain, and role within an institution all play a part in their willingness and ability to use learning analytics
as a resource for understanding and optimizing learning. The learning analytics
community can use this research to help filter requirements and provide more
targeted tools that assist educators in fulfilling the responsibilities of their role.
References
1. Arnold, K.E., Pistilli, M.D.: Course Signals at Purdue: Using learning analytics to
increase student success. In: Proceedings of the 2nd international conference on
learning analytics and knowledge. pp. 267–270. ACM (2012)
2. Blikstein, P.: Multimodal learning analytics. In: Proceedings of the third interna-
tional conference on learning analytics and knowledge. pp. 102–106. ACM (2013)
3. Charmaz, K.: Grounded theory methods in social justice research. The Sage hand-
book of qualitative research 4, 359–380 (2011)
4. Cohen, J.: A coefficient of agreement for nominal scales. Educational and psycho-
logical measurement 20(1), 37–46 (1960)
5. Dawson, S., Gašević, D., Siemens, G., Joksimović, S.: Current state and future
trends: A citation network analysis of the learning analytics field. In: Proceedings
of the fourth international conference on learning analytics and knowledge. pp.
231–240. ACM (2014)
6. Dyckhoff, A.L., Zielke, D., Bültmann, M., Chatti, M.A., Schroeder, U.: Design and
implementation of a learning analytics toolkit for teachers. Educational Technology
& Society 15(3), 58–76 (2012)
7. Ferguson, R.: Learning analytics: drivers, developments and challenges. Interna-
tional Journal of Technology Enhanced Learning 4(5-6), 304–317 (2012)
8. Kuzilek, J., Hlosta, M., Herrmannova, D., Zdrahal, Z., Wolff, A.: OU Analyse: Analysing at-risk students at the Open University. Learning Analytics Review pp.
1–16 (2015)
9. Norman, D.A.: Affordance, conventions, and design. interactions 6(3), 38–43 (1999)
10. Rienties, B., Toetenel, L., Bryan, A.: Scaling up learning design: impact of learning design activities on LMS behavior and performance. In: Proceedings of the Fifth In-
ternational Conference on Learning Analytics And Knowledge. pp. 315–319. ACM
(2015)
11. Rodriguez Triana, M.J., Prieto Santos, L.P., Vozniuk, A., Shirvani Boroujeni, M.,
Schwendimann, B.A., Holzer, A.C., Gillet, D.: Monitoring, awareness and reflection
in blended technology enhanced learning: a systematic review. Tech. rep. (2016)
12. Schoonenboom, J.: Using an adapted, task-level technology acceptance model to
explain why instructors in higher education intend to use some learning manage-
ment system tools more than others. Computers & Education 71, 247–256 (2014)
13. Shum, S.B., Ferguson, R.: Social learning analytics. Educational technology &
society 15(3), 3–26 (2012)
14. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics
dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
15. Winne, P.H., Hadwin, A.F.: nStudy: Tracing and supporting self-regulated learning
in the internet. In: International handbook of metacognition and learning technolo-
gies, pp. 293–308. Springer (2013)