Shaping learning analytics technology through human-centredness
Carla Barreiros 1,3, Philipp Leitner 2, Martin Ebner 2, and Stefanie Lindstaedt 1,3
1 Institute of Interactive Systems and Data Science, Graz University of Technology, Sandgasse 36, Graz, Austria
2 Department of Educational Technology, Graz University of Technology, Graz, Austria
3 Know-Center GmbH, Sandgasse 36, 4th floor, 8010 Graz, Austria
Abstract
Learning Analytics (LA) researchers and practitioners are increasingly interested in applying
human-centred design methods and techniques to design LA technology. This approach finds
solutions by involving the perspectives of students, teachers, and other educational
stakeholders in all steps of the process. It enables the creation of technology that resonates
with and is tailored to the end-users' needs.
The "Learning Analytics – Students in Focus" project aims to support the learning and teaching
process in the higher education context. Our interdisciplinary team focuses on LA technology
that facilitates acquiring and developing students' self-regulated learning skills, such as goal
setting, planning, monitoring progress, and reflecting. We have embraced a Human-Centred
Learning Analytics (HCLA) approach since the start of our project, and it has helped us to
understand students' points of view and needs and to find solutions together.
This article summarises the design process of a LA tool named Planner, which aims to support
students in planning and monitoring coursework. We share our experience with various
methods and techniques applied in our research and present insights about the benefits and
limitations of the HCLA approach. Finally, we highlight how the HCLA approach helped to
build a LA community at our university and promote trust towards LA.
Keywords
Learning analytics (LA), human-centred learning analytics (HCLA), educational dashboards,
self-regulated learning
1. Introduction
Education and technology are increasingly intertwined as technological advancements drive digital
transformation across the higher education landscape. Higher education institutions cannot overlook
how learning analytics (LA) provides innovative ways of supporting students, teachers, and other
educational stakeholders.
The "Learning Analytics – Students in Focus" project provides an initial holistic and comprehensive
view of LA at Austrian universities, contributing to the research and development of LA technology. The
project brought together an interdisciplinary team of LA and pedagogy researchers, TEL practitioners,
data scientists, and ethics and data protection experts from the Graz University of Technology, the
University of Graz, and the University of Vienna. The team investigates LA technology that fosters
students' academic success through the development of self-regulated learning (SRL) skills
[15][16][17][8]. Examples of SRL skills are setting goals for learning, concentrating on instruction,
Proceedings of 4th International Workshop on Human-Centred Learning Analytics (HCLA) co-located with the 13th International Learning
Analytics and Knowledge Conference (LAK2023), Virtual, March 13, 2023.
EMAIL: carla.soutabarreiros@tugraz.at (A. 1); philipp.leitner@tugraz.at (A. 2); martin.ebner@tugraz.at (A. 3); lindstaedt@tugraz.at (A. 4)
ORCID: 0000-0002-2578-3158 (A. 1); 0000-0001-8883-6758 (A. 2); 0000-0001-5789-5296 (A. 3); 0000-0003-3039-2255 (A. 4)
2020 Copyright for this paper by its authors.
Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)
using effective strategies to organise ideas, using resources effectively, monitoring performance,
managing time effectively, and holding positive beliefs about one's capabilities [18].
Within the context of the “Learning Analytics – Students in Focus” project, we designed, developed,
and evaluated several learning analytics tools. These learning analytics tools are combined in a course-
level educational dashboard prototype named “Learner’s Corner”. As with any other LA dashboard, we
utilise several information visualisation techniques to present indicators about the student(s), the
learning process, and the learning context. However, the Learner’s Corner is an interactive dashboard,
and the different LA tools allow users to interact with the data by managing, tracking, analysing,
monitoring and displaying key educational metrics. The Learner’s Corner dashboard prototype is
integrated into our institutional Moodle platform, and each tool is a Moodle widget. The Learner’s
Corner dashboard consists of students' and teachers' views. This article focuses on the learning analytics
Planner tool, which aims to support students in planning and monitoring coursework during the
semester. The Planner tool is used to exemplify our research methodology and the application of a
human-centred learning analytics (HCLA) approach. HCLA draws from well-established research
fields, such as Human-Computer Interaction (HCI) and information visualisation, which offer a rich
mix of methodologies and techniques applicable to the learning analytics context. We share our
experiences and present insights about the benefits and perceived limitations of the HCLA approach. Finally,
we highlight how the HCLA approach helped to build a LA community at our university and promote
trust towards LA.
2. Research Methodology
The Design Science Research approach guides our research through three cycles of activities, i.e.,
the relevance cycle, the design cycle, and the rigour cycle [6]. See Figure 1. In the relevance cycle, we
identified opportunities and problems related to the students’ SRL skills, and we defined the acceptance
criteria of the research results obtained while evaluating the artefacts. The rigour cycle ensured that the
produced designs were innovative and contributed to the existing knowledge base and theories.
Therefore, it was necessary to investigate pre-existing and applicable research work. During the design
cycle, we developed and evaluated the artefacts using the evaluation criteria defined in the relevance
cycle and the design theories, evaluation methods, and theories gathered in the rigour cycle.
Figure 1: The three design science research cycles: the relevance cycle, the design cycle, and the rigour
cycle. Adapted from Hevner [6].
We utilised various data collection tools in our research, which varied in complexity and
targeted different types of information. We used both quantitative and qualitative research methods.
In this paper, however, we focus on the qualitative research methods applied during the
design cycle. These qualitative methods involve an interpretive and naturalistic approach to uncover
needs, opinions, behaviours, attitudes, feelings, and motivations. As depicted in Figure 2, the
qualitative research methods used during the design cycle are:
- the organisation of several focus groups with interdisciplinary experts,
- one co-design workshop with our target end-user, the students,
- two workshops to collect initial feedback, one with students and another with our university
educational technology experts that support the LMS at our institution,
- interviews with students, students’ union representatives, and teachers to gather detailed
feedback about the design.
Figure 2: Examples of the qualitative research methods used during the design cycle.
2.1. Focus Groups
We organised one initial focus group targeting interdisciplinary experts. The focus group aimed to
gain insights based on the experts' personal experiences, attitudes and behaviours towards coursework
management, time management, and self-regulated learning skills.
For the experts’ focus group, we invited eight interdisciplinary experts, i.e., three teachers, one ethics
expert, two legal experts, one TEL expert, and one LA researcher who also acted as the group facilitator.
The focus group met online for two hours. Throughout the session, the facilitator encouraged people to
share their points of view on issues such as course organisation and learning activities, workload, course
milestones, pedagogical strategies, communication, students’ performance, and self-regulated learning
skills, including time management. The group worked together to summarise and synthesise the
discussion using a collaborative text document. This artefact was pivotal to defining the LA tools
requirements from a pedagogical perspective and led to the first paper prototype version.
During the project, we organised several follow-up focus groups to further discuss the project use
cases and prototypes. The presence of ethical and legal experts helped to validate the use cases from an
ethical and lawful perspective. In fact, the use case document template was adapted to answer concrete
questions related to matters such as which data will be collected, why it will be collected, who has
access to the data, and where and how long the data will be stored.
The outcome of these focus groups greatly influenced the design of the Learner’s Corner dashboard,
including the Planner tool.
2.2. Workshops
We organised one co-design workshop and two workshops where we performed initial evaluations
of the Learner’s Corner dashboard, including the Planner tool. Next, we describe each workshop's goals,
the procedures followed, and the primary outcomes.
2.2.1. Students’ Co-design Workshop
This co-design workshop aimed to include the students’ perspectives and generate new ideas for the
Learner’s Corner dashboard, including the Planner tool. The workshop participants were invited
through the TU Graz students' union, which helped reach out to students of different study programs
and levels. Eight students actively participated in the co-design workshop, which took place online and
lasted three hours.
The workshop moderator started by describing the context of the project and the workshop goals
and rules, and called for active participation and creativity. The moderator also
presented the agenda, which included one hour dedicated to the Planner tool. We used a video
conferencing system, a cloud-based collaborative design tool, i.e. Figma, and a visual collaboration
whiteboard, i.e., Miro.
The moderator started by showing a low-fidelity prototype of the Planner tool, pointing out the
intended features. The students were invited to use the Miro board by adding sticky notes with thoughts and
ideas. We provided a simple template for organising contributions around concrete topics, but students
were instructed to use the whiteboard freely. The provided template contained some trigger topics, e.g.,
feedback about the planner milestones, the milestones colour encoding, the automatic notifications, the
interface, “things that I really like”, “things that I do not like”, “new features I would like to see
implemented”. Figure 3 depicts the Miro board used during the session.
Next, we asked the students to use Figma to collaboratively modify the Planner prototype:
improving the features they appreciated, finding solutions to the problems identified, and even
realising some of the newly proposed features. At the end of the workshop, the students said they
were happy to collaborate in designing such tools and noted how important it is to have the chance
to communicate their needs and opinions regarding LA.
This co-design workshop had a substantial effect on the design of the Planner tool. We adopted many
of the students' ideas and integrated them into the Planner prototype, e.g., showing quick information
about a milestone when the student hovers the mouse over it.
Figure 3: Students used a Miro whiteboard during the co-design workshop across several topics.
2.2.2. Time management Workshop
TU Graz organised and promoted an online time management workshop for students. An external
expert presented the content, which included time management strategies and digital tools. The
workshop lasted three hours and was attended by seventy-four students.
In collaboration with the expert, we included several activities related to our research in the
workshop, including an online time management questionnaire [1] (short-range planning, time attitudes,
and long-range planning), a presentation of the Planner tool prototype, and a round of discussion to
collect feedback about the Planner tool.
The workshop allowed us to learn more about the students’ problems and needs regarding
time management. Particularly interesting were the first-hand testimonies on how the students organise
their time to manage the coursework, when and how they keep track of the course milestones, and which
digital or analogue tools they use. Most importantly, this workshop allowed an initial evaluation of the
students’ view of the Planner tool.
2.2.3. TU Graz TeachCenter support team Workshop
TU Graz TeachCenter is the learning management system (Moodle based) of TU Graz, which
students and teachers broadly use. The Learner’s Corner dashboard prototype is integrated into the TU
Graz TeachCenter. Therefore, we considered it appropriate to organise a workshop with the team of the
Department of Educational Technology that supports teachers and teaching staff in setting up courses
and making use of all the available applications.
The online workshop lasted two hours and had five participants. The moderator started
by presenting the project and the Learner’s Corner dashboard, including the Planner tool. We used a
collaborative text document to record the discussions and feedback. The participants could interact with
the Learner’s Corner, pose questions, and comment during this explorative process.
In addition to the feedback received on the tools in both the students’ and teachers’ views, we
acquired valuable information about how teachers and teaching staff use the TeachCenter. We also
discussed the importance of ensuring teachers see the benefit of activating the LA tools in their courses.
2.3. Interviews
During the course of our research, we interviewed students, students’ union representatives, teachers, and
other educational stakeholders to gain detailed information, e.g., needs, preferences, and feedback. We
opted for semi-structured interviews, which allowed us to pose open-ended questions about specific
topics, but with the freedom to explore participants' responses. We also made partial use of
storytelling techniques during our interviews, as we found these stories offered surprising
perspectives and hidden knowledge.
For example, we wanted to investigate when, how, and with which tools students plan their
coursework. So, instead of posing a direct question, we presented a scenario and used small prompts to
direct the conversation. “Consider that you just attended the first class for a course where the teacher
presented the course's content, organisation and grading schema. What do you usually do? Do you plan
your coursework? If so, tell us how? If not, why? When do you do it? How do you do it?”. Questions
such as this typically led the student to recall past experiences and construct a rich story-like narrative
full of details. Some students even showed artefacts such as calendars, Excel sheets, and hand-written
documents with their plans.
We found these interviews valuable to deepen previously gained knowledge, e.g., through focus
groups and workshops.
3. Learner’s Corner dashboard - Planner tool
The Planner tool provides an overview of the milestones of a course set by the teacher and the
personal milestones set by the student. The Planner tool supports students in planning their coursework
and managing their time. The current version of the Planner tool results from an iterative human-centred
design process developed over three years. Figure 4 depicts the evolution from the initial paper
prototype, to the low-fidelity prototype, to the high-fidelity prototype.
The current version presents the course’s milestones along the semester timeline, each
represented as a circle. At the top of the timeline are the milestones defined by the teacher, and at
the bottom are the personal milestones defined by the student. All milestones have properties such
as a title, date and time, and completion status (completed, incomplete), allowing the students to
monitor their progress across the coursework. A legend is presented below the timeline to facilitate
understanding of the visual information.
The milestone type is indicated by a letter inside the circle. We use a traffic light metaphor to
encode the milestones' completion status: a completed milestone is shown in green; an incomplete
milestone is shown in yellow if its deadline is approaching, and in red if its deadline has passed.
Finally, a milestone is shown in grey if there is nothing to remark on. The student can also identify
graded milestones (star over the circle) and mandatory milestones (darker circle border).
Figure 4: Evolution from the initial paper prototype (left), to the low-fidelity prototype (middle), to
the high-fidelity prototype (right).
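To make the colour encoding concrete, the traffic-light logic can be sketched as follows. This is an illustrative sketch only: the type names and the three-day "approaching deadline" window are our own assumptions, not the actual Planner implementation.

```typescript
// Illustrative sketch of the Planner's traffic-light status encoding.
// Names and the "approaching" threshold are assumptions for
// illustration, not the actual implementation.

type StatusColour = "green" | "yellow" | "red" | "grey";

interface Milestone {
  title: string;
  deadline: Date;
  completed: boolean;
}

// Assumed lead time within which an upcoming deadline turns yellow.
const APPROACHING_MS = 3 * 24 * 60 * 60 * 1000;

function statusColour(m: Milestone, now: Date): StatusColour {
  if (m.completed) return "green";                  // milestone completed
  const remaining = m.deadline.getTime() - now.getTime();
  if (remaining < 0) return "red";                  // incomplete and overdue
  if (remaining <= APPROACHING_MS) return "yellow"; // incomplete, deadline near
  return "grey";                                    // nothing to remark on
}
```

The graded and mandatory markers (star, darker border) would then be drawn on top of this base colour.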
In addition, teachers can define automatic reminders for students about deadlines and performance
reports. These reminders are delivered by email and via the TU Graz TeachCenter notification
system, and students can configure them. Students can also filter and zoom in/out of the timeline,
see a milestone’s quick information when hovering the mouse over it, and see its detailed
information when clicking it. Furthermore, the teacher can configure the system to send a monthly
performance report to the students.
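As an illustration of how these teacher- and student-configurable options compose, the reminder settings could be modelled along the following lines; all field names here are hypothetical and do not reflect the actual TeachCenter schema.

```typescript
// Hypothetical model of the Planner's reminder options; field names
// are illustrative assumptions, not the actual TeachCenter schema.

interface ReminderSettings {
  deadlineRemindersEnabled: boolean; // defined by the teacher per course
  monthlyReportEnabled: boolean;     // teacher-configured performance report
  channels: {                        // delivery, configurable by the student
    email: boolean;
    teachCenterNotification: boolean;
  };
}

// Example: a student keeps deadline reminders but opts out of email delivery.
const settings: ReminderSettings = {
  deadlineRemindersEnabled: true,
  monthlyReportEnabled: true,
  channels: { email: false, teachCenterNotification: true },
};
```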
4. Discussion
The success of any system or product depends on its utility, usability and usefulness. Utility refers to
the system's ability to provide the features the user needs, and usability refers to how easy and pleasing
it is to use these features. Utility and usability often shape the perceived usefulness of a system in a
specific context, affecting the system's acceptance and adoption. Learning analytics systems are no
exception, and it is therefore unsurprising that interest in the human-centred learning analytics
approach is growing [2][12]. HCLA draws from fields such as Human-Computer Interaction (HCI), information
visualisation, Technology Enhanced Learning (TEL), and Learning Experience Design (LXD). Overall,
our experience with the HCLA approach is very positive, and we believe it will drive the success of the
Learner’s Corner dashboard and its tools.
Including educational stakeholders. The work done with students, teachers, and other educational
stakeholders has been rewarding as we investigated their needs and pain points and found solutions
together. However, organising activities such as workshops, focus groups, and interviews is time-
consuming and demanding on resources. Recruiting students was sometimes challenging, as much of
our work occurred during the Covid pandemic. We also realised that most students who participated in
our initiatives were typically high-performing, and engaging low-performing students was almost
impossible.
Dollinger et al. identified barriers that may deter participants from generating ideas or making
comments, such as a lack of knowledge, expertise, or confidence, time constraints, unbalanced power
relations between stakeholders in a group, and ethical and privacy concerns [3]. Therefore, we implemented some
mitigation strategies to overcome these obstacles, especially in the activities conducted with the
students.
Multi-methods and techniques. Various methods and techniques can be used to involve students and
other educational stakeholders in the design process of LA [12][9][10]. Using multiple methods and
techniques allowed us to gain a deeper understanding of the needs and perspectives of the stakeholders,
and it provided higher confidence in our findings as we gathered more robust evidence.
Interdisciplinary team. The design of effective LA technology is a complex problem that goes beyond
addressing technical and pedagogical issues. Therefore, an interdisciplinary team that brings together
researchers and experts from diverse fields is essential to adopting human-centredness in learning
analytics successfully. We believe the team members' different backgrounds and expertise positively affected our
project work. This diversity allowed us to create a shared understanding and find solutions from various
fields of knowledge. Additionally, we believe that interdisciplinary teams help to ensure that technology
is grounded in ethical and socially responsible practices.
Ethics and law. One of our project goals is to create LA technology that can be easily transferred from
a pure research environment to a production environment. Therefore, we consciously designed our LA
tools to support specific values and go beyond the legal requirements. To this end, we considered several
frameworks and codes of practice [7][11][13]. We proposed a normative framework in the form of an
interdisciplinary Criteria Catalog that guided our design choices to build trustworthy LA tools [14]. We
also questioned whether asking for consent before starting to use LA is enough, and explored
offering LA as a service in higher education institutions rather than as an intervention [5].
Community building. It was gratifying to bring people together around the learning analytics topic to
discuss and evaluate benefits and limitations and actively contribute to finding solutions that suit each
stakeholder’s needs. Even though it was not a goal of our project, we think that our activities
contributed to disseminating LA in our institution and laid the groundwork for a LA community.
5. Conclusion
This paper shares our experience in achieving human-centredness in a learning analytics system
through the example of the Planner tool and the Learners’ Corner educational dashboard. We describe
how we involved students, teachers and other educational stakeholders on different levels, e.g.,
identifying users’ needs, designing the interface, and evaluating our LA prototypes. We believe that
adopting the HCLA approach is a step towards the success of any LA system, as it leads to a tailored
system that addresses the end-users' needs. We are privileged to have an interdisciplinary team and
enough project time to conduct these activities, which we think contributed significantly to a very positive
experience with the HCLA approach. However, we recognise that it is demanding on time and
resources.
We realised that HCLA promotes transparency and trust towards LA and LA systems, as the users
actively shape the system. In addition, we operationalised values such as transparency, privacy and
good data governance, autonomy, non-discrimination, respect, responsibility and accountability, and
protection into our LA tools and dashboard design. Finally, we believe that the close contact with students
and teachers helped to build a LA community at our university.
6. Acknowledgements
The work presented here was co-funded by the Federal Ministry of Education, Science
and Research, Austria, as part of the 2019 call for proposals for digital and social transformation in
higher education for the project "Learning Analytics" (2021-2024, partner organisations: the Graz
University of Technology, the University of Vienna, and the University of Graz).
7. References
[1] Britton, B. K.; Tesser, A. Effects of Time-Management Practices on College Grades. Journal of
Educational Psychology 1991, 83 (3), 405–410. https://doi.org/10.1037/0022-0663.83.3.405.
[2] Dimitriadis, Y., Martínez-Maldonado, R., Wiley, K.: Human-Centred Design Principles for
Actionable Learning Analytics. In: Tsiatsos, T., Demetriadis, S., Mikropoulos, A., and Dagdilelis,
V. (eds.) Research on E-Learning and ICT in Education: Technological, Pedagogical and
Instructional Perspectives. pp. 277–296. Springer International Publishing, Cham (2021).
https://doi.org/10.1007/978-3-030-64363-8_15.
[3] Dollinger, M., Liu, D., Arthars, N., & Lodge, J. (2019). Working Together in Learning Analytics
Towards the Co-Creation of Value. Journal of Learning Analytics, 6(2), 10–26.
https://doi.org/10.18608/jla.2019.62.2
[4] Gašević, D., Kovanović, V., Joksimović, S.: Piecing the learning analytics puzzle: a consolidated
model of a field of research and practice. Learning: Research and Practice. 3, 63–78 (2017).
https://doi.org/10.1080/23735082.2017.1286142.
[5] Gosch, N., Andrews, D., Barreiros, C., Leitner, P., Staudegger, E., Ebner, M., Lindstaedt, S.:
Learning Analytics as a Service for Empowered Learners: From Data Subjects to Controllers. In:
LAK21: 11th International Learning Analytics and Knowledge Conference. pp. 475–481.
Association for Computing Machinery, New York, NY, USA (2021).
https://doi.org/10.1145/3448139.3448186.
[6] Hevner, A. R. (2007). A Three Cycle View of Design Science Research. Scandinavian Journal of
Information Systems, 19:87–92.
[7] Kay, D., Korn, N., and Oppenheim, C.: Legal, Risk and Ethical Aspects of Analytics in Higher
Education. CETIS (2012).
[8] Pintrich, P. R. (2000). Chapter 14—The Role of Goal Orientation in Self-Regulated Learning. In
M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of Self-Regulation (pp. 451–502).
Academic Press. https://doi.org/10.1016/B978-012109890-2/50043-3
[9] Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Dirndorfer Anderson, T. (2018). Co-designing
learning analytics tools with learners. In J. M. Lodge, J. Cooney Horvath, & L. Corrin (Eds.),
Learning Analytics in the Classroom: Translating Learning Analytics Research for Teachers (1st
ed., pp. 93-110). Taylor & Francis. https://doi.org/10.4324/9781351113038-7
[10] Sanders, E.B.-N., Stappers, P.J.: Co-creation and the new landscapes of design. CoDesign. 4, 5–
18 (2008). https://doi.org/10.1080/15710880701875068.
[11] Sclater, N., Bailey, P.: Code of practice for learning analytics (2018).
[12] Shum, S.B., Ferguson, R., Martinez-Maldonado, R.: Human-Centred Learning Analytics. Journal
of Learning Analytics. 6, 1–9 (2019). https://doi.org/10.18608/jla.2019.62.1.
[13] Slade, S., Prinsloo, P.: Learning Analytics - ethical issues and dilemmas. American Behavioral
Scientist 57, 1509–1528 (2013). https://doi.org/10.1177/0002764213479366.
[14] Veljanova, H., Barreiros, C., Gosch, N., Staudegger, E., Ebner, M., Lindstaedt, S.: Towards
Trustworthy Learning Analytics Applications: An Interdisciplinary Approach Using the Example
of Learning Diaries. In: HCI International 2022 Posters: 24th International Conference on Human-
Computer Interaction, HCII 2022, Virtual Event, June 26–July 1, 2022, Proceedings, Part III, 138-
145. Cham: Springer International Publishing (2022).
[15] Zimmerman, B.J.: Self-Regulated Learning and Academic Achievement: An Overview.
Educational Psychologist. 25, 3–17 (1990). https://doi.org/10.1207/s15326985ep2501_2.
[16] Zimmerman, B., Schunk, D. (2001). Self-Regulated Learning and Academic Achievement:
Theoretical Perspectives. Mahwah, NJ: Lawrence Erlbaum Associates.
[17] Zimmerman, B.J.: Self-Regulated Learning: Theories, Measures, and Outcomes. In: Wright, J.D.
(ed.) International Encyclopedia of the Social & Behavioral Sciences (Second Edition). pp. 541–
546. Elsevier, Oxford (2015). https://doi.org/10.1016/B978-0-08-097086-8.26060-1.
[18] Zimmerman, B.J.: Chapter 2 - Attaining Self-Regulation: A Social Cognitive Perspective. In:
Boekaerts, M., Pintrich, P.R., and Zeidner, M. (eds.) Handbook of Self-Regulation. pp. 13–39.
Academic Press, San Diego (2000). https://doi.org/10.1016/B978-012109890-2/50031-7.