Shaping learning analytics technology through human-centredness
Carla Barreiros 1,3, Philipp Leitner 2, Martin Ebner 2, and Stefanie Lindstaedt 1,3
1 Institute of Interactive Systems and Data Science. Graz University of Technology, Sandgasse 36, Graz, Austria
2 Department of Educational Technology, Graz University of Technology, Graz, Austria
3 Know-Center GmbH, Sandgasse 36, 4th floor, 8010 Graz, Austria
Abstract

Learning Analytics (LA) researchers and practitioners are increasingly interested in applying human-centred design methods and techniques to the design of LA technology. This approach finds solutions by involving the perspectives of students, teachers, and other educational stakeholders in all steps of the process. It enables the creation of technology that resonates with end-users and is tailored to their needs.
The "Learning Analytics – Students in Focus" project aims to support the learning and teaching
process in the higher education context. Our interdisciplinary team focuses on LA technology
that facilitates acquiring and developing students' self-regulated learning skills, such as goal
setting, planning, monitoring progress, and reflecting. We embraced a Human-Centred
Learning Analytics (HCLA) approach since the start of our project, and it helped us to
understand students' points of view and needs and find solutions together.
This article summarises the design process of a LA tool named Planner, which aims to support
students in planning and monitoring coursework. We share our experience with various
methods and techniques applied in our research and present insights about the benefits and
limitations of the HCLA approach. Finally, we highlight how the HCLA approach helped to
build a LA community at our university and promote trust towards LA.
Keywords: learning analytics (LA), human-centred learning analytics (HCLA), educational dashboards, self-regulated learning
1. Introduction
Education and technology are increasingly intertwined as technological advancements drive digital
transformation across the higher education landscape. Higher education institutions cannot overlook
how learning analytics (LA) provides innovative ways of supporting students, teachers and other
educational stakeholders.
The "Learning Analytics – Students in Focus" project provides an initial holistic and comprehensive
view of LA at Austrian universities, contributing to LA technology's research and development. The
project brought together an interdisciplinary team of LA and pedagogy researchers, TEL practitioners,
data scientists, and ethics and data protection experts from the Graz University of Technology, the
University of Graz, and the University of Vienna. The team investigates LA technology that leverages
students' academic success through the development of self-regulated learning (SRL) skills
[15][16][17][8]. Examples of SRL skills are setting goals for learning, concentrating on instruction,
Proceedings of 4th International Workshop on Human-Centred Learning Analytics (HCLA) co-located with the 13th International Learning
Analytics and Knowledge Conference (LAK2023), Virtual, March 13, 2023.
EMAIL: (A. 1); (A. 2); (A. 3); (A. 4)
ORCID: 0000-0002-2578-3158 (A. 1); 0000-0001-8883-6758 (A. 2); 0000-0001-5789-5296 (A. 3); 0000-0003-3039-2255 (A. 4)
© 2020 Copyright for this paper by its authors.
Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings
using effective strategies to organise ideas, using resources effectively, monitoring performance,
managing time effectively, and holding positive beliefs about one's capabilities [18].
Within the context of the “Learning Analytics – Students in Focus” project, we designed, developed,
and evaluated several learning analytics tools. These learning analytics tools are combined in a course-
level educational dashboard prototype named “Learner’s Corner”. As with any other LA dashboard, we
utilise several information visualisation techniques to present indicators about the student(s), the
learning process, and the learning context. However, the Learner’s Corner is an interactive dashboard,
and the different LA tools allow users to interact with the data by managing, tracking, analysing,
monitoring and displaying key educational metrics. The Learner’s Corner dashboard prototype is
integrated into our institutional Moodle platform, and each tool is a Moodle widget. The Learner’s
Corner dashboard consists of students' and teachers' views. This article focuses on the learning analytics
Planner tool, which aims to support students in planning and monitoring coursework during the
semester. The Planner tool is used to exemplify our research methodology and the application of a
human-centred learning analytics approach (HCLA). HCLA draws from well-established research fields, such as Human-Computer Interaction (HCI) and information visualisation, which offer a rich mix of methodologies and techniques that can be applied in the learning analytics context. We share our
experiences and present insights about the benefits and perceived limitations of the HCLA approach. Finally,
we highlight how the HCLA approach helped to build a LA community at our university and promote
trust towards LA.
2. Research Methodology
The Design Science Research approach guides our research through three cycles of activities, i.e.,
the relevance cycle, the design cycle, and the rigour cycle [6]. See Figure 1. In the relevance cycle, we
identified opportunities and problems related to the students’ SRL skills, and we defined the acceptance
criteria of the research results obtained while evaluating the artefacts. The rigour cycle ensured that the
produced designs were innovative and contributed to the existing knowledge base and theories.
Therefore, it was necessary to investigate pre-existing and applicable research work. During the design
cycle, we developed and evaluated the artefacts using the evaluation criteria defined in the relevance
cycle and the design theories, evaluation methods, and theories gathered in the rigour cycle.
Figure 1: The three design science research cycles: the relevance cycle, the design cycle, and the rigour
cycle. Adapted from Hevner [6].
We utilised various data collection tools in our research, which varied in complexity and targeted different types of information. We used both quantitative and qualitative research methods. However, in this paper, we focus on describing the qualitative research methods used during the
design cycle. These qualitative methods involve an interpretive and naturalistic approach to uncover
needs, opinions, behaviours, attitudes, feelings, and motivations. As depicted in Figure 2, the
qualitative research methods used during the design cycle are:
- the organisation of several focus groups with interdisciplinary experts,
- one co-design workshop with our target end-user, the students,
- two workshops to collect initial feedback, one with students and another with the educational technology experts who support the LMS at our institution,
- interviews with students, students union representatives, and teachers to gather detailed
feedback about the design.
Figure 2: Examples of the qualitative research methods used during the design cycle.
2.1. Focus Groups
We organised one initial focus group targeting interdisciplinary experts. The focus group aimed to
gain insights based on the experts' personal experiences, attitudes and behaviours towards coursework
management, time management, and self-regulated learning skills.
For the experts’ focus group, we invited eight interdisciplinary experts, i.e., three teachers, one ethics
expert, two legal experts, one TEL expert, and one LA researcher who also acted as the group facilitator.
The focus group met online for two hours. Throughout the session, the facilitator encouraged people to
share their points of view on issues such as course organisation and learning activities, workload, course
milestones, pedagogical strategies, communication, students’ performance, and self-regulated learning
skills, including time management. The group worked together to summarise and synthesise the
discussion using a collaborative text document. This artefact was pivotal in defining the LA tools' requirements from a pedagogical perspective and led to the first paper prototype version.
During the project, we organised several follow-up focus groups to further discuss the project use
cases and prototypes. The presence of ethical and legal experts helped to validate the use cases from an
ethical and lawful perspective. In fact, the use case document template was adapted to answer concrete
questions related to matters such as which data will be collected, why it will be collected, who has
access to the data, and where and how long the data will be stored.
The outcome of these focus groups greatly influenced the design of the Learner’s Corner dashboard,
including the Planner tool.
2.2. Workshops
We organised one co-design workshop and two workshops where we performed initial evaluations
of the Learner’s Corner dashboard, including the Planner tool. Next, we describe each workshop's goals,
followed procedures, and primary outcomes.
2.2.1. Students’ Co-design Workshop
This co-design workshop aimed to include the students’ perspectives and generate new ideas for the
Learner’s Corner dashboard, including the Planner tool. The workshop participants were invited
through the TU Graz students' union, which helped reach out to students of different study programs
and levels. Eight students actively participated in the co-design workshop, which took place online and
lasted three hours.
The workshop moderator started by describing the context of the project and the workshop goals and rules, and called for active participation and creativity. The moderator also
presented the agenda, which included one hour dedicated to the Planner tool. We used a video
conferencing system, a cloud-based collaborative design tool, i.e. Figma, and a visual collaboration
whiteboard, i.e., Miro.
The moderator started by showing a low-fidelity prototype of the Planner tool, pointing out the
intended features. The students were invited to use the Miro board by adding stickers with thoughts and
ideas. We provided a simple template for organising contributions around concrete topics, but students
were instructed to use the whiteboard freely. The provided template contained some trigger topics, e.g.,
feedback about the planner milestones, the milestones colour encoding, the automatic notifications, the
interface, “things that I really like”, “things that I do not like”, “new features I would like to see
implemented”. Figure 3 depicts the Miro board used during the session.
Next, we asked the students to use Figma to collaboratively change the planner prototype: to improve the features they appreciated, look for solutions to the problems identified, and even realise some of the newly proposed features. At the end of the workshop, the students said they were happy to collaborate in designing such tools and stressed how important it is to have the chance to
communicate their needs and opinions regarding LA.
This co-design workshop had a substantial effect on the design of the Planner tool. We picked many
ideas and integrated them into the Planner prototype, e.g., adding quick information about a milestone
when the students moved the mouse over it.
Figure 3: Students used a Miro whiteboard during the co-design workshop across several topics.
2.2.2. Time management Workshop
TU Graz organised and promoted an online time management workshop for students. An external
expert presented the content, which included time management strategies and digital tools. The workshop lasted three hours and was attended by seventy-four students.
In collaboration with the expert, we included several activities related to our research in the
workshop, including an online time management questionnaire [1] (short-range planning, time attitudes,
and long-range planning), a presentation of the Planner tool prototype, and a round of discussion to
collect feedback about the Planner tool.
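Subscale-based instruments such as Britton and Tesser's time management questionnaire [1] are typically scored by aggregating the Likert responses of the items assigned to each subscale. The sketch below illustrates this kind of scoring; the item-to-subscale mapping and the response scale are placeholders for illustration, not the published instrument.

```python
# Illustrative scoring sketch for a subscale-based time management
# questionnaire; the item indices and 5-point scale are assumptions,
# not the published Britton and Tesser instrument.
SUBSCALES = {
    "short_range_planning": [0, 1, 2],
    "time_attitudes": [3, 4, 5],
    "long_range_planning": [6, 7, 8],
}

def subscale_scores(responses):
    """Average the Likert responses (e.g., 1-5) belonging to each subscale."""
    return {
        name: sum(responses[i] for i in items) / len(items)
        for name, items in SUBSCALES.items()
    }
```

Averaging rather than summing keeps the subscale scores on the original response scale, which makes them easier to compare across subscales of different lengths.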
The workshop allowed us to learn more about the problems and needs that students experience regarding
time management. Particularly interesting were the first-hand testimonies on how the students organise
their time to manage the coursework, when and how they keep track of the course milestones, and which
digital or analogue tools they use. Most importantly, this workshop allowed an initial evaluation of the students' view of the Planner tool.
2.2.3. TU Graz TeachCenter support team Workshop
TU Graz TeachCenter is the learning management system (Moodle based) of TU Graz, which
students and teachers broadly use. The Learner’s Corner dashboard prototype is integrated into the TU
Graz TeachCenter. Therefore, we thought it appropriate to organise a workshop with the team of the
department of Educational Technology that supports teachers and teaching staff in setting the courses
and making use of all the available applications.
The online workshop lasted two hours and had five participants. The moderator started
by presenting the project and the Learner’s Corner dashboard, including the Planner tool. We used a
collaborative text document to record the discussions and feedback. The participants could interact with
the Learner’s Corner, pose questions, and comment during this explorative process.
In addition to the feedback received about the tools on both the students’ and teachers’ views, we
acquired valuable information about how teachers and teaching staff use the TeachCenter. We also
discussed the importance of ensuring teachers see the benefit of activating the LA tools in their courses.
2.3. Interviews
During the course of our research, we interviewed students, union representatives, teachers, and
other educational stakeholders to gain detailed information, e.g., needs, preferences, and feedback. We
opted for semi-structured interviews, which allowed us to pose open-ended questions about specific
topics, but with the freedom to explore participants' responses. In addition, we used storytelling techniques during our interviews, as we found these stories offered surprising perspectives
and hidden knowledge.
For example, we wanted to investigate when, how, and with which tools students plan their coursework. Instead of posing a direct question, we presented a scenario and used small prompts to
direct the conversation. “Consider that you just attended the first class for a course where the teacher
presented the course's content, organisation and grading schema. What do you usually do? Do you plan
your coursework? If so, tell us how. If not, why not? When do you do it? How do you do it?”. Questions such as this typically led the student to recall past experiences and construct a rich, story-like narrative full of details. Some students even showed artefacts such as calendars, Excel sheets, and hand-written
documents with their plans.
We found these interviews valuable to deepen previously gained knowledge, e.g., through focus
groups and workshops.
3. Learner’s Corner dashboard - Planner tool
The Planner tool provides an overview of the milestones of a course set by the teacher and the
personal milestones set by the student. The Planner tool supports students in planning their coursework
and managing their time. The current version of the Planner tool results from an iterative human-centred
design process developed over three years. Figure 4 depicts the evolution from the initial paper
prototype, to the low-fidelity prototype, to the high-fidelity prototype.
The current version presents the course’s milestones as circles along the semester timeline. At the
top of the timeline are the milestones defined by the teacher, and at the bottom are the personal
milestones defined by the student. All milestones have properties such as a title, date and time, and completion status (completed, incomplete), allowing the students to monitor their progress across the coursework. A legend is presented below the timeline to facilitate understanding of the visual encoding.
The type of milestone is indicated by the letters inside the circle. We use a visual traffic light metaphor to encode the milestones' completion status: a completed milestone is shown in green; an incomplete milestone is shown in yellow if its deadline is approaching, and in red if its deadline has passed. Finally, a milestone is shown in grey if there is nothing to remark on. The student can identify graded milestones (star over the circle) and mandatory milestones (darker circle border).
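The traffic-light encoding amounts to a simple mapping from a milestone's completion status and deadline to a display colour. The following is an illustrative sketch, not the actual Planner implementation; the data model and the threshold for an "approaching" deadline are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Assumed threshold for when a deadline counts as "approaching".
APPROACHING_WINDOW = timedelta(days=3)

@dataclass
class Milestone:
    title: str
    deadline: datetime
    completed: bool
    graded: bool = False       # rendered with a star over the circle
    mandatory: bool = False    # rendered with a darker circle border

def milestone_colour(m: Milestone, now: datetime) -> str:
    """Map a milestone's completion status and deadline to a colour."""
    if m.completed:
        return "green"
    if now > m.deadline:
        return "red"       # incomplete and deadline has passed
    if m.deadline - now <= APPROACHING_WINDOW:
        return "yellow"    # incomplete and deadline approaching
    return "grey"          # nothing to remark on
```

Keeping the mapping in one pure function makes the encoding easy to unit-test and to adjust, e.g., if user feedback suggests a different warning window.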
Figure 4: Evolution from the initial paper prototype (left), to the low-fidelity prototype (middle), to
the high-fidelity prototype (right).
In addition, teachers can define automatic reminders for students about deadlines and performance
reports. These reminders are delivered by email and the TU Graz TeachCenter notification system and
can be configured by the students. Also, students can use filters and zoom in/out of the timeline, see the
milestone’s quick information when moving the mouse over it, and see the detailed information when
clicking it. Finally, the teacher can configure the system to send a monthly performance report to the students.
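Conceptually, selecting which milestones trigger a reminder is a filter over incomplete milestones whose deadline falls within a configurable lead time, followed by delivery over each enabled channel. The sketch below is hypothetical; the function names, default lead time, and channel identifiers are assumptions, not the TeachCenter API.

```python
from datetime import datetime, timedelta

def due_reminders(milestones, now, lead=timedelta(days=1)):
    """Return milestones whose deadline falls within the reminder window,
    skipping those the student has already completed."""
    return [
        m for m in milestones
        if not m["completed"] and now <= m["deadline"] <= now + lead
    ]

def deliver(reminder, channels=("email", "teachcenter")):
    # In the real system, the delivery channels are configurable by the
    # student; here they are fixed placeholder names.
    return [f"{c}: reminder for '{reminder['title']}'" for c in channels]
```

In a deployment, a job of this kind would run periodically, with the lead time and channels read from each student's notification settings.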
4. Discussion
The success of any system or product depends on its utility, usability and usefulness. Utility refers to
the system's ability to provide the features the user needs, and usability refers to how easy and pleasing
it is to use these features. Utility and usability often shape the perceived usefulness of a system in a
specific context, affecting the system's acceptance and adoption. Learning analytics systems are no exception, and the growing interest in the human-centred learning analytics approach is therefore unsurprising [2][12]. HCLA draws from fields such as Human-Computer Interaction (HCI), information
visualisation, Technology Enhanced Learning (TEL), and Learning Experience Design (LXD). Overall,
our experience with the HCLA approach is very positive, and we believe it will drive the success of the
Learner’s Corner dashboard and its tools.
Including educational stakeholders. The work done with students, teachers, and other educational
stakeholders has been rewarding as we investigated their needs and pain points and found solutions
together. However, organising activities such as workshops, focus groups, and interviews is time-
consuming and demanding on resources. Recruiting students was sometimes challenging, as much of
our work occurred during the Covid pandemic. We also realised that most students who participated in our initiatives were typically successful students, and engaging with low-performing students was almost impossible.
Dollinger et al. identified barriers that may deter the generation of ideas or making comments, such as
lack of knowledge, expertise, or confidence, time constraints, unbalanced power relations between
stakeholders in a group, and ethical and privacy concerns [3]. Therefore, we implemented some
mitigation strategies to overcome these obstacles, especially in the activities conducted with the students.
Multi-methods and techniques. Various methods and techniques can be used to involve students and
other educational stakeholders in the design process of LA [12][9][10]. Using multiple methods and
techniques allowed us to gain a deeper understanding of the needs and perspectives of the stakeholders,
and it provided higher confidence in our findings as we gathered more robust evidence.
Interdisciplinary team. The design of effective LA technology is a complex problem that goes beyond
addressing technical and pedagogical issues. Therefore, an interdisciplinary team that brings together researchers and experts from diverse fields is essential to adopt human-centredness in learning analytics
successfully. We believe the team members' different backgrounds and expertise positively affected our
project work. This diversity allowed us to create a shared understanding and find solutions from various
fields of knowledge. Additionally, we believe that interdisciplinary teams help to ensure that technology
is grounded in ethical and socially responsible practices.
Ethics and law. One of our project goals is to create LA technology that can be easily transferred from
a pure research environment to a production environment. Therefore, we consciously designed our LA
tools to support specific values and go beyond the legal requirements. To this end, we considered several
frameworks and codes of practice [7][11][13]. We proposed a normative framework in the form of an
interdisciplinary Criteria Catalog that guided our design choices to build trustworthy LA tools [14]. We
also questioned if asking for consent before starting to use LA is enough and explored LA being offered
as a service in higher education institutions rather than an intervention [5].
Community building. It was gratifying to bring people together around the learning analytics topic to
discuss and evaluate benefits and limitations and actively contribute to finding solutions that suit each
stakeholder’s needs. Even though it was not a goal of our project, we think that our activities contributed to disseminating LA in our institution and laid the groundwork for a LA community.
5. Conclusion
This paper shares our experience in achieving human-centredness in a learning analytics system
through the example of the Planner tool and the Learner’s Corner educational dashboard. We describe
how we involved students, teachers and other educational stakeholders on different levels, e.g.,
identifying users’ needs, designing the interface, and evaluating our LA prototypes. We believe that
adopting the HCLA approach is a step towards the success of any LA system, as it leads to a tailored
system that addresses the end-users' needs. We are privileged with an interdisciplinary team and enough
project time to conduct these activities, which we think contributed significantly to a very positive
experience with the HCLA approach. However, we recognise that it is demanding on time and resources.
We realised that HCLA promotes transparency and trust towards LA and LA systems, as the users
actively shape the system. In addition, we operationalised into our LA tools and dashboard design
values such as transparency, privacy and good data governance, autonomy, non-discrimination, respect,
responsibility and accountability, and protection. Finally, we believe that the close contact with students
and teachers helped to build a LA community at our university.
6. Acknowledgements
The developed work presented here was co-funded by the Federal Ministry of Education, Science
and Research, Austria, as part of the 2019 call for proposals for digital and social transformation in
higher education for the project "Learning Analytics" (2021-2024, partner organisations: the Graz
University of Technology, the University of Vienna, and the University of Graz).
7. References
[1] Britton, B. K.; Tesser, A. Effects of Time-Management Practices on College Grades. Journal of Educational Psychology 1991, 83 (3), 405–410.
[2] Dimitriadis, Y., Martínez-Maldonado, R., Wiley, K.: Human-Centred Design Principles for Actionable Learning Analytics. In: Tsiatsos, T., Demetriadis, S., Mikropoulos, A., and Dagdilelis, V. (eds.) Research on E-Learning and ICT in Education: Technological, Pedagogical and Instructional Perspectives. pp. 277–296. Springer International Publishing, Cham (2021).
[3] Dollinger, M., Liu, D., Arthars, N., & Lodge, J. (2019). Working Together in Learning Analytics Towards the Co-Creation of Value. Journal of Learning Analytics, 6(2), 10–26.
[4] Gašević, D., Kovanović, V., Joksimović, S.: Piecing the learning analytics puzzle: a consolidated model of a field of research and practice. Learning: Research and Practice. 3, 63–78 (2017).
[5] Gosch, N., Andrews, D., Barreiros, C., Leitner, P., Staudegger, E., Ebner, M., Lindstaedt, S.: Learning Analytics as a Service for Empowered Learners: From Data Subjects to Controllers. In: LAK21: 11th International Learning Analytics and Knowledge Conference. pp. 475–481. Association for Computing Machinery, New York, NY, USA (2021).
[6] Hevner, A. R. (2007). A Three Cycle View of Design Science Research. Scandinavian Journal of Information Systems, 19(2), 87–92.
[7] Kay, D., Korn, N., and Oppenheim, C.: Legal, Risk and Ethical Aspects of Analytics in Higher Education. CETIS (2012).
[8] Pintrich, P. R. (2000). Chapter 14 - The Role of Goal Orientation in Self-Regulated Learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of Self-Regulation (pp. 451–502). Academic Press.
[9] Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Dirndorfer Anderson, T. (2018). Co-designing learning analytics tools with learners. In J. M. Lodge, J. Cooney Horvath, & L. Corrin (Eds.), Learning Analytics in the Classroom: Translating Learning Analytics Research for Teachers (1st ed., pp. 93–110). Taylor & Francis.
[10] Sanders, E.B.-N., Stappers, P.J.: Co-creation and the new landscapes of design. CoDesign. 4, 5–18 (2008).
[11] Sclater, N., Bailey, P.: Code of practice for learning analytics (2018).
[12] Shum, S.B., Ferguson, R., Martinez-Maldonado, R.: Human-Centred Learning Analytics. Journal of Learning Analytics. 6, 1–9 (2019).
[13] Slade, S., Prinsloo, P.: Learning Analytics - ethical issues and dilemmas. American Behavioral Scientist 57, 1509–1528 (2013).
[14] Veljanova, H., Barreiros, C., Gosch, N., Staudegger, E., Ebner, M., Lindstaedt, S.: Towards Trustworthy Learning Analytics Applications: An Interdisciplinary Approach Using the Example of Learning Diaries. In: HCI International 2022 Posters: 24th International Conference on Human-Computer Interaction, HCII 2022, Virtual Event, June 26 – July 1, 2022, Proceedings, Part III, pp. 138–145. Cham: Springer International Publishing (2022).
[15] Zimmerman, B.J.: Self-Regulated Learning and Academic Achievement: An Overview. Educational Psychologist. 25, 3–17 (1990).
[16] Zimmerman, B., Schunk, D. (2001). Self-Regulated Learning and Academic Achievement: Theoretical Perspectives. Mahwah, NJ: Lawrence Erlbaum Associates.
[17] Zimmerman, B.J.: Self-Regulated Learning: Theories, Measures, and Outcomes. In: Wright, J.D. (ed.) International Encyclopedia of the Social & Behavioral Sciences (Second Edition). pp. 541–546. Elsevier, Oxford (2015).
[18] Zimmerman, B.J.: Chapter 2 - Attaining Self-Regulation: A Social Cognitive Perspective. In: Boekaerts, M., Pintrich, P.R., and Zeidner, M. (eds.) Handbook of Self-Regulation. pp. 13–39. Academic Press, San Diego (2000).
ResearchGate has not been able to resolve any citations for this publication.
Conference Paper
Full-text available
As Learning Analytics (LA) in the higher education setting increasingly transitions from a field of research to an implemented matter of fact of the learner's experience, the demand of practical guidelines to support its development is rising. LA Policies bring together different perspectives, like the ethical and legal dimensions, into frameworks to guide the way. Usually the first time learners get in touch with LA is at the act of consenting to the LA tool. Utilising an ethical (TRUESSEC) and a legal framework (GDPR), we question whether sincere consent is possible in the higher education setting. Drawing upon this premise, we then show how it might be possible to recognise the autonomy of the learner by providing LA as a service, rather than an intervention. This could indicate a paradigm shift towards the learner as empowered demander. At last, we show how this might be incorporated within the GDPR by also recognising the demand of the higher education institutions to use the learner's data at the same time. These considerations will in the future influence the development of our own LA policy: a LA criteria catalogue.
Full-text available
The field of learning analytics was founded with the goal to harness vast amounts of data about learning collected by the extensive use of technology. After the early formation, the field has now entered the next phase of maturation with a growing community who has an evident impact on research, practice, policy, and decision-making. Although learning analytics is a bricolage field borrowing from many related other disciplines, there is still no systematized model that shows how these different disciplines are pieced together. Existing models and frameworks of learning analytics are valuable in identifying elements and processes of learning analytics, but they insufficiently elaborate on the links with foundational disciplines. With this in mind, this paper proposes a consolidated model of the field of research and practice that is composed of three mutually connected dimensions – theory, design, and data science. The paper defines why and how each of the three dimensions along with their mutual relations is critical for research and practice of learning analytics. Finally, the paper stresses the importance of multi-perspective approaches to learning analytics based on its three core dimensions for a healthy development of the field and a sustainable impact on research and practice.
Full-text available
Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organisation which champions the use of digital technologies in UK education and research, has attempted to address this with the development of a Code of Practice for Learning Analytics. The Code covers the main issues institutions need to address in order to progress ethically and in compliance with the law. This paper outlines the extensive research and consultation activities which have been carried out to produce a document which covers the concerns of institutions and, critically, the students they serve. The resulting model for developing a code of practice includes a literature review, setting up appropriate governance structures, developing a taxonomy of the issues, drafting the code, consulting widely with stakeholders, publication, dissemination, and embedding it in institutions.
Full-text available
The field of learning analytics has the potential to enable higher education institutions to increase their understanding of their students’ learning needs and to use that understanding to positively influence student learning and progression. Analysis of data relating to students and their engagement with their learning is the foundation of this process. There is an inherent assumption linked to learning analytics that knowledge of a learner’s behavior is advantageous for the individual, instructor, and educational provider. It seems intuitively obvious that a greater understanding of a student cohort and the learning designs and interventions they best respond to would benefit students and, in turn, the institution’s retention and success rate. Yet collection of data and their use face a number of ethical challenges, including location and interpretation of data; informed consent, privacy, and deidentification of data; and classification and management of data. Approaches taken to understand the opportunities and ethical challenges of learning analytics necessarily depend on many ideological assumptions and epistemologies. This article proposes a sociocritical perspective on the use of learning analytics. Such an approach highlights the role of power, the impact of surveillance, the need for transparency, and an acknowledgment that student identity is a transient, temporal, and context-bound construct. Each of these affects the scope and definition of learning analytics’ ethical use. We propose six principles as a framework for considerations to guide higher education institutions to address ethical issues in learning analytics and challenges in context-dependent and appropriate ways.
Designing for effective and efficient pedagogical interventions and orchestration in complex technology-enhanced learning (TEL) ecosystems is an increasingly challenging issue. Learning analytics (LA) solutions are very promising for purposes of understanding and optimizing learning and the environments in which it occurs. Moreover, LA solutions may contribute to an improved evidence-based Teacher Inquiry into Student Learning. However, it is still unclear how LA can be designed to position teachers as designers of effective interventions and orchestration actions. This chapter argues for human-centered design (HCD) and orchestration of actionable learning analytics, and it proposes three HCD principles for LA solutions: agentic positioning of teachers and other stakeholders, integration of the learning design cycle and the LA design process, and reliance on educational theories to guide the LA solution design and implementation. The HCD principles are illustrated and discussed through two case studies in authentic learning contexts. This chapter aims to help move the research community forward in the design and implementation of Human-Centred Learning Analytics solutions for complex technology-enhanced learning ecosystems.
The value of technology lies not only with the service or functionality of the tool, but also with its subsequent value to the people who use it. New learning analytics (LA) software and platforms for capturing data and improving student learning are frequently introduced; however, they suffer from issues of adoption and continued usage by stakeholders. Scholars have previously suggested that it is not enough to introduce stakeholders (e.g., teachers and students) to LA technologies; they must also be a part of the LA creation and design process. In this paper, we continue the ongoing work to clarify and compare different approaches to human-centred design through an overview of participatory frameworks in LA (co-design, co-creation). We also present a case study of an LA tool developed and used within several Australian universities over six years, as an example of how LA designers can co-create dynamic platforms with teachers. Implications of participatory design frameworks in LA are also presented through a discussion of the costs, challenges, and benefits of adopting human-centred design.
The design of effective learning analytics extends beyond sound technical and pedagogical principles. If these analytics are to be adopted and used successfully to support learning and teaching, their design process needs to take into account a range of human factors, including why and how they will be used. In this editorial, we introduce principles of human-centred design developed in other, related fields that can be adopted and adapted to support the development of Human-Centred Learning Analytics (HCLA). We draw on the papers in this special section, together with the wider literature, to define human-centred design in the field of learning analytics and to identify the benefits and challenges that this approach offers. We conclude by suggesting that HCLA will enable the community to achieve more impact, more quickly, with tools that are fit for purpose and a pleasure to use.
Self-regulated learning refers to how students become masters of their own learning processes. Neither a mental ability nor a performance skill, self-regulation is instead the self-directive process through which learners transform their mental abilities into task-related skills in diverse areas of functioning, such as academia, sports, music, and health. This article will define self-regulated learning and describe the intellectual context in which the construct emerged, changes in researchers' emphasis over time as well as current emphases, methodological issues related to the construct, and directions for future research.
There is considerable agreement about the importance of self-regulation to human survival. There is disagreement about how it can be analyzed and defined in a scientifically useful way. A social cognitive perspective differs markedly from theoretical traditions that seek to define self-regulation as a singular internal state, trait, or stage that is genetically endowed or personally discovered. Instead, it is defined in terms of context-specific processes that are used cyclically to achieve personal goals. These processes entail more than metacognitive knowledge and skill; they also include affective and behavioral processes, and a resilient sense of self-efficacy to control them. The cyclical interdependence of these processes, reactions, and beliefs is described in terms of three sequential phases: forethought, performance or volitional control, and self-reflection. An important feature of this cyclical model is that it can explain dysfunctions in self-regulation, as well as exemplary achievements. Dysfunctions occur because of the unfortunate reliance on reactive methods of self-regulation instead of proactive methods, which can profoundly change the course of cyclical learning and performance. An essential issue confronting all theories of self-regulation is how this capability or capacity can be developed or optimized. Social cognitive views place particular emphasis on the role of socializing agents in the development of self-regulation, such as parents, teachers, coaches, and peers. At an early age, children become aware of the value of social modeling experiences, and they rely heavily on them when acquiring needed skills.