Thesis

The Dashboard that Loved Me: Designing adaptive learning analytics for self-regulated learning


Abstract

Learning dashboards are learning analytics (LA) tools built to make learners aware of their learning performance and behaviour and to support self-reflection. However, most existing dashboards follow a “one size fits all” philosophy, disregarding individual differences between learners, e.g. differences that stem from diverse cultural backgrounds, different motivations for learning or different levels of self-regulated learning (SRL) skills. In this thesis, we challenge the assumption that impactful learning analytics should be limited to making learners aware of their learning; rather, it should encourage and support learners in taking action and changing their learning behaviour. We therefore take a learner-centred approach and explore what information learners choose to receive on learning analytics dashboards and how this choice relates to their learning motivation and their SRL skills. We also investigate how dashboard designs support learners in making sense of the displayed information, and how learner goals and level of SRL skills influence what learners find relevant on such interfaces. Large-scale experiments conducted with both higher education students and MOOC learners provide empirical evidence on how aligning the design of learning analytics dashboards with learners’ intentions, learning motivation and level of self-regulated learning skills influences the uptake and impact of such tools. The outcomes of these studies are synthesised in eleven recommendations for learning analytics dashboard design, grouped according to the phase of the dashboard life-cycle to which they apply: (i) methodological aspects to be considered before designing dashboards, (ii) design requirements to be considered during the design phase and (iii) support offered to learners after the dashboard has been rolled out.


... However, despite the importance of students' expectations for the design and implementation of LA in practice, they have so far not been well explored across countries, with some exceptions (e.g., Schumacher & Ifenthaler, 2018; Hilliger et al., 2020; West et al., 2020; Tsai et al., 2020; Whitelock-Wainwright et al., 2021). Moreover, researchers stress that students' engagement in the design of LA services has hitherto been largely low (Viberg et al., 2018; Jivet, 2021). All this may have underpinned a slow adoption of LA in practice worldwide, including in Sweden, a highly digitalized country (European Commission, 2019), in which there have hitherto been very scarce, largely small-scale attempts to implement LA (Nouri et al., 2019). ...
... Also, scholars distinguish between ideal expectations (i.e., representing stakeholders' wanted outcomes) and predicted or realistic ones (i.e., unveiling what a person realistically expects the service is most likely to be; David et al., 2004; Dowling & Rickwood, 2016). In the setting of LA, stakeholders' ideal and predicted expectations have recently been compared (e.g., Whitelock-Wainwright et al., 2019, 2021; Hilliger et al., 2020; Kollom et al., 2021). Differentiating between them allows researchers and practitioners to better understand what students realistically expect from LA services (e.g., in terms of the functionality of the system and potential privacy concerns), whilst also being attentive to what students prefer (Whitelock-Wainwright et al., 2021). ...
... Whereas most studies examining students' expectations toward LA have approached their samples as homogeneous student populations, scholars posit that we cannot assume that all students have similar expectations (see e.g., Jivet, 2021; Schumacher & Ifenthaler, 2018; Teasley, 2017). Their expectations may differ based on their level of self-regulation in their studies and their level of motivation, as well as their cultural values (e.g., Tsiligiris et al., 2021). ...
Article
Full-text available
In order to successfully implement learning analytics (LA), we need a better understanding of student expectations of such services. Yet, there is still a limited body of research about students' expectations across countries. Student expectations of LA have been predominantly examined from a view that perceives students as a group of individuals representing homogeneous views. This study examines students' ideal expectations (i.e., representing their wanted outcomes) and predicted expectations (i.e., unveiling what they realistically expect the LA service is most likely to be) of LA by employing a person-centered approach that allows exploring the heterogeneity that may be found in student expectations. We collected data from 132 students in the setting of Swedish higher education by means of an online survey. Descriptive statistics and Latent Class Analysis (LCA) were used for the analysis. Our findings show that students' ideal expectations of LA were considerably higher than their predicted expectations. The results of the LCA show that the Swedish students' expectations of LA were heterogeneous, both regarding their privacy concerns and their expectations of LA services. The findings of this study can be seen as a baseline of students' expectations, or a cross-sectional average, and can be used to inform student-centered implementation of LA in higher education.
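To make the person-centered analysis described above concrete, the sketch below illustrates the general workflow on synthetic data: compare ideal versus predicted ratings, then group respondents into latent classes selected by BIC. Note that proper LCA operates on categorical indicators (e.g., via the poLCA package in R); the Gaussian mixture here is only a stand-in, and the item counts and variable names are illustrative assumptions, not the study's instrument.

```python
# Minimal sketch of a person-centred grouping of survey responses, assuming a
# ratings matrix where rows are students and columns are Likert items split
# into ideal vs. predicted expectations. A Gaussian mixture stands in for LCA,
# which would normally use categorical indicators.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_students, n_items = 132, 12                  # sample size mirrors the study
ratings = rng.integers(1, 8, size=(n_students, n_items)).astype(float)

ideal, predicted = ratings[:, :6], ratings[:, 6:]
print("mean ideal vs. predicted:", ideal.mean(), predicted.mean())

# Select the number of latent classes by BIC, then assign students to classes.
best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(ratings) for k in range(1, 6)),
    key=lambda m: m.bic(ratings),
)
print("class sizes:", np.bincount(best.predict(ratings)))
```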
... There are many reasons behind the slow adoption of LA and data-driven decision-making processes in educational settings, especially in K-12 education, including challenges related to data interoperability (Dodero et al., 2017; Samuelsen, Chen, & Wasson, 2019), ethics and privacy concerns (Beerwinkle, 2021; Livingstone, 2020; Viberg, Andersson et al., 2021), development of stakeholders' data literacy (Ifenthaler et al., 2020) and feedback literacy skills (Jivet, 2021), as well as a general lack of participatory approaches that take into account the needs and preferences of students and teachers (even fewer actually engage them directly) in the LA design process (Buckingham Shum, Ferguson, & Martinez-Maldonado, 2019; Jivet, 2021). We should not forget that LA is about supporting learning, not just reporting it (Gasevic, Dawson, & Siemens, 2015). ...
... Taking the teacher's perspective, in today's society such goals should focus not only on the improvement of students' subject knowledge, but also on the development of their critical 21st-century skills, including collaborative and self-regulated learning skills that are directly associated with academic performance, especially in online learning settings (e.g., Viberg, Khalil, & Baars, 2020). Moreover, such goals may be directed towards the development and improvement of students' data-, feedback- and digital-literacy skills that are crucial for their successful navigation and study success in online learning settings (see, e.g., Ifenthaler et al., 2020; Jivet, 2021). ...
Preprint
Full-text available
Learning analytics (LA) is argued to be able to improve learning outcomes, learner support and teaching. However, despite the ever-expanding amount of student (digital) data accessible from various online education and learning platforms, the growing interest in LA worldwide and considerable research efforts already made, there is still little empirical evidence of impact on practice that shows the effectiveness of LA in education settings. Based on a selection of theoretical and empirical research, this chapter provides a critical discussion of the possibilities of collecting and using student data, as well as the barriers and challenges to overcome in providing data-informed support to educators' everyday teaching practices. We argue that in order to increase the impact of data-driven decision-making aimed at students' improved learning in education at scale, we need to better understand educators' needs, their teaching practices and the context in which these practices occur, and how to support them in developing relevant knowledge, strategies and skills to facilitate the data-informed process of digitalization of education. Source: https://arxiv.org/abs/2105.06680 & a published version at https://www.routledge.com/Online-Learning-Analytics/Liebowitz/p/book/9781032047775
... According to Wise [18], learners need four principles of pedagogical learning analytics for useful interpretation of data and productive use of LA: integration, agency, reference frame, and dialogue. As far as the reference frame is concerned, Jivet [19] identified three types of reference frames: (i) social, i.e., comparison with other peers; (ii) achievement, i.e., comparison in terms of goal achievement; and (iii) progress, i.e., comparison with an earlier self. The three reference frames are also classified by the planned time of comparison. ...
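For illustration only, the taxonomy above maps naturally onto a small data structure; the field names and comparison targets below paraphrase the cited classification and are not an API from any of the cited works.

```python
# Illustrative encoding of the three reference frames described above.
from dataclasses import dataclass

@dataclass(frozen=True)
class ReferenceFrame:
    name: str
    compares_with: str       # who or what the learner is compared against
    time_of_comparison: str  # when the comparison point lies

REFERENCE_FRAMES = [
    ReferenceFrame("social", "other peers", "present"),
    ReferenceFrame("achievement", "a set goal", "future"),
    ReferenceFrame("progress", "an earlier self", "past"),
]

for frame in REFERENCE_FRAMES:
    print(f"{frame.name}: comparison with {frame.compares_with} ({frame.time_of_comparison})")
```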
Article
Full-text available
Research has shown the effectiveness of designing a Learning Analytics Dashboard (LAD) for learners and instructors, including everyone’s levels of progress and performance. An intertwined relationship exists between learning analytics (LA) and the learning process. Understanding information or data about learners and their learning journey can contribute to a deeper understanding of learners and the learning process. The design of an effective learning dashboard relies heavily on LA, including assessment of the learning process, i.e., gains and losses. A Learning Loss Recovery Dashboard (LLRD) can be designed as an instructional tool to support the learning process as well as learners’ performance and academic achievement. The current project proposes an LLRD prototype model to deal with potential learning loss; increase the achievement of learning outcomes; and provide a single, comprehensive learning process, where schools can evaluate and remedy any potential learning loss resulting from the distance-learning period caused by the COVID-19 pandemic. The dashboard prototype is designed to determine the learning gains of K–12 learners. It is expected that the implementation of the proposed dashboard would provide students, teachers and educational administrators with an integrated portal for a holistic and unified remedial experience in addressing learning loss.
Thesis
Full-text available
In this work, we attempt to answer the question: "How to learn robust and interpretable rule-based models from data for machine learning and data mining, and define their optimality?". Rules provide a simple form of storing and sharing information about the world. As humans, we use rules every day, such as the physician that diagnoses someone with flu, represented by "if a person has either a fever or sore throat (among others), then she has the flu.". Even though an individual rule can only describe simple events, several aggregated rules can represent more complex scenarios, such as the complete set of diagnostic rules employed by a physician. The use of rules spans many fields in computer science, and in this dissertation, we focus on rule-based models for machine learning and data mining. Machine learning focuses on learning the model that best predicts future (previously unseen) events from historical data. Data mining aims to find interesting patterns in the available data. To answer our question, we use the Minimum Description Length (MDL) principle, which allows us to define the statistical optimality of rule-based models. Furthermore, we empirically show that this formulation is highly competitive for real-world problems.
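As a rough illustration of the MDL principle invoked above, the sketch below scores a rule-based model by a two-part code length, L(model) + L(data | model); a smaller total means the model compresses the data better. The encodings used in the dissertation are more refined; the cost functions here are simplified assumptions for illustration.

```python
# Toy two-part MDL score for a rule list: total = L(model) + L(data | model).
import math

def model_length(n_rules: int, n_conditions: int, n_features: int) -> float:
    """Crude cost of describing the rules: each condition selects a feature."""
    return n_rules + n_conditions * math.log2(max(n_features, 2))

def data_length(errors: int, n_points: int) -> float:
    """Cost of encoding the exceptions the model misclassifies (entropy-based)."""
    if errors in (0, n_points):
        return 0.0
    p = errors / n_points
    return n_points * -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def mdl_score(n_rules, n_conditions, n_features, errors, n_points) -> float:
    return model_length(n_rules, n_conditions, n_features) + data_length(errors, n_points)

# A compact rule list with a few errors can beat a larger, error-free one.
print(mdl_score(n_rules=3, n_conditions=5, n_features=20, errors=8, n_points=200))
print(mdl_score(n_rules=15, n_conditions=40, n_features=20, errors=0, n_points=200))
```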
Preprint
Full-text available
Fairness is a critical system-level objective in recommender systems that has been the subject of extensive recent research. It is especially important in multi-sided recommendation platforms, where it may be crucial to optimize utilities not just for the end user but also for other actors, such as item sellers or producers, who desire a fair representation of their items. Existing solutions do not properly address various aspects of multi-sided fairness in recommendations, as they either take a one-sided view (i.e., improving fairness only for one side) or do not appropriately measure fairness for each actor involved in the system. In this thesis, I first investigate the impact of unfair recommendations on the system and how they can negatively affect major actors in the system. I then propose solutions to tackle the unfairness of recommendations. I propose a rating transformation technique that works as a pre-processing step before building the recommendation model, to alleviate the inherent popularity bias in the input data and consequently mitigate the exposure unfairness for items and suppliers in the recommendation lists. As another solution, I propose a general graph-based method that works as a post-processing approach after recommendation generation, mitigating multi-sided exposure bias in the recommendation results. For evaluation, I introduce several metrics for measuring exposure fairness for items and suppliers, and show that these metrics better capture the fairness properties in the recommendation results. I perform extensive experiments to evaluate the effectiveness of the proposed solutions. The experiments on different publicly available datasets and comparisons with various baselines confirm the superiority of the proposed solutions in improving exposure fairness for items and suppliers.
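The exposure metrics mentioned above are not specified in this abstract, but a common position-discounted formulation can illustrate the idea: the exposure an item receives at rank r in a top-k list is discounted by 1/log2(r+1), and supplier exposure aggregates over the supplier's items. The sketch below uses this formulation as an assumption; the thesis's exact definitions may differ.

```python
# Sketch: position-discounted exposure per supplier across users' top-k lists.
import math
from collections import defaultdict

def supplier_exposure(rec_lists, item_to_supplier):
    """rec_lists: {user: [item ids in rank order]}; returns exposure per supplier."""
    exposure = defaultdict(float)
    for items in rec_lists.values():
        for rank, item in enumerate(items, start=1):
            exposure[item_to_supplier[item]] += 1.0 / math.log2(rank + 1)
    return dict(exposure)

recs = {"u1": ["i1", "i2", "i3"], "u2": ["i1", "i4", "i2"]}
suppliers = {"i1": "s1", "i2": "s1", "i3": "s2", "i4": "s3"}

exposure = supplier_exposure(recs, suppliers)
total = sum(exposure.values())
# Simple fairness summary: each supplier's share of total exposure vs. an equal share.
print({s: round(v / total, 2) for s, v in exposure.items()})
```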
Article
Full-text available
Society is becoming increasingly reliant on data, making it necessary to ensure that all citizens are equipped with the skills needed to be data literate. We argue that the foundations for a data literate society begin by acquiring key data literacy competences in school. However, as yet there is no clear definition of what these should be. This paper explores the different perspectives currently offered on both data and statistical literacy and then critically examines to what extent these address the data literacy needs of citizens in today’s society. We survey existing approaches to teaching data literacy in schools to identify how data literacy is interpreted in practice. Based on these analyses, we propose a definition of data literacy that is focused on using data to understand real world phenomena. The contribution of this paper is the creation of a common foundation for teaching and learning data literacy skills.
Article
Full-text available
Self-regulation of learning (SRL) positively affects achievement and motivation. Teachers are therefore expected to foster students’ SRL by providing them with strategies. However, two preconditions have to be met: teachers need to diagnose their students’ SRL in order to make instructional decisions about promoting it, and to do so they need knowledge about SRL to know what to diagnose. To date, little research has investigated teachers’ knowledge about SRL and its assessment. Thus, the aim of this study was to identify teachers’ conceptions about SRL, to investigate their ideas about how to diagnose their students’ SRL, and to test the relationships between the two. To this end, we developed two systematic coding schemes to analyze the conceptions about SRL and the ideas about assessing SRL in the classroom among a sample of 205 teachers. The coding schemes for teachers’ open answers were developed based on models of SRL, were extended by deriving codes from the empirical data, and produced satisfactory interrater reliability (conceptions about SRL: κ = 0.85, SE = 0.03; ideas about assessing SRL: κ = 0.63, SE = 0.05). The results showed that many teachers did not refer to any regulation procedure at all and described SRL mainly as student autonomy and self-directedness. Only a few teachers had a comprehensive conception of the entire SRL cycle. We identified three patterns in teachers’ conceptualizations of SRL: a motivation-oriented, an autonomy-oriented, and a regulation-oriented conceptualization. Regarding teachers’ ideas about assessing their students’ SRL, teachers mainly focused on cues that are not diagnostic of SRL. Yet, many teachers knew about portfolios as a way to document SRL among students. Finally, our results suggest that teachers’ ideas about assessing SRL partly varied as a function of their SRL concept: teachers with an autonomy-oriented conceptualization of SRL were more likely to use cues that are not diagnostic of SRL, such as unsystematic observation or off-task behavior. The results provide insights into teachers’ conceptions of SRL and its assessment. Implications for future research in the field of SRL are drawn, in particular about how to support teachers in diagnosing and fostering SRL among their students.
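The κ values reported above come from comparing two raters' codes for the same answers. A minimal sketch of that interrater reliability check, with hypothetical codes standing in for the study's coding scheme:

```python
# Interrater reliability for a coding scheme: two raters, one code per answer.
# The codes below are hypothetical stand-ins for the study's categories.
from sklearn.metrics import cohen_kappa_score

rater_a = ["motivation", "autonomy", "regulation", "autonomy", "regulation"]
rater_b = ["motivation", "autonomy", "regulation", "motivation", "regulation"]

print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")
```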
Chapter
Full-text available
This paper presents the results of a study carried out as part of the design-based development of an online self-assessment for prospective students in higher online education. The self-assessment consists of a set of tests, predictive of completion, and is meant to improve informed decision making prior to enrolment. The rationale is that better decision making will help to address the ongoing concern of non-completion in higher online education. A prototypical design of the self-assessment was created based on an extensive literature review and correlational research aimed at investigating validity evidence concerning the predictive value of the tests. The present study focused on investigating validity evidence regarding the content of the self-assessment (including the feedback it provides) from a user perspective. Results from a survey among prospective students (N = 66) indicated that the predictive validity and content validity of the self-assessment are somewhat at odds: three of the five tests included in the current prototype were considered relevant by prospective students. Moreover, students rated eleven additionally suggested tests, currently not included, as relevant to their study decision. Expectations regarding the feedback to be provided in connection with the tests include an explanation of the measurement and advice for further preparation. A comparison of the obtained scores to a reference group (i.e., other test-takers or successful students) is not expected. Implications for further development and evaluation of the self-assessment are discussed.
Article
Full-text available
Unequal stakeholder engagement is a common pitfall of learning analytics adoption approaches in higher education, leading to lower buy-in and flawed tools that fail to meet the needs of their target groups. With each design decision, we make assumptions about how learners will make sense of the visualisations, but we know very little about how students make sense of dashboards and which aspects influence their sense-making. We investigated how learner goals and self-regulated learning (SRL) skills influence dashboard sense-making following a mixed-methods research methodology: a qualitative pre-study followed up by an extensive quantitative study with 247 university students. We uncovered three latent variables for sense-making: transparency of design, reference frames and support for action. SRL skills are predictors of how relevant students find these constructs. Learner goals have a significant effect only on the perceived relevance of reference frames. Knowing which factors influence students' sense-making will lead to more inclusive and flexible designs that cater to the needs of both novice and expert learners.
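The two analysis steps implied above (extracting latent sense-making constructs from survey items, then testing whether SRL skill predicts their perceived relevance) can be sketched as follows on synthetic data; the item counts and variable names are assumptions, not the study's instrument.

```python
# Sketch: (1) extract latent constructs from survey items via factor analysis,
# (2) regress each factor's relevance on an SRL skill score. Data synthetic.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 247                                  # sample size mirrors the study
items = rng.normal(size=(n, 15))         # survey items on dashboard sense-making
srl = rng.normal(size=(n, 1))            # SRL skill score per student

fa = FactorAnalysis(n_components=3, random_state=1).fit(items)
factors = fa.transform(items)            # e.g., transparency, reference frames, support

for i in range(3):
    slope = LinearRegression().fit(srl, factors[:, i]).coef_[0]
    print(f"factor {i}: slope for SRL = {slope:.3f}")
```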
Technical Report
Full-text available
In order to reduce the spread of COVID-19, most countries around the world have decided to temporarily close educational institutions. However, learning has not stopped but is now fully taking place online as schools and universities provide remote schooling. Using existing literature and evidence from recent international data (Eurostat, PISA, ICILS, PIRLS, TALIS), this report attempts to gain a better understanding of how the COVID-19 crisis may affect students’ learning. It looks at the different direct and indirect ways through which the virus and the measures adopted to contain it may impact children’s achievement. ‘Conservative’ estimates for a few selected EU countries consistently indicate that, on average, students will suffer a learning loss. It is also suggested that COVID-19 will not affect students equally, will influence negatively both cognitive and non-cognitive skills acquisition, and may have important long-term consequences in addition to the short-term ones.
Conference Paper
Full-text available
As Learning Analytics (LA) moves from theory into practice, researchers have called for increased participation of stakeholders in design processes. The implementation of such methods, however, still requires attention and specification. In this report, we share strategies and insights from a co-design process that involved university students in the development of a LA tool. We describe the participatory design workshops and highlight three strategies for engaging students in the co-design of learning analytics tools.
Conference Paper
Full-text available
This paper describes the design and evaluation of personalized visualizations to support young learners’ Self-Regulated Learning (SRL) in Adaptive Learning Technologies (ALTs). Our learning path app combines three Personalized Visualizations (PV) that are designed as an external reference to support learners’ internal regulation process. The personalized visualizations rest on three pillars: grounding in SRL theory, the usage of trace data, and the provision of clear, actionable recommendations for learners to improve regulation. This quasi-experimental pre-posttest study finds that learners in the personalized visualization condition improved the regulation of their practice behavior, as indicated by higher accuracy and less complex moment-by-moment learning curves compared to learners in the control group. Learners in the PV condition also showed better transfer of learning. Finally, students in the personalized visualizations condition were more likely to underestimate rather than overestimate their performance. Overall, these findings indicate that the personalized visualizations improved regulation of practice behavior and transfer of learning, and changed the bias in relative monitoring accuracy.
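The shift from over- to underestimation reported above is a question of monitoring bias, commonly computed in calibration research as judged minus actual performance; the values below are synthetic.

```python
# Monitoring bias: judged minus actual performance; negative = underestimation.
import numpy as np

judged = np.array([0.6, 0.7, 0.5, 0.8])   # learners' self-estimates
actual = np.array([0.7, 0.75, 0.6, 0.8])  # measured performance

print("mean bias:", (judged - actual).mean())  # < 0: learners underestimate
```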
Article
Full-text available
Increasingly, learning analytics (LA) has begun utilising staff- and student-facing dashboards with visualisations that present data to support student success and improve learning and teaching. The use of LA is complex and multifaceted and raises many issues for consideration, including ethical and legal challenges, competing stakeholder views and implementation decisions. It is widely acknowledged that LA development requires input from various stakeholders. This conceptual article explores the LA literature to determine how student perspectives are positioned as dashboards and visualisations are developed. While the sector acknowledges the central role of students, as demonstrated here, much of the literature reflects an academic, teacher-centric or institutional view. This view reflects some of the key ethical concerns related to informed consent and the role of power, translating to a somewhat paternalistic approach to students. We suggest that, as students are the primary stakeholders, they should be consulted in the development and application of LA. An ethical approach to LA requires that we engage with our students in their learning and in the systems and information that support that process, rather than assuming we know what students want, what their concerns are or how they would like data presented.
Conference Paper
Full-text available
Activating learners’ deeper thinking mechanisms and reflective judgement (i.e., metacognition) improves learning performance. This study exploits visual analytics to promote metacognition and delivers task-related visualizations to provide on-demand feedback. The goal is to broaden current knowledge on the patterns of on-demand metacognitive feedback usage with respect to learners’ performance. The results from a between-group and within-group study (N=174) revealed statistically significant differences in feedback usage patterns between the performance-based learner clusters. Foremost, the findings showed that learners who consistently request task-related metacognitive feedback and allocate considerable amounts of time to processing it are more likely to handle task complexity, cope with conflicting tasks and achieve high scores. These findings contribute to considering task-related visual analytics as a metacognitive feedback format that facilitates learners’ on-task engagement and data-driven sense-making and increases their awareness of the tasks’ requirements. Implications of the approach are also discussed.
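One plausible reading of the analysis above is to group learners by performance and compare their feedback usage per group; the sketch below does this on synthetic data and is not the study's actual pipeline.

```python
# Sketch: cluster learners on performance, then compare mean feedback usage
# (requests and time on feedback) per cluster. Data synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
scores = rng.normal(size=174)              # N mirrors the study
usage = rng.normal(size=(174, 2))          # [feedback requests, time on feedback]

clusters = KMeans(n_clusters=3, n_init=10, random_state=4).fit_predict(scores.reshape(-1, 1))
for c in range(3):
    print(c, "mean usage:", usage[clusters == c].mean(axis=0).round(2))
```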
Article
Full-text available
Purpose The analysis of data collected from user interactions with educational and information technology has attracted much attention as a promising approach to advancing our understanding of the learning process. This promise motivated the emergence of the field of learning analytics and supported the education sector in moving toward data-informed strategic decision making. Yet, progress to date in embedding such data-informed processes has been limited. The purpose of this paper is to address a commonly posed question asked by educators, managers, administrators and researchers seeking to implement learning analytics: how do we start institutional adoption of learning analytics? Design/methodology/approach A narrative review is performed to synthesize the existing literature on learning analytics adoption in higher education. The synthesis is based on established models for the adoption of business analytics and the findings of two projects performed in Australia and Europe to develop and evaluate approaches to the adoption of learning analytics in higher education. Findings The paper first defines learning analytics and touches on lessons learned from some well-known case studies. The paper then reviews the current state of institutional adoption of learning analytics by examining evidence produced in several studies conducted worldwide. The paper next outlines an approach to learning analytics adoption that could aid system-wide institutional transformation. The approach also highlights critical challenges that require close attention in order for learning analytics to make a long-term impact on the research and practice of learning and teaching. Originality/value The paper proposes an approach that can be used by senior leaders, practitioners and researchers interested in the adoption of learning analytics in higher education. The proposed approach highlights the importance of the socio-technical nature of learning analytics and the complexities pertinent to innovation adoption in higher education institutions.
Article
Full-text available
Purpose The purpose of this paper is to take a student-centred perspective to understanding the range of ways that students respond to receiving information about their learning behaviours presented on a dashboard. It identifies four principles to inform the design of dashboards that support learner agency and empowerment, features which Prinsloo and Slade (2016) suggest are central to the ethical adoption of learning analytics. Design/methodology/approach The study involved semi-structured interviews with 24 final-year undergraduates to explore the students’ responses to receiving dashboards that showed the students’ achievement and other learning behaviours. Findings The paper identifies four principles that should be used when designing and adopting learner dashboards to support student agency and empowerment. Research limitations/implications The study was based on a small sample of final-year undergraduate students from one academic school. The data are based on students’ self-reporting. Practical implications The paper suggests that these four principles are guiding tenets for the design and implementation of learner dashboards in higher education. The four principles are that dashboards: are customisable by students; foreground students’ sense making; enable students to identify actionable insights; and are embedded into educational processes. Originality/value The paper’s originality is that it illuminates student-centred principles of learner dashboard design and adoption.
Article
Full-text available
This paper presents a systematic literature review of learning analytics dashboards (LADs) research that reports empirical findings to assess the impact on learning and teaching. Several previous literature reviews identified self-regulated learning as a primary focus of LADs. However, there has been much less understanding of how learning analytics are grounded in the literature on self-regulated learning and how self-regulated learning is supported. To address this limitation, this review analyzed the existing empirical studies on LADs based on the well-known model of self-regulated learning proposed by Winne and Hadwin. The results show that existing LADs i) are rarely grounded in learning theory; ii) cannot be suggested to support metacognition; iii) do not offer any information about effective learning tactics and strategies; and iv) have significant limitations in how their evaluation is conducted and reported. Based on the findings of the study and through a synthesis of the literature, the paper proposes that future research and development should not make any a priori design decisions about the representation of data and analytic results in learning analytics systems such as LADs. To formalize this proposal, the paper defines the model for user-centered learning analytics systems (MULAS). MULAS consists of four dimensions that are cyclically and recursively inter-connected: theory, design, feedback, and evaluation.
Conference Paper
Full-text available
Learning Analytics Dashboards (LADs) are becoming an increasingly popular way to provide students with personalised feedback. Despite the number of LADs being developed, significant research gaps exist around the student perspective, especially how students make sense of the graphics provided in LADs and how they intend to act on the feedback provided therein. This study employed a randomized controlled trial to examine students' sense-making of LADs showing four different frames of reference, and to what extent the impact of LADs was mediated by baseline self-regulation. Using a mix of quantitative and qualitative data analysis, the results revealed rather distinct patterns in students' sense-making across the four LADs. These patterns involved the intersection of visual salience and planned learning actions. Collectively, however, a consistent theme emerged across all four LADs around students' planned learning actions, classified as time and study environment management. A key finding of the study is that the use of LADs as a primary feedback process should be personalized and include training and support to aid students' sense-making.
Conference Paper
Full-text available
Learning Analytics (LA) studies the learning process in order to optimize learning opportunities for students. Although LA has quickly risen to prominence, questions remain regarding the impact LA has made to date. To evaluate the extent to which LA has impacted our understanding of learning and produced insights that have been translated to mainstream practice or contributed to theory, we reviewed the research published in the 2011-2018 LAK conferences and the Journal of Learning Analytics. The reviewed studies were coded according to five dimensions: study focus, data types, purpose, institutional setting, and scale of research and implementation. The coding and subsequent epistemic network analysis indicate that while LA research has developed in its areas of focus and sophistication of analyses, the impact on practice, theory and frameworks has been limited. We hypothesize that this finding is due to a continuing predominance of small-scale, techno-centric exploratory studies that to date have not fully accounted for the multi-disciplinarity that comprises education. For the field to reach its potential in understanding and optimizing learning and learning environments, there must be a purposeful shift from exploratory models to more holistic and integrative systems-level research. This necessitates greater effort applied to understanding the research cycles that emerge when multiple knowledge domains coalesce into new fields of research.
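Epistemic network analysis of coded studies is, at base, built on how often codes co-occur within the same unit; the sketch below shows that co-occurrence counting step with illustrative codes, not the review's actual coding.

```python
# Co-occurrence counting underlying epistemic-network-style summaries:
# how often pairs of coding dimensions appear together in the same study.
from collections import Counter
from itertools import combinations

studies = [  # illustrative code sets, one per reviewed study
    {"focus:behaviour", "data:clickstream", "scale:small"},
    {"focus:prediction", "data:clickstream", "scale:small"},
    {"focus:behaviour", "data:discourse", "scale:institutional"},
]

cooccurrence = Counter()
for codes in studies:
    cooccurrence.update(combinations(sorted(codes), 2))

print(cooccurrence.most_common(3))
```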
Article
Full-text available
This paper introduces a learning analytics policy and strategy framework developed by a cross-European research project team -- SHEILA (Supporting Higher Education to Integrate Learning Analytics), based on interviews with 78 senior managers from 51 European higher education institutions across 16 countries. The framework was developed adapting the RAPID Outcome Mapping Approach (ROMA), which is designed to develop effective strategies and evidence-based policy in complex environments. This paper presents four case studies to illustrate the development process of the SHEILA framework and how it can be used iteratively to inform strategic planning and policy processes in real world environments, particularly for large-scale implementation in higher education contexts. To this end, the selected cases were analyzed at two stages, each a year apart, to investigate the progression of adoption approaches that were followed to solve existing challenges, and identify new challenges that could be addressed by following the SHEILA framework.
Article
Full-text available
The massive and open nature of MOOCs contributes to attracting a great diversity of learners. However, the learners who enroll in these types of courses have trouble achieving their course objectives. One reason for this is that they do not adequately self-regulate their learning. In this context, there are few tools to support these strategies in online learning environments. Also, the lack of metrics to evaluate the impact of the proposed tools makes it difficult to identify the key features of such tools. In this paper, we present the process of designing NoteMyProgress, a web application that complements a MOOC platform and supports self-regulated learning strategies. For designing NoteMyProgress we followed the Design-Based Research methodology. For the evaluation of the tool, we conducted two case studies using a beta version of NoteMyProgress over three MOOCs offered in Coursera. The findings of these two case studies are presented as a set of lessons learned that inform: (1) a list of requirements for the design of a second version of the tool; (2) a list of requirements that could serve as a reference for other developers designing new tools that support self-regulated learning in MOOCs.
Article
Full-text available
Depending on their motivational dispositions, students choose different learning strategies and vary in their persistence in reaching learning outcomes. As learning is more and more facilitated through technology, analytics approaches allow learning processes and environments to be analyzed and optimized. However, research on motivation and learning analytics is at an early stage. Thus, the purpose of this quantitative survey study is to investigate the relation between students’ motivational dispositions and the support they perceive through learning analytics. Findings indicate that facets of students’ goal orientations and academic self-concept impact students’ expectations of the support from learning analytics. The findings emphasize the need to design highly personalized and adaptable learning analytics systems that consider students’ dispositions and needs. The present study is a first attempt at linking empirical evidence, motivational theory, and learning analytics.
Article
Full-text available
Learning analytics can improve learning practice by transforming the ways we support learning processes. This study is based on the analysis of 252 papers on learning analytics in higher education published between 2012 and 2018. The main research question is: what is the current scientific knowledge about the application of learning analytics in higher education? The focus is on research approaches, methods and the evidence for learning analytics. The evidence was examined in relation to four earlier validated propositions: whether learning analytics i) improve learning outcomes, ii) support learning and teaching, iii) are deployed widely, and iv) are used ethically. The results demonstrate that overall there is little evidence of improvements in students’ learning outcomes (9%) or in learning support and teaching (35%). Similarly, little evidence was found for the third (6%) and the fourth (18%) propositions. Although the identified potential for improving learner practice is high, we cannot currently see much transfer of this potential into higher education practice over the years. However, the analysis of the existing evidence for learning analytics indicates a shift towards a deeper understanding of students’ learning experiences in recent years.
Conference Paper
Full-text available
Dashboards are graphical interfaces that manipulate and present data about students’ learning behaviours (attendance, visits to the library, attainment etc.). Although only a few UK HEIs have developed a dashboard for students, most other UK HEIs aspire to develop their use (Sclater 2014). Hence it is timely and significant to understand the ways that students respond to seeing data presented to them in the form of a dashboard. This paper discusses and conceptualises the findings from a small-scale study funded by the Society for Research into Higher Education. The study involved twenty-four final-year undergraduate students in a single faculty in a UK university and focussed on the ways that students interpret and respond to seeing data about their learning presented via a dashboard. Sutton’s (2012) three pillars of feedback literacy (knowing, becoming and acting) were employed to understand the potential of dashboards for supporting students’ motivation towards their learning. The paper suggests that, similar to feedback literacy, there is a type of literacy associated with dashboards that has components of knowing, becoming and acting, and that employing these concepts helps us to understand how students respond to dashboards. By identifying students' engagement with dashboards as a literacy practice rather than a technical skill or understanding, the paper argues that we need to focus on students' growing identity, which is embedded into a sense of being and is individually experienced and constructed. Hence the notion of dashboard literacy suggests that institutions need to work with students to develop the personal and reflective processes that shape how dashboards are interpreted. The paper provides evidence that students may be motivated by seeing their data presented in a dashboard format and that this can lead to changes in behaviour which are likely to lead to improved student outcomes and attainment. It also illustrates how students’ engagement with dashboards is highly individual and dependent on their personal disposition and orientation to learning. Hence their use needs to be treated cautiously, recognising the power that these tools have to shape students' well-being alongside their potential benefits.
Article
Full-text available
Student feedback literacy denotes the understandings, capacities and dispositions needed to make sense of information and use it to enhance work or learning strategies. In this conceptual paper, student responses to feedback are reviewed and a number of barriers to student uptake of feedback are discussed. Four inter-related features are proposed as a framework underpinning students’ feedback literacy: appreciating feedback; making judgments; managing affect; and taking action. Two well-established learning activities, peer feedback and analysing exemplars, are discussed to illustrate how this framework can be operationalized. Some ways in which these two enabling activities can be re-focused more explicitly towards developing students’ feedback literacy are elaborated. Teachers are identified as playing important facilitating roles in promoting student feedback literacy through curriculum design, guidance and coaching. The implications and conclusion summarise recommendations for teaching and set out an agenda for further research.
Conference Paper
Full-text available
Data science is now impacting the education sector, with a growing number of commercial products and research prototypes providing learning dashboards. From a human-centred computing perspective, the end-user's interpretation of these visualisations is a critical challenge to design for, with empirical evidence already showing that `usable' visualisations are not necessarily effective from a learning perspective. Since an educator's interpretation of visualised data is essentially the construction of a narrative about student progress, we draw on the growing body of work on Data Storytelling (DS) as the inspiration for a set of enhancements that could be applied to data visualisations to improve their communicative power. We present a pilot study that explores the effectiveness of these DS elements based on educators' responses to paper prototypes. The dual purpose is understanding the contribution of each visual element for data storytelling, and the effectiveness of the enhancements when combined.
Conference Paper
Full-text available
This paper aims to link student facing Learning Analytics Dashboards (LADs) to the corpus of research on Open Learner Models (OLMs), as both have similar goals. We conducted a systematic review of literature on OLMs and compared the results with a previously conducted review of LADs for learners in terms of (i) data use and modelling, (ii) key publication venues, (iii) authors and articles, (iv) key themes, and (v) system evaluation. We highlight the similarities and differences between the research on LADs and OLMs. Our key contribution is a bridge between these two areas as a foundation for building upon the strengths of each. We report the following key results from the review: in reports of new OLMs, almost 60% are based on a single type of data; 33% use behavioral metrics; 39% support input from the user; 37% have complex models; and just 6% involve multiple applications. Key associated themes include intelligent tutoring systems, learning analytics, and self-regulated learning. Notably, compared with LADs, OLM research is more likely to be interactive (81% of papers compared with 31% for LADs), report evaluations (76% versus 59%), use assessment data (100% versus 37%), provide a comparison standard for students (52% versus 38%), but less likely to use behavioral metrics, or resource use data (33% against 75% for LADs). In OLM work, there was a heightened focus on learner control and access to their own data.
Conference Paper
Full-text available
Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built with the purpose of empowering teachers and learners to make informed decisions about the learning process. Related work has investigated learning dashboards, yet none have explored the theoretical foundation that should inform the design and evaluation of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our analysis revealed that very few dashboard evaluations take into account the educational concepts that were used as a theoretical foundation for their design. Furthermore, we report findings suggesting that comparison with peers, a common reference frame for contextualising information on learning analytics dashboards, was not perceived positively by all learners. We summarise the insights gathered through our literature review in a set of recommendations for the design and evaluation of learning analytics dashboards for learners.
Conference Paper
Full-text available
In order to further the field of learning analytics (LA), researchers and experts may need to look beyond themselves and their own perspectives and expertise to innovate LA platforms and interventions. We suggest that by co-creating with the users of LA, such as educators and students, researchers and experts can improve usability and usefulness and draw greater understanding from LA interventions. Within this article we discuss current LA issues and barriers and how co-creation strategies can help address many of these challenges. We further outline the considerations, both before and during interventions, that support and foster a co-created strategy for learning analytics interventions.
Conference Paper
Full-text available
While learning analytics (LA) is maturing from being a trend to being part of the institutional toolbox, the need for more empirical evidence about the effects of LA on the actual stakeholders, i.e. learners and teachers, is increasing. Within this paper we report on a further evaluation iteration of the Evaluation Framework for Learning Analytics (EFLA), which provides an efficient and effective measure for gaining insights into the application of LA in educational institutes. For this empirical study we developed and implemented several LA widgets in a MOOC platform’s dashboard and evaluated these widgets using the EFLA, as well as the framework itself using principal component and reliability analysis. The results show that the EFLA is able to measure differences between widget versions. Furthermore, they indicate that the framework is highly reliable after slightly adapting its dimensions.
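The principal component and reliability analysis mentioned above can be sketched as a PCA over questionnaire items plus an internal-consistency (Cronbach's alpha) check per dimension. The item grouping and data below are synthetic, not the actual EFLA instrument.

```python
# Sketch: PCA over questionnaire items plus Cronbach's alpha per dimension.
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items for one dimension."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
base = rng.normal(size=(120, 1))
dimension_items = base + 0.5 * rng.normal(size=(120, 4))  # 4 correlated items

pca = PCA(n_components=2).fit(dimension_items)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("alpha:", round(cronbach_alpha(dimension_items), 2))
```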
Article
Full-text available
Higher education institutions are developing the capacity for learning analytics. However, the technical development of learning analytics has far exceeded the consideration of the ethical issues around it. We examined higher education academics’ knowledge, attitudes, and concerns about the use of learning analytics through four focus groups (N = 35). Thematic analysis of the focus group transcripts identified five key themes. The first theme, ‘Facilitating learning’, represents academics’ perceptions that, while currently unrealized, there could be several benefits to learning analytics that would help their students. Three themes, ‘Where are the ethics?’, ‘What about the students!’, and ‘What about me!’, represented academics’ perceptions of how learning analytics could pose considerable difficulties within a higher education context. A final theme, ‘Let’s move forward together’, reflected that despite some challenges and concerns about learning analytics, academics perceived scope for learning analytics to be beneficial if there is collaboration between academics, students, and the university. The findings highlight the need to include academics in the development of learning analytics policies and procedures to promote the suitability and widespread adoption of learning analytics in the higher education sector.
Conference Paper
Full-text available
It has long been argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state of the art in the field of learning dashboards, none have addressed the theoretical foundation that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the contexts in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
Article
This paper is a response to the manuscript entitled “Student perceptions of privacy principles for learning analytics” (Ifenthaler and Schumacher, Educational Technology Research and Development, 64(5), 923–938, 2016) from a practice perspective. Learning analytics (the use of data science methods to generate actionable educational insights) has great potential to impact learning practices during the shift to digital. In particular, learning analytics can help fill a critical information gap for students created by the absence of classroom-based cues and the need for increased self-regulation in the online environment. However, the adoption of learning analytics in effective, ethical and responsible ways is non-trivial. Ifenthaler and Schumacher (2016) present important findings about students’ perceptions of learning analytics’ usefulness and privacy, signaling the need for a student-centered paradigm, but stop short of addressing its implications for the creation and adoption of learning analytics tools. In this paper we address this limitation by describing three specific shifts needed in current learning analytics practice for analytics to be accepted by and effective for students: (1) involve students in the creation of the analytic tools meant to serve them; (2) develop analytics that are contextualized, explainable and configurable; and (3) empower students’ agency in using analytic tools as part of their larger process of learning. These shifts are currently in different stages of maturity and adoption in mainstream learning analytics practice. The primary implication of this work is a call to action for researchers and practitioners to rethink and reshape how students participate in the creation, interpretation and impact of learning analytics.
Article
This response to Neil Selwyn’s paper, ‘What’s the problem with learning analytics?’, relates his work to the ethical challenges associated with learning analytics and proposes six ethical challenges for the field.
Chapter
Empowering learners and teachers to take control of the indicator design process can increase value and drive forward the acceptance and adoption of learning analytics (LA) systems. In this paper, we present the Human-Centered Indicator Design (HCID) approach as a theory-driven framework to guide the systematic and effective design of LA indicators that truly meet user needs. With human needs at the forefront, the aim of HCID is to enable a shift from an ad hoc, data-first to a systematic, people-first approach to indicator design. As a proof of concept, we present a case of applying the HCID approach to indicator design in a higher education context. The case demonstrates that HCID could be a viable approach to design useful LA indicators for and with their users, informed by design practices from the human-computer interaction (HCI) and information visualization fields.
Chapter
Learning Analytics (LA) dashboards aggregate indicators about student performance and demographics to support academic advising. The majority of existing dashboards are targeted at advisors and professors, but not much attention has been put into students’ need for information for their own academic decision-making. In this study, we identify relevant indicators from a student perspective using a mixed methods approach. Qualitative data was obtained from an open-ended online questionnaire answered by 31 student representatives, and quantitative data was collected from a closed-ended online questionnaire answered by 652 students from different cohorts. Findings point out relevant indicators to help students choose what courses to take in an upcoming academic period. Since this study is part of a large research project that has motivated the adoption of academic advising dashboards in different Latin American universities, these findings were also contrasted with indicators of these advising dashboards, informing future developments targeting students.
Conference Paper
Learning analytics dashboards are at the core of the LAK vision to involve the human in the decision-making process. The key focus of these dashboards is to support better human sense-making and decision-making by visualising data about learners to a variety of stakeholders. Early research on learning analytics dashboards focused on the use of visualisation and prediction techniques and demonstrates the rich potential of dashboards in a variety of learning settings. Present research increasingly uses participatory design methods to tailor dashboards to the needs of stakeholders, employs multimodal data acquisition techniques, and starts to research the theoretical underpinnings of dashboards. In this paper, we present these past and present research efforts as well as the results of the VISLA19 workshop on "Visual approaches to Learning Analytics" that was held at LAK19 with experts in the domain to identify and articulate common practices and challenges for the domain. Based on an analysis of the results, we present a research agenda to help shape the future of learning analytics dashboards.
Article
The design of effective learning analytics extends beyond sound technical and pedagogical principles. If these analytics are to be adopted and used successfully to support learning and teaching, their design process needs to take into account a range of human factors, including why and how they will be used. In this editorial, we introduce principles of human-centred design developed in other, related fields that can be adopted and adapted to support the development of Human-Centred Learning Analytics (HCLA). We draw on the papers in this special section, together with the wider literature, to define human-centred design in the field of learning analytics and to identify the benefits and challenges that this approach offers. We conclude by suggesting that HCLA will enable the community to achieve more impact, more quickly, with tools that are fit for purpose and a pleasure to use.
Conference Paper
Learning Analytics Dashboards (LADs) are predicated on the notion that access to more academic information can help students regulate their academic behaviors. But what is the association between information-seeking preferences and help-seeking practices among college students? If given access to more information, what might college students do with it? We investigated these questions in a series of two studies. Study 1 validates a measure of information-seeking preferences, the Motivated Information-Seeking Questionnaire (MISQ), using a college student sample drawn from across the country (n = 551). In Study 2, we used the MISQ to measure college students' (n = 210) performance-avoid (i.e., avoiding seeming incompetent in relation to one's peers) and performance-approach (i.e., wishing to outperform one's peers) information-seeking preferences, their help-seeking behaviors, and their ability to comprehend line graphs and bar graphs, two common graph types in LADs. Results point to a negative relationship between graph comprehension and help-seeking strategies such as attending office hours, emailing one's professor for help, or visiting a study center, even after controlling for academic performance and demographic characteristics. This suggests that students more capable of reading graphs might not seek help when needed. Further results suggest a positive relationship between performance-approach information-seeking preferences and how often students compare themselves to their peers. This study contributes to our understanding of the motivational implications of academic data visualizations in academic settings and increases our knowledge of the way students interpret visualizations. It uncovers tensions between what students want to see and what might be more motivationally appropriate for them to see. Importantly, the MISQ and the graph comprehension measure can be used in future studies to better understand the role of students' information-seeking tendencies in their interpretation of the various kinds of feedback present in LADs.
Conference Paper
Current Learning Analytics (LA) systems are primarily designed with university staff members as the target audience; very few are aimed at students, and almost none have been developed with direct student involvement or have undergone a comprehensive evaluation. This paper describes a HEFCE-funded project that employed a variety of methods to engage students in the design, development and evaluation of a student-facing LA dashboard. LA was integrated into the delivery of four undergraduate modules with 169 student sign-ups. The design of the dashboard takes a novel approach: it seeks to understand the reasons why students want to study at university and maps their engagement and predicted outcomes to these motivations, with weekly personalised notifications and feedback. Students are also given the choice of how to visualise the data, either via a chart-based view or as a representation of themselves. A mixed-methods evaluation showed that students' trust in the dependability of the underlying analytics and data is variable. However, students were mostly positive about the usability and interface design of the system, and almost all students who signed up interacted with their LA. The majority of students could see how the LA system could support their learning and said that it would influence their behaviour; in some cases, this had a direct impact on their levels of engagement. The main contribution of this paper is the transparent documentation of a User-Centred Design approach that has produced forms of LA representation, recommendation and interaction design that go beyond those used in current similar systems and have been shown to motivate students and impact their learning behaviour.
Conference Paper
Instead of measuring success in Massive Open Online Courses (MOOCs) by certification and completion rates, researchers have recently started to define success with alternative metrics, for example by evaluating the intention-behavior gap and goal achievement. Self-regulated and goal-oriented learning in particular have been identified as critical skills for success in low-guidance online learning environments such as MOOCs, but technical support for them is rare. This paper therefore examines the current technical capabilities and limitations of goal-oriented learning in MOOCs. An observational study was conducted to explore how well learners in five MOOCs achieved their initial learning objectives, and the results are compared with those of similar studies. A concept focused on technical feasibility and automation then outlines how personalized learning objectives can be supported and implemented on a MOOC platform.
Article
Massive Open Online Courses (MOOCs) allow learning to take place anytime and anywhere with little external monitoring by teachers. Characteristically, the highly diverse groups of learners enrolled in MOOCs are required to make decisions about their own learning activities to achieve academic success. It is therefore important to support self-regulated learning (SRL) strategies and to adapt to relevant human factors (e.g., gender, cognitive abilities, prior knowledge). SRL supports have been widely investigated in traditional classroom settings, but little is known about how SRL can be supported in MOOCs, and to date very few experimental studies have been conducted there. To fill this gap, this paper presents a systematic review of studies on approaches to support SRL in multiple types of online learning environments and how they address human factors. The 35 studies reviewed show that human factors play an important role in the efficacy of SRL supports. Future studies can use learning analytics to understand learners at a fine-grained level and provide support that best fits individual learners. The objective of the paper is twofold: (a) to inform researchers, designers and teachers about the state of the art of SRL support in online learning environments and MOOCs; (b) to provide suggestions for adaptive self-regulated learning support.
Article
Technological advancements have generated strong interest in exploring learner behavior data through learning analytics to provide both learners and instructors with process-oriented feedback in the form of dashboards. However, little is known about the typology of dashboard feedback relevant for different learning goals, learners and teachers. While most dashboards, and the feedback they give, are based only on learner performance indicators, research shows that effective feedback also needs to be grounded in the regulatory mechanisms underlying learning processes and in an awareness of the learner’s learning goals. The design artefact presented in this article uses a conceptual model that visualizes the relationships between dashboard design and the learning sciences to provide cognitive and behavioral process-oriented feedback to learners and teachers in support of the regulation of learning. A practical case example demonstrates how the ideas presented in the paper can be deployed in the context of a learning dashboard; it uses several analytics and visualization techniques based on empirical evidence from earlier research that successfully tested these techniques in various learning contexts.
Conference Paper
As learning analytics (LA) systems become more common, teachers and students are often required to not only make sense of the user interface (UI) elements of a system, but also to make meaning that is pedagogically appropriate to the learning context. However, we suggest that the dominant way of thinking about the relationship between representation and meaning results in an overemphasis on the UI, and that re-thinking this relationship is necessary to create systems that can facilitate deeper meaning making. We propose a conceptual view as a basis for discussion among the LA and HCI communities around a different way of thinking about meaning making, specifically that it should be explicit in the design process, provoking greater consideration of system level elements such as algorithms, data structures and information flow. We illustrate the application of the conceptualisation with two cases of LA design in the areas of Writing Analytics and Multi-modal Dashboards.
Article
Big data in education offers unprecedented opportunities to support learners and advance research in the learning sciences. Analysis of observed behaviour using computational methods can uncover patterns that reflect theoretically established processes, such as those involved in self-regulated learning (SRL). This research addresses the question of how to integrate this bottom-up approach of mining behavioural patterns with the traditional top-down approach of using validated self-reporting instruments. Using process mining, we extracted interaction sequences from fine-grained behavioural traces for 3458 learners across three Massive Open Online Courses. We identified six distinct interaction sequence patterns. We matched each interaction sequence pattern with one or more theory-based SRL strategies and identified three clusters of learners. First, Comprehensive Learners, who follow the sequential structure of the course materials, which sets them up for gaining a deeper understanding of the content. Second, Targeting Learners, who strategically engage with specific course content that will help them pass the assessments. Third, Sampling Learners, who exhibit more erratic and less goal-oriented behaviour, report lower SRL, and underperform relative to both Comprehensive and Targeting Learners. Challenges that arise in the process of extracting theory-based patterns from observed behaviour are discussed, including analytic issues and limitations of available trace data from learning platforms. Link: https://authors.elsevier.com/a/1W59V2f~UW0yDj
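To make the pattern-mining step above concrete, here is a minimal, purely illustrative Python sketch: it encodes each learner's ordered trace of platform events as a vector of action-to-action transition frequencies and clusters learners, standing in for the study's process-mining pipeline. The event vocabulary, the toy traces and the choice of k-means are assumptions made for the example, not the authors' actual method.

# Hypothetical sketch of mining interaction-sequence patterns from MOOC
# trace data and clustering learners; event names, data shape and k-means
# are illustrative assumptions, not the study's actual pipeline.
from collections import Counter
from itertools import product

import numpy as np
from sklearn.cluster import KMeans

ACTIONS = ["video", "reading", "quiz", "forum"]   # assumed event vocabulary
PAIRS = list(product(ACTIONS, repeat=2))          # all action-to-action transitions

def transition_vector(trace):
    """Turn one learner's ordered event list into a normalised
    transition-frequency vector (a simple stand-in for process mining)."""
    counts = Counter(zip(trace, trace[1:]))
    total = max(sum(counts.values()), 1)
    return [counts[p] / total for p in PAIRS]

# Toy traces for three learners; real data would come from platform logs.
traces = {
    "learner_a": ["video", "reading", "quiz", "video", "reading", "quiz"],
    "learner_b": ["quiz", "quiz", "video", "quiz", "quiz", "quiz"],
    "learner_c": ["forum", "video", "quiz", "reading", "forum", "video"],
}

X = np.array([transition_vector(t) for t in traces.values()])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for learner, label in zip(traces, labels):
    print(learner, "-> cluster", label)

In the study itself, clusters like these were then matched against theory-based SRL strategies; the sketch only shows the bottom-up half of that integration.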
Article
To scale student success, institutions may want to consider treating students more as partners, not just as customers or intervention recipients. One way to do so is sharing behavioral and academic feedback data that helps nudge students into taking responsibility for learning. The following chapter is drawn from the author's dissertation work (Fritz, 2016).
Article
This article is a comprehensive literature review of student-facing learning analytics reporting systems, i.e., systems that track learning analytics data and report it directly to students. The review builds on four previously conducted literature reviews in similar domains. Of the 945 articles retrieved from databases and journals, 93 were included in the analysis. Articles were coded into five categories: functionality, data sources, design analysis, student perceptions, and measured effects. Based on this review, we need research on learning analytics reporting systems that targets the design and development process of these systems, not only the final products. This process includes needs analyses, visual design analyses, information selection justifications, and student perception surveys. In addition, experiments to determine the effect of these systems on student behavior, achievement, and skills are needed to add to the small existing body of evidence. Furthermore, experimental studies should include usability tests and methodologies to examine student use of these systems, as these factors may affect experimental findings. Finally, observational methods, such as propensity score matching, should be used so that student access to these systems can be broadened while their effects are still measured rigorously.
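As a hedged illustration of the propensity-score-matching approach the review recommends, the following Python sketch estimates the effect of dashboard use from simulated observational data. The single covariate, the outcome model and the 1:1 nearest-neighbour matching are assumptions made for the example, not a prescription from the review.

# Illustrative propensity score matching: students self-select into
# dashboard use, and we estimate the treatment effect by matching on
# the estimated propensity. All variables here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
gpa = rng.normal(3.0, 0.5, n)                            # covariate: prior GPA
used = rng.binomial(1, 1 / (1 + np.exp(-(gpa - 3.0))))   # self-selected dashboard use
score = 70 + 5 * gpa + 3 * used + rng.normal(0, 5, n)    # outcome; true effect is +3

# 1. Estimate propensity scores P(used dashboard | covariates).
ps = LogisticRegression().fit(gpa.reshape(-1, 1), used).predict_proba(
    gpa.reshape(-1, 1))[:, 1]

# 2. Match each treated student to the untreated student with the
#    closest propensity score (1:1 nearest neighbour, with replacement).
treated = np.where(used == 1)[0]
control = np.where(used == 0)[0]
matches = control[np.abs(ps[control][None, :] - ps[treated][:, None]).argmin(axis=1)]

# 3. Average treatment effect on the treated: mean outcome difference
#    across matched pairs.
att = (score[treated] - score[matches]).mean()
print(f"Estimated effect of dashboard use: {att:.2f} points")

Because matching compares only students with similar propensities, the estimate controls for the self-selection on GPA that would bias a naive difference in means, which is exactly why the review suggests such methods when randomised access is impractical.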