Conference Paper

Subversive Learning Analytics


Abstract

This paper puts forth the idea of a subversive stance on learning analytics as a theoretically-grounded means of engaging with issues of power and equity in education and the ways in which they interact with the usage of data on learning processes. The concept draws on efforts from fields such as socio-technical systems and critical race studies that have a long history of examining the role of data in issues of race, gender and class. To illustrate the value that such a stance offers the field of learning analytics, we provide examples of how taking a subversive perspective can help us to identify tacit assumptions-in-practice, ask generative questions about our design processes and consider new modes of creation to produce tools that operate differently in the world.


... The focus on novel techniques in LA often neglects analysis of the composition of learners and their contexts and may inadvertently amplify biases and inequities in learning ecosystems (Uttamchandani & Quick, 2022; Sabnis et al., 2022). The LA community is too focused on what techniques are employed rather than on what questions can be answered (Motz et al., 2023; Wise et al., 2021). While a recent special issue in the Journal of Learning Analytics focused on fairness, equity, and inclusion, and these themes are nominally included as focus areas at top LA conferences annually, the field has underconsidered the broader sociotechnical ecosystem in which LA are embedded (Wise et al., 2021; Meaney & Fikes, 2019), which may contribute to digital education systems reifying existing inequalities (Meaney, 2021, 2023). ...
... The LA community is too focused on what techniques are employed rather than on what questions can be answered (Motz et al., 2023; Wise et al., 2021). While a recent special issue in the Journal of Learning Analytics focused on fairness, equity, and inclusion, and these themes are nominally included as focus areas at top LA conferences annually, the field has underconsidered the broader sociotechnical ecosystem in which LA are embedded (Wise et al., 2021; Meaney & Fikes, 2019), which may contribute to digital education systems reifying existing inequalities (Meaney, 2021, 2023). While these questions are of growing importance in the literature (Uttamchandani & Quick, 2022), they are noticeably absent in Motz et al. (2023). ...
... A framework that includes relevant macro, meso, and micro level domains and concepts of the field would build upon early attempts to establish more cohesive epistemological frameworks (Buckingham Shum, 2012; Greller & Drachsler, 2012), the absence of which in recent work is noted in Motz et al. (2023). These frameworks could enable researchers to explore the broader sociotechnical systems in which LA are embedded, which has been called for in the literature (Wise et al., 2021). ...
Article
Full-text available
The emphasis on "Can we do this thing with these data?" detracts from other important goals of learning analytics (LA) beyond understanding and optimizing student learning. It also undermines LA's ability to develop a coherent epistemology and be consistent with its stated values of openness, fairness, and justice, equity, diversity, and inclusion (JEDI). A comprehensive theoretical framework accounting for LA's macro, meso, and micro level domains would enable more epistemologically cohesive, sociotechnical system approaches to be taken, which could help shift nominal commitment to values to a normative one.
... The data feminism (DF) approach is of interest to the LA community as it contributes a set of critical principles that questions the epistemic, political, and economic power entrenched in the so-called "objective," "neutral," "scientific" data, and the discourses about it (D'Ignazio & Klein, 2020). Such principles echo the need to embrace critical perspectives to discuss the risk that LA practices could lead to exacerbating existing inequities in education (Wise et al., 2021), and the impact that LA in practice may have on "the cognitive, emotional, and social well-being of learners in the context of broader social structures" (Ochoa et al., 2020, p. 1). DF has the potential, in this respect, to offer the LA community a conceptual lens to support LA's increasing cognizance and social responsibility for its own epistemic, educational, and sociocultural consequences. ...
... This way of reasoning firmly criticizes educational discourses (e.g., data-driven educational practices) in which the value of the student data is related to its intrinsic quality of being "objective," "neutral," and "scientific." In this respect, Wise et al. (2021), for example, problematize the "objectivity myth" that aims to question the neutrality of LA data and measures that are taken for granted in the "premise that data collected about learners and learning can provide a sound basis for making decisions to improve learning processes and outcomes" (p. 641). ...
... These examples tie back to issues of equity in LA (Holstein & Doroudi, 2022; Wise et al., 2021; Williamson & Kizilcec, 2022) that compel us to ask what is known about the equitable access and use of novel technologies like ADM systems by minoritized groups of students. Reich and Ito (2017) remind us that good intentions to deploy novel technologies in equitable ways often fall short because of the quite different sociocultural and economic realities of those developing and deploying technology and those to be served; for instance, low-income and minoritized groups. ...
Article
Full-text available
The focus of ethics in learning analytics (LA) frameworks and guidelines is predominantly on procedural elements of data management and accountability. Another, less represented focus is on the duty to act and LA as a moral practice. Data feminism as a critical theoretical approach to data science practices may offer LA researchers and practitioners a valuable lens through which to consider LA as a moral practice. This paper examines what data feminism can offer the LA community. It identifies critical questions for further developing and enabling a responsible stance in LA research and practice taking one particular case — algorithmic decision-making — as a point of departure.
... In this context, I argue that taken-for-granted views of student data and tacit assumptions about the value of accessing and analyzing student data for learning matter for generating "new frameworks and socio-technical environments for making learning an integral part of life" (Fischer et al., 2022, p. 6). We find an example of how we can engage with researchers, practitioners and educators' assumptions about data in Wise et al. (2021). In their "subversive learning analytics" piece, Wise et al. (2021) draw on critical feminist and intersectional studies in human-computer interaction (Costanza-Chock, 2020; D'Ignazio and Klein, 2020) to identify a series of hidden assumptions regarding issues of power and representation in learning analytics. ...
... We find an example of how we can engage with researchers, practitioners and educators' assumptions about data in Wise et al. (2021). In their "subversive learning analytics" piece, Wise et al. (2021) draw on critical feminist and intersectional studies in human-computer interaction (Costanza-Chock, 2020; D'Ignazio and Klein, 2020) to identify a series of hidden assumptions regarding issues of power and representation in learning analytics. More, in particular, Wise et al. (2021) expose the "objectivity myth" regarding assumptions about data, its neutrality, apolitical character, unproblematic way to be used to predict the future, power to tell the whole story and the quality of speaking by itself (p. ...
... In their "subversive learning analytics" piece, Wise et al. (2021) draw on critical feminist and intersectional studies in human-computer interaction (Costanza-Chock, 2020; D'Ignazio and Klein, 2020) to identify a series of hidden assumptions regarding issues of power and representation in learning analytics. More, in particular, Wise et al. (2021) expose the "objectivity myth" regarding assumptions about data, its neutrality, apolitical character, unproblematic way to be used to predict the future, power to tell the whole story and the quality of speaking by itself (p. 641). ...
Article
Full-text available
Purpose: The purpose of this commentary is to comment on Fischer et al. (2022).
Design/methodology/approach: This commentary responds to Fischer et al.'s (2022) call on envisioning alternate conceptualizations of learning for the digital era. In doing so, the author argues for reconsidering learning in its socio-material condition, situated and made of a web of social and technological relations. In this context, the author takes a relational lens on learning to interrogate taken-for-granted views of (1) personalizing data increasingly used for student learning, (2) emerging educational infrastructures for higher education and (3) the student–teacher relationship mediated by data and algorithms.
Findings: The author suggests unpacking assumptions about learning that get reflected in the design and discourses about socio-technical arrangements and transformations in education. Taking the example of personalized learning, the author illustrates a relational mode of thinking that leads to the argument that renewed definitions of learning must be discussed multidimensionally and, most importantly, situated in the material world that learning is already part of.
Research limitations/implications: Following Fischer et al. (2022, this issue), the author agrees that the focus should be on finding "new ways of organizing learning by exploring opportunities for radically new conceptualizations and practices." In order to do that, it is of utmost importance to problematize the social and material conditions that actively configure learning today and infrastructure tomorrow's learning. Hopefully, these observations will entice others to discuss further the educational transformations at stake in the age of datafication and algorithmic decision-making.
Originality/value: The author argues for reconsidering learning in its socio-material condition, which is situated and made of a web of social and technological relations. In this context, the author argues that any attempt to reconceptualize learning from a transformational perspective in the 21st century, as mentioned by Fischer et al. (2022), needs to interrogate views and assumptions about the socio-technical relationships researchers, practitioners and educators are contributing to via their practices and discourses.
... Often, we see educational experts excluded from the conversation, unable to judge or evaluate highly technical approaches that rely upon advanced statistics, machine learning, and other methods that quickly come to resemble a black box [40]. This difficulty in traversing disciplinary boundaries leads to a number of critical challenges for the field around who gets to participate in defining the questions that the field explores [64], and how educational theory can be used to inform results. Furthermore, the field is often accused of having a lack of transparency, which can make it difficult to intervene in the system even with strong statistical results. ...
... We believe that much of this issue springs from a lack of genuine collaboration between analysts (who know and understand the methods to apply to the data) and educators (who understand the domain, and so should have influence over what critical questions are asked by LA researchers). In short: Who gets to define the problems that LA seeks to address? [64] While stakeholder engagement and participatory approaches are common in the field, we must take care to ensure that a genuine collaboration emerges, where educational experts are not sidelined due to an inability to penetrate the fog often created by complex analytical methods. But how can we invite people in, to genuinely collaborate when the methodologies used can take significant expertise to master? ...
... These representations can be manipulated by participants as they think through an educational problem. Similarly, a recent paper by Wise et al. [64] has proposed that missing data can be highlighted to help stakeholders understand what is not there in an LA report as much as what is, so helping educators to reach informed decisions from a position that is aware of potential blind spots. Such methods move us closer to what Callon et al. [10] term technical democracy, but they do so by attending to the need for informal representations, typically of the sociotechnical system in which a tool must be embedded, or user interface designs. ...
... To provide assistance that would improve students' learning by offering adequate teacher-support mechanisms, some LA researchers argue for the need of a "subversive" LA that aims to grapple with the ramifications of LA research efforts and critically engage with the ways in which power, race, gender, and class influence and are influenced by LA work (Wise, Sarmiento, & Boothe, 2021). This becomes especially critical in the context of the recent turn to online learning worldwide, which has contributed to the considerably increased quantity and also kinds of data being generated about students and by students at different levels of education. ...
... This becomes especially critical in the context of the recent turn to online learning worldwide, which has contributed to the considerably increased quantity and also kinds of data being generated about students and by students at different levels of education. This, in turn, requires LA researchers and practitioners to address questions about bias, equity, surveillance, ownership (of data), control, and agency in LA use (Wise et al., 2021). ...
... Hence, a coherent understanding requires careful analysis of learning contexts and actors so as to be able to address critical questions about (algorithmic) bias, equity, surveillance, ownership (of data), control, and agency in LA design and use (Wise et al., 2021). Contextual understanding may also require taking into account qualitative data as yet largely unexplored by the LA community, for example, cultural differences (e.g., in terms of power distance) that may influence the adoption and the effectiveness of LA interventions (see, e.g., Davis et al., 2017; Kizilcec & Cohen, 2017). ...
Preprint
Full-text available
Learning analytics (LA) is argued to be able to improve learning outcomes, learner support and teaching. However, despite an increasingly expanding amount of student (digital) data accessible from various online education and learning platforms and the growing interest in LA worldwide as well as considerable research efforts already made, there is still little empirical evidence of impact on practice that shows the effectiveness of LA in education settings. Based on a selection of theoretical and empirical research, this chapter provides a critical discussion about the possibilities of collecting and using student data as well as barriers and challenges to overcome in providing data-informed support to educators' everyday teaching practices. We argue that in order to increase the impact of data-driven decision-making aimed at students' improved learning in education at scale, we need to better understand educators' needs, their teaching practices and the context in which these practices occur, and how to support them in developing relevant knowledge, strategies and skills to facilitate the data-informed process of digitalization of education. Source: https://arxiv.org/abs/2105.06680 & a published version at https://www.routledge.com/Online-Learning-Analytics/Liebowitz/p/book/9781032047775
... These alternative possibilities are much needed given the scale of structural oppression that higher education institutions can expose learners to through laws, policies, and practices. Our argument in this article combines abolitionist imagination with insights from emerging approaches like "subversive analytics" (Wise et al., 2021) in that we aim to examine, destabilize, and redistribute power in educational technology systems while generating new questions and possibilities from this destabilization. As Wise and colleagues (2021) suggest, it is prudent to explore possibilities of subversion and imagination, and to generate alternative educational futures using "speculative design examples of potential novel artifacts (tools, processes, experiences) ... [These] are first attempts, meant to shift perspective, displace, invert and question, by means of estrangement and humor, offered as 'objects to think with' to spur critical discussion" (p. ...
... Third, our vignettes aim to provide stakeholders in the academic LA community with a provocative point of reference to further longstanding discussions and debate about what may be problematic and possible in the field. Speculative narratives and designs are shared in order to shift perspectives and elicit critical reflection (e.g., Wise et al., 2021). Collectively, our three speculative vignettes help to raise several timely questions about power, ethics, and agency. ...
Article
Full-text available
This article advances an abolitionist reframing of learning analytics (LA) that explores the benefits of productive disorientation, considers potential harms and care made possible by LA, and suggests the abolitionist imagination as an important educational practice. By applying abolitionist concepts to LA, we propose it may be feasible to open new critiques and social futures that build toward equity-oriented LA design and implementation. We introduce speculative methods to advance three vignettes imagining how LA could be weaponized against students or transformed into a justice-directed learning tool. Our speculative methods aim to destabilize where power in LA has been routinely located and contested, thereby opening new lines of inquiry about more equitable educational prospects. Our concluding discussion addresses how speculative design and fiction are complementary methods to the abolitionist imagination and can be pragmatic tools to help build a world with fairer, more equitable, and responsible LA technologies.
... By introducing the intention of LA at the beginning of the process, leaning on it as a touchpoint throughout the process, and including LA experts as part of the design team, we can understand if its inclusion would ultimately improve the integration of LA into the educational game's design. The second principle adopted was to advocate for the player or learner, recognizing the potential consequences of their absence [32]. We wanted to ensure that the players were agents of their own learning decisions and avoid any potential harm that could arise by removing them from the process. ...
Chapter
Game-Based Learning Analytics (GBLA) is a method of integrating Game-Based Learning and Learning Analytics to enhance the effectiveness of the learning process in educational games by providing actionable learning analytics information to players within the game environment. This paper presents initial insights from an integrated design process to achieve this goal. Through a series of interdisciplinary workshops culminating in a participant playtest session, this paper highlights the challenges and opportunities that arise from this integration. The findings point to the importance of early consideration of learning analytics in game design, the challenges of conceptualizing the differences between game feedback and learning feedback, and how learners interpret learning feedback within the context of the game. This work lays the groundwork for future research and development in the interaction between Game-Based Learning and Learning Analytics.
... For a more comprehensive analysis of current research in LA, see the systematic review by Du et al. (2021). There are also a number of authors who have mapped alternative research agendas for LA, such as Gunn (2014), Selwyn (2019, 2020), Wise, Sarmiento, and Boothe (2021), and Prinsloo et al. (2021). ...
Chapter
Full-text available
Data, and specifically student data, has always been an integral part of good teaching as well as providing evidence for strategic and operational planning, resource allocation, pedagogy, and student support. As Open, Distance, and Digital Education (ODDE) become increasingly datafied, institutions have access to greater volumes, variety, and granularity of student data, from more diverse sources than ever before. This provides huge opportunity for institutions, and specifically educators and course support teams, to better understand learning, and provide more appropriate and effective student support. With the emergence of learning analytics (LA) in 2011, the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs, gained momentum, both as research focus and practice. Since then, LA have become institutionalized in many higher education institutions, mostly in residential institutions located in the Global North, and established a prolific presence in research on student learning in digitized environments. While LA has become institutionalized in the Open University (UK), it remains an emerging research focus and practice in many ODDE institutions across the world. This chapter considers the implications of LA for ODDE research and practice by first providing a brief overview of the evolution of LA, and specifically the theoretical influences in this evolution. A selection of major research findings and discourses in LA are then discussed, before the chapter is concluded with some open questions for a research agenda for LA in ODDE.
... They have the potential to perpetuate and exacerbate inequalities due to inherent biases in training datasets, their potential to influence real-world decision making, and the ways they can, intentionally or inadvertently, reify dominant, rather than diverse, cultural practices (Kizilcec & Lee, 2022; Roscoe et al., 2022). For example, AI systems are known to overrepresent the views and values of rich nations in the Global North, due to both the origins of datasets for training algorithms and what perspectives are (and are not) represented when AI-EDSS are developed (Wise et al., 2021). This raises critical concerns over calcifying, or even amplifying, existing inequalities (Baker & Hawn, 2022; Tao et al., 2023). ...
Article
Full-text available
A key goal of educational institutions around the world is to provide inclusive, equitable quality education and lifelong learning opportunities for all learners. Achieving this requires contextualized approaches to accommodate diverse global values and promote learning opportunities that best meet the needs and goals of all learners as individuals and members of different communities. Advances in learning analytics (LA), natural language processes (NLP), and artificial intelligence (AI), especially generative AI technologies, offer potential to aid educational decision making by supporting analytic insights and personalized recommendations. However, these technologies also raise serious risks for reinforcing or exacerbating existing inequalities; these dangers arise from multiple factors including biases represented in training datasets, the technologies' abilities to take autonomous decisions, and processes for tool development that do not centre the needs and concerns of historically marginalized groups. To ensure that Educational Decision Support Systems (EDSS), particularly AI‐powered ones, are equipped to promote equity, they must be created and evaluated holistically, considering their potential for both targeted and systemic impacts on all learners, especially members of historically marginalized groups. Adopting a socio‐technical and cultural perspective is crucial for designing, deploying, and evaluating AI‐EDSS that truly advance educational equity and inclusion. This editorial introduces the contributions of five papers for the special section on advancing equity and inclusion in educational practices with AI‐EDSS. 
These papers focus on (i) a review of biases in large language model (LLM) applications that offers practical guidelines for their evaluation to promote educational equity, (ii) techniques to mitigate disparities across countries and languages in LLMs' representation of educationally relevant knowledge, (iii) implementing equitable and intersectionality‐aware machine learning applications in education, (iv) introducing an LA dashboard that aims to promote institutional equality, diversity, and inclusion, and (v) vulnerable student digital well‐being in AI‐EDSS. Together, these contributions underscore the importance of an interdisciplinary approach in developing and utilizing AI‐EDSS to not only foster a more inclusive and equitable educational landscape worldwide but also reveal a critical need for a broader contextualization of equity that incorporates the socio‐technical questions of what kinds of decisions AI is being used to support, for what purposes, and whose goals are prioritized in this process.
... As emphasized by Wise et al. (2021), pedagogical needs play a vital role alongside data in guiding LA research. The cases highlighted here underscore the imperative of bridging the gap between "real-world education" and researchers' perspectives. ...
Article
Full-text available
This paper explores co-design in Japanese education for deploying data-driven educational technology and practice. Although there is a growing emphasis on data to inform educational decision-making and personalize learning experiences, challenges such as data interoperability and inconsistency with teaching goals prevent practitioners from participating. Co-design, characterized by involving various stakeholders, is instrumental in addressing the evolving needs of technology deployment. Japan's educational context aligns with co-design implementation, with a learning and evidence analytics infrastructure facilitating data collection and analysis. From the Japanese co-design practice of educational technologies, the paper highlights a 6-phase co-design framework: motivate, pilot, implement, refine, evaluate, and maintain. The practices focus on data-driven learning strategies, technology interventions, and across-context dashboards, covering assorted learning contexts in Japan. By advocating for a co-design culture and data-driven approaches to enhance education in Japan, we offer insights for education practitioners, policymakers, researchers, and industry developers.
... There are extensive calls in the literature to focus on equitable learning experiences with visualization dashboards centered on diversity, justice, and equity (Williamson & Kizilcec, 2021, 2022; Wise et al., 2021). Teachers often cited a lack of racial diversity in their classrooms, or their own perspectives on equity, as reasons for not using the equity visualizations. ...
Article
Full-text available
This study examined the ways in which an equity analytics tool — the SEET system — supported middle school science teachers’ reflections on the experiences of diverse students in their classrooms. The tool provides teachers with “equity visualizations” — disaggregated classroom data by gender and race — designed to support teachers to notice and reflect on inequitable patterns in student participation in classroom knowledge-building activities, as well as “whole class visualizations” that enable teachers to look at participation patterns. The visualizations were based on survey data collected from students reflecting on the day’s lessons, responding to questions aligned with three theoretical constructs indicative of equitable participation in science classrooms: coherence, relevance, and contribution. The study involved 42 teachers, divided into two cohorts, participating in a two-month professional learning series. Diary studies and semi-structured interviews were used to probe teachers’ perceptions of the visualizations’ usability, usefulness, and utility for supporting their reflections on student experiences and instructional practices. A key result is that only the “equity visualizations” prompted teacher reflections on diverse student experiences. However, despite the support equity visualizations provided for this core task, the teachers consistently ranked the whole class visualizations as more usable and useful.
... Indeed, Gašević et al. (2017) emphasise the importance of prioritising key design principles alongside educational theory and data science to achieve optimal outcomes from LA systems. Some authors within the LA community (e.g., Dollinger et al., 2019; Knight et al., 2020; Prieto-Alvarez et al., 2018; Wise et al., 2021) and beyond (e.g., see review by Victorelli et al., 2020) have demonstrated the potential of enhancing the effectiveness of communicating data insights in specific social contexts by actively involving stakeholders at various, if not all, stages of the design process, including definition, ideation, prototyping and testing. In particular, human-centred design principles are increasingly recognised as essential for developing artificial intelligence (AI) and analytics innovations that prioritise the needs of users and the context where they are used (Shneiderman, 2022). ...
Article
The notion of Human-Centred Learning Analytics (HCLA) is gaining traction as educators and learning analytics (LA) researchers recognise the need to align analytics and artificial intelligence (AI) technologies with specific educational contexts. This has led an increasing number of researchers to adopt approaches, such as co-design and participatory design, to include educators and students as active participants in the LA design process. However, some experts contend that HCLA must go beyond stakeholder participation by also focusing on the safety, reliability, and trustworthiness of the analytics, and balancing human control and algorithmic automation. While the adoption of human-centred design (HCD) approaches promises considerable benefits, implementing these practices in data-intensive educational systems may not be straightforward. This paper emphasises the critical need to address specific ethical, technical, and methodological challenges tied to educational and data contexts, in order to effectively apply HCD in the creation of LA systems. We delve into four key challenges in this context: i) ensuring representative participation; ii) considering expertise and lived experiences in LA design; iii) balancing stakeholder input with technological innovation; and iv) navigating power dynamics and decision-making processes.
... Within all of this, attention has expanded beyond questions of how to deal with existing data to also examine means for collecting better, more useful, and extensible kinds. This also necessitates acknowledgment of the kinds of data that have not traditionally been collected and the dynamics of power in who makes these decisions [62,14]. ...
Chapter
Full-text available
Over the last ten years learning analytics (LA) has grown from a hypothetical future into a concrete field of inquiry and a global community of researchers and practitioners. Although the LA space may appear sprawling and complex, there are some clear through-lines that the new student or interested practitioner can use as entry points. Four of these are presented in this chapter: 1. LA as a concern or problem to be solved, 2. LA as an opportunity, 3. LA as a field of inquiry and 4. the researchers and practitioners that make up the LA community. These four ways of understanding LA and its associated constructs, technologies, domains and history can hopefully provide a launch pad not only for the other chapters in this handbook but for the world of LA in general. A world that, although large, is open to all who hold an interest in data and learning and the complexities that follow from the combination of the two.
... This study seeks to address a research gap in the work on AIED in this journal, as well as ethical concerns over the increasing global deployment of AIED and university teachers' generally low AI literacy (Luckin et al., 2022;Laupichler et al., 2022). It also echoes recent efforts on social justice in learning analytics (Holstein & Doroudi, 2022;Wise et al., 2021;Williamson & Kizilcec, 2022) and educational technology (Cerratto Pargman et al., 2023). ...
Article
This study investigates university teachers’ relationships with emerging technologies by focusing on the uptake of artificial intelligence in higher education practices. We utilise an experimental philosophy approach to i) determine university teachers’ intuitions around universities’ responsibilities to adopt new artificial intelligence technologies, ii) understand the conditions under which university teachers consider artificial intelligence defensible and, hence, would be willing to introduce such tools and services into their daily practice and iii) specify teachers’ self-reported knowledge of artificial intelligence. An online survey was conducted in which participants were sent one of three different cases (Case A, related to first-generation students; Case B, an archetypical student; Case C, students with a learning disability) together with 18 identical questions across all three cases. The survey was distributed among 1773 teachers. Based on the responses of 194 university teachers, we identify differences related to responsibility, equity, and knowledge about artificial intelligence. Our quantitative data exhibited that in Case A and Case C, the respondents ranked to a higher degree that universities should use artificial intelligence tools and systems to achieve equitable outcomes. Data revealed no statistically significant associations among the respondents with regard to background variables in Case A. However, Case B and Case C disclosed statistically significant associations concerning gender, age, faculty, and academic position among the participants. Moreover, we identify teachers’ fears and scepticism about artificial intelligence in higher education, concerns about fairness and responsibility, and a lack of knowledge about artificial intelligence and resources to engage with artificial intelligence in teaching practices.
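Associations between survey responses and background variables, as reported in the study above, are commonly probed with a Pearson chi-square test of independence on a contingency table. A minimal stdlib sketch with hypothetical counts (the groups, numbers, and table shape here are illustrative, not data from the study):

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: rows = two respondent groups,
# columns = would / would not adopt AI tools in their teaching
table = [[30, 20],
         [15, 35]]
print(round(chi_square(table), 3))  # → 9.091
```

In practice the statistic would be compared against a chi-square distribution with (r-1)(c-1) degrees of freedom to obtain the p-value; a library such as SciPy's `chi2_contingency` handles that step.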
... Fairness, equity, and responsibility have been integral to discourses inside and outside of the learning analytics community since its emergence. Applications include ethical and privacy issues in learning analytics (Slade & Prinsloo, 2013; Khalil et al., 2018); responsible learning analytics; human-centred learning analytics (Buckingham Shum et al., 2019); student-centred learning analytics (Broughan & Prinsloo, 2020); value-sensitive design (Chen & Zhu, 2019); the ethics of care (Prinsloo & Slade, 2017); social justice (Wise et al., 2021); student rights (Berendt et al., 2020); perspectives from data feminism (Garcia et al., 2020); critical data studies (Kitchin & Lauriault, 2014); and other related fields. Francis et al. (2020) and Baker and Hawn (2022) raise fairness and equity questions around whether data-driven learning analytics support systems might advantage some students over others. ...
Article
Full-text available
Learning analytics has the capacity to provide potential benefit to a wide range of stakeholders within a range of educational contexts. It can provide prompt support to students, facilitate effective teaching, highlight aspects of course content that might be adapted, and predict a range of possible outcomes, such as students registering for more appropriate courses, supporting students’ self-efficacy, or redesigning a course’s pedagogical strategy. It will do all these things based on the assumptions and rules that learning analytics developers set out. As such, learning analytics can exacerbate existing inequalities such as unequal access to support or opportunities based on (any combination of) race, gender, culture, age, socioeconomic status, etc., or work to overcome the impact of such inequalities on realizing student potential. In this editorial, we introduce several selected articles that explore the principles of fairness, equity, and responsibility in the context of learning analytics. We discuss existing research and summarize the papers within this special section to outline what is known, and what remains to be explored. This editorial concludes by celebrating the breadth of work set out here, but also by suggesting that there are no simple answers to ensuring fairness, trust, transparency, equity, and responsibility in learning analytics. More needs to be done to ensure that our mutual understanding of responsible learning analytics continues to be embedded in the learning analytics research and design practice.
... In our understanding, responsible learning analytics are also rooted in critical theory that explicitly addresses "power relations" and the "relationships between culture, forms of domination, and society" (Prinsloo & Slade, 2018, p. 4). This aligns well with the subversive stance on learning analytics as proposed by Wise et al. (2021) as a way of engaging with issues of power and equity and their interaction with data on learning processes. Being rooted in critical theory and the subversive stance on learning analytics on the one hand, as well as navigating the tensions between the obligation to act and accountability on the other, responsible learning analytics open up a fuzzy, complex space for decision-making in practice. ...
Article
Full-text available
Learning analytics is an academic field with promising usage scenarios for many educational domains. At the same time, learning analytics comes with threats such as the amplification of historically grown inequalities. A range of general guidelines for more equity-focused learning analytics have been proposed but fail to provide sufficiently clear guidance for practitioners. With this paper, we attempt to address this theory–practice gap through domain-specific (physics education) refinement of the general guidelines. We propose a process as a starting point for this domain-specific refinement that can be applied to other domains as well. Our point of departure is a domain-specific analysis of historically grown inequalities in order to identify the most relevant diversity categories and evaluation criteria. Through two focal points for normative decision-making, namely equity and bias, we analyze two edge cases and highlight where domain-specific refinement of general guidance is necessary. Our synthesis reveals a necessity to work towards domain-specific standards and regulations for bias analyses and to develop counter-measures against (intersectional) discrimination. Ultimately, this should lead to a stronger equity-focused practice in the future.
... For a more comprehensive analysis of current research in LA, see the systematic review by Du et al. (2021). There are also a number of authors who have mapped alternative research agendas for LA, such as Gunn (2014), Selwyn (2019, 2020), Wise, Sarmiento, and Boothe (2021), and Prinsloo et al. (2021). ...
Chapter
Full-text available
Data, and specifically student data, has always been an integral part of good teaching as well as providing evidence for strategic and operational planning, resource allocation, pedagogy, and student support. As Open, Distance, and Digital Education (ODDE) become increasingly datafied, institutions have access to greater volumes, variety, and granularity of student data, from more diverse sources than ever before. This provides huge opportunity for institutions, and specifically educators and course support teams, to better understand learning, and provide more appropriate and effective student support. With the emergence of learning analytics (LA) in 2011, the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs, gained momentum, both as research focus and practice. Since then, LA have become institutionalized in many higher education institutions, mostly in residential institutions located in the Global North, and established a prolific presence in research on student learning in digitized environments. While LA has become institutionalized in the Open University (UK), it remains an emerging research focus and practice in many ODDE institutions across the world. This chapter considers the implications of LA for ODDE research and practice by first providing a brief overview of the evolution of LA, and specifically the theoretical influences in this evolution. A selection of major research findings and discourses in LA are then discussed, before the chapter is concluded with some open questions for a research agenda for LA in ODDE.
... The imbrication of ethics and pragmatism distinguish participatory design from other user-centered approaches, such as those traditionally used in human-computer interaction. This makes the approach especially apt to respond to some of the challenges that have emerged in LA related to questions of equity, bias, surveillance, ownership, control and agency [38,52]. As human-centered learning analytics approaches [4] become increasingly adopted in LA, it is timely to take a closer look at how researchers are applying the participatory tradition of design to better understand the potential of tools and methods and their impact on design and learning ecosystems. ...
... Given that the group profiles are somewhat varied, they provide a more nuanced view of learning. This in turn does not privilege normative ideas of what good collaborative learning might look like (Rummel et al., 2016;Wise et al., 2021). ...
Article
Full-text available
This exploratory paper highlights how problem‐based learning (PBL) provided the pedagogical framework used to design and interpret learning analytics from Crystal island: ecojourneys, a collaborative game‐based learning environment centred on supporting science inquiry. In Crystal island: ecojourneys, students work in teams of four, investigate the problem individually and then utilize a brainstorming board, an in‐game PBL whiteboard that structured the collaborative inquiry process. The paper addresses a central question: how can PBL support the interpretation of the observed patterns in individual actions and collaborative interactions in the collaborative game‐based learning environment? Drawing on a mixed method approach, we first analyzed students' pre‐ and post‐test results to determine if there were learning gains. We then used principal component analysis (PCA) to describe the patterns in game interaction data and clustered students based on the PCA. Based on the pre‐ and post‐test results and PCA clusters, we used interaction analysis to understand how collaborative interactions unfolded across selected groups. Results showed that students learned the targeted content after engaging with the game‐based learning environment. Clusters based on the PCA revealed four main ways of engaging in the game‐based learning environment: students engaged in low to moderate self‐directed actions with (1) high and (2) moderate collaborative sense‐making actions, (3) low self‐directed with low collaborative sense‐making actions and (4) high self‐directed actions with low collaborative sense‐making actions. 
Qualitative interaction analysis revealed that a key difference among the four groups in each cluster was the nature of verbal student discourse: students in the low to moderate self‐directed and high collaborative sense‐making cluster actively initiated discussions and integrated the information they learned into the problem, whereas students in the other clusters required more support. These findings have implications for designing adaptive support that responds to students' interactions with in‐game activities.
Practitioner notes
What is already known about this topic: Learning analytic methods have been effective for understanding student learning interactions for the purposes of assessment, profiling student behaviour and the effectiveness of interventions. However, the interpretation of analytics from these diverse data sets is not always grounded in theory, and challenges of interpreting student data are further compounded in collaborative inquiry settings, where students work in groups to solve a problem.
What this paper adds: Problem‐based learning as a pedagogical framework allowed the design to focus on individual and collaborative actions in a game‐based learning environment and, in turn, informed the interpretation of game‐based analytics as it relates to students' self‐directed learning in their individual investigations and collaborative inquiry discussions. The combination of principal component analysis and qualitative interaction analysis was critical in understanding the nuances of student collaborative inquiry.
Implications for practice and/or policy: Self‐directed actions in individual investigations are critical steps to collaborative inquiry; however, students may need to be encouraged to engage in these actions. Clustering student data can inform which scaffolds can be delivered to support both self‐directed learning and collaborative inquiry interactions. All students can engage in knowledge‐integration discourse, but some students may need more direct support from teachers to achieve this.
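The cluster-based grouping of students described above can be approximated with a simple k-means pass over two behavioural features. A minimal stdlib sketch, assuming hypothetical normalized per-student counts of self-directed and collaborative sense-making actions (the paper clusters on PCA components; plain k-means on raw features is used here purely for illustration):

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over feature tuples; returns the final cluster centres."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialise centres from the data
    for _ in range(iters):
        # Assign each point to its nearest centre
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[nearest].append(p)
        # Recompute centres as group means (keep old centre if group is empty)
        centers = [
            tuple(sum(dim) / len(g) for dim in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

# Hypothetical normalized counts per student: (self-directed, collaborative)
students = [(0.2, 0.9), (0.3, 0.8), (0.2, 0.5), (0.3, 0.4),
            (0.1, 0.1), (0.2, 0.2), (0.9, 0.2), (0.8, 0.1)]
centers = kmeans(students, k=4)
print(sorted(centers))
```

Each resulting centre can then be read off as a behavioural profile (e.g., "low self-directed, high collaborative"), analogous to the four engagement patterns the study reports.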
... These practices may reinforce low expectations for certain student groups, such as exceptional students or English language learners. Wise et al. (2021) proposed alternative stances in learning analytics, such as prompting users to develop counter narratives that reflect on and recognize the different strengths of minoritized groups. We call for future research to consider how to detect and design for combinations of asking questions that are generative, as opposed to exacerbating biases. ...
Chapter
The proliferation of learning dashboards in K–12 education calls for deeper knowledge of how such tools fit into existing data use routines in schools. An exemplar routine is coaching cycles, where teachers and experienced coaches collaborate on using data to improve student learning. Within this context, our mixed-method study draws from interviews and think-aloud sessions about dashboard visualizations, conducted with teachers and instructional coaches from four school districts in the United States. Our analyses illuminate how different professional roles express varied patterns of response when facing LA dashboards. The analyses uncover a particular pattern of asking questions to resolve uncertainty that leads to further reflection and action. We discuss how uncertainty towards data and visualizations can be productive for teacher learning and the implications of designing for uncertainty in dashboards. Mapping LA dashboards to educators’ daily routines is important to promote the uptake of analytics towards improving instructional practices.
... Articulating the problem also allows us to ask second-order questions about what kinds of learning problems get addressed and who sets this agenda (Wise, Sarmiento, & Boothe, 2021). For example, the framing of "identifying and supporting struggling students" implies a situation in which changing the actions of the identified students is the best means to improve a problematic success rate, rather than considering changes to the material or curriculum that might make a course more accessible or inviting to particular groups of students. ...
Article
Full-text available
The ongoing changes and challenges brought on by the COVID-19 pandemic have exacerbated long-standing inequities in education, leading many to question basic assumptions about how learning can best benefit all students. Thirst for data about learning is at an all-time high, sometimes without commensurate attention to ensuring principles this community has long valued: privacy, transparency, openness, accountability, and fairness. How we navigate this dynamic context is critical for the future of learning analytics. Thinking about the issue through the lens of JLA publications over the last eight years, we highlight the important contributions of “problem-centric” rather than “tool-centric” research. We also value attention (proximal or distal) to the eventual goal of closing the loop, connecting the results of our analyses back to improve the learning from which they were drawn. Finally, we recognize the power of cycles of maturation: using information generated about real-world uses and impacts of a learning analytics tool to guide new iterations of data, analysis, and intervention design. A critical element of context for such work is that the learning problems we identify and choose to work on are never blank slates; they embed societal structures, reflect the influence of past technologies; and have previous enablers, barriers and social mediation acting on them. In that context, we must ask the hard questions: What parts of existing systems is our work challenging? What parts is it reinforcing? Do these effects, intentional or not, align with our values and beliefs? In the end what makes learning analytics matter is our ability to contribute to progress on both immediate and long-standing challenges in learning, not only improving current systems, but also considering alternatives for what is and what could be. 
This requires including stakeholder voices in tackling important problems of learning with rigorous analytic approaches to promote equitable learning across contexts. This journal provides a central space for the discussion of such issues, acting as a venue for the whole community to share research, practice, data and tools across the learning analytics cycle in pursuit of these goals.
Chapter
The first decade of research and thinking in learning analytics has seen shifting foci and evolving theoretical foundations. Indeed, the very role of theory in, about, and of learning analytics has been addressed in different ways across sub-sections of the field. From an early emphasis on data, computing and systems, the field has increasingly connected with theories and ideas from educational research, sociology, philosophy, and the learning sciences. The richness resulting from this confluence of theories provides a foundation for enhancing the use of data and analytics for learning, differentiating learning analytics from other pre-existing fields, and for deepening the understanding of how learning works. However, despite the broadening scope of theoretical perspectives in, about, and of learning analytics, old tensions remain, and new ones have emerged. As is evident in other areas of educational research, there are intractable differences in fundamental philosophies that create barriers to meaningful dialogue and the progression of the field. In this chapter, we will provide an overview of the key theoretical trends in learning analytics research and place these trends within a broader perspective. Specifically, we will describe the theorising of learning analytics, theory in learning analytics, and theories about learning analytics.
Theoretical origins of learning analytics in historical context
Theoretical critique from within learning analytics
Theoretical critique from without learning analytics
Theory used in learning analytics
Theory arising from learning analytics
Chapter
Full-text available
This chapter makes a distinction between two approaches to theory in Learning Analytics (LA). The first approach pursues the identification of suitable theories ‘for’ LA and requires a commitment to the scientific study of learning. The second approach rests upon a broader sociological outlook on Learning Analytics as a field of knowledge. The chapter is concerned with the second approach and proposes a sociological theory of the ‘LA Gaze’. It examines the underlying principles of a theory of the LA Gaze and explores, using a genealogical approach informed by the sociology of scientific knowledge, the social, historical, and material relations between Learning Analytics and contiguous epistemic fields: computer and data science, Educational Data Mining (EDM) and educational research. The chapter contributes to the LA field by encouraging a historical and critical reflection on its origins and its future directions.
Article
Full-text available
The artificial intelligence in education (AIED) community has produced technologies that are widely used to support learning, teaching, assessment, and administration. This work has successfully enhanced test scores, course grades, skill acquisition, comprehension, engagement, and related outcomes. However, the prevailing approach to adaptive and personalized learning has two main steps. First, the process involves detecting the areas of knowledge and competencies where students are deficient. This process also identifies when or how a student is considered “at risk” or in some way “lacking.” Second, the approach involves providing timely, individualized assistance to address these deficiencies. However, a considerable body of research outside our field has established that such deficit framing, by itself, leads to reactive and less productive strategies. In deficit-based frameworks, powerful student strengths, skills, and schemas—their assets—are not explicitly leveraged. In this paper, we outline an asset-based paradigm for AIED research and development, proposing principles for our community to build upon learners’ rich funds of knowledge. We propose that embracing asset-based approaches will empower the AIED community (e.g., educators, developers, and researchers) to reach broader populations of learners. We discuss the potentially transformative role this approach could play in supporting learning and personal development for all learners, particularly for students who are historically underserved, marginalized, and “deficitized."
Article
Full-text available
To promote cross-community dialogue on matters of significance within the field of learning analytics (LA), we as editors-in-chief of the Journal of Learning Analytics (JLA) have introduced a section for papers that are open to peer commentary. An invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. The 26 authors of the accepted commentaries are based in Europe, North America, and Australia. They range in experience from PhD students and early-career researchers to some of the longest-standing, most senior members of the learning analytics community.
Chapter
Learning analytics have been argued as a key enabler to improving student learning at scale. Yet, despite considerable efforts by the learning analytics community across the world over the past decade, the evidence to support that claim is hitherto scarce, as is the demand from educators to adopt it into their practice. We introduce the concept of practicable learning analytics to illuminate what learning analytics may look like from the perspective of practice, and how this practice can be incorporated in learning analytics designs so as to make them more attractive for practitioners. As a framework for systematic analysis of the practice in which learning analytics tools and methods are to be employed, we use the concept of the Information Systems Artifact (ISA), which comprises three interrelated subsystems: the informational, the social and the technological artefacts. The ISA approach entails systemic thinking, which is necessary for discussing data-driven decision making in the context of educational systems, practices, and situations. The ten chapters in this book are presented and reflected upon from the ISA perspective, clarifying that detailed attention to the social artefact is critical to the design of practicable learning analytics.
Keywords: Learning analytics; Practicable; Information systems artefact; Impact
Preprint
Learning analytics (LA) provides data-driven feedback that aims to improve learning and inform action. For learners, LA-based feedback may scaffold self-regulated learning skills, which are crucial to learning success. For teachers, LA-based feedback may help the evaluation of teaching effects and the need for interventions. However, the current development of LA has presented problems related to the cognitive, social-affective, and structural dimensions of feedback. In light of this, this position paper argues that attention needs to shift from the design of LA as a feedback product to one that facilitates a process in which both teachers and students play active roles in meaning-making. To this end, implications for feedback literacy in the context of LA are discussed.
Article
There is a huge and growing amount of data that is already captured in the many, diverse digital tools that support learning. Additionally, learning data is often inaccessible to teachers or served in a manner that fails to support or inform their teaching and design practice. We need systematic, learner-centred ways for teachers to design learning data that supports them. Drawing on decades of Artificial Intelligence in Education (AIED) research, we show how to make use of important AIED concepts: (1) learner models; (2) Open Learner Models (OLMs); (3) scrutability and (4) Ontologies. We show how these concepts can be used in the design of OLMs, interfaces that enable a learner to see and interact with an externalised representation of their learning progress. We extend this important work by demonstrating how OLMs can also drive a learner-centred design process of learning data. We draw on the work of Biggs on constructive alignment (Biggs, 1996, 1999, 2011), which has been so influential in education. Like Biggs, we propose a way for teachers to design the learning data in their subjects and we illustrate the approach with case studies. We show how teachers can use this approach today, essentially integrating the design of learning data along with the learning design for their subjects. We outline a research agenda for designing the collection of richer learning data. There are three core contributions of this paper. First, we present the terms OLM, learner model, scrutability and ontologies, as thinking tools for systematic design of learning data. Second, we show how to integrate this into the design and refinement of a subject. Finally, we present a research agenda for making this process both easier and more powerful.
Article
Full-text available
Due to the ongoing digitalisation of workplaces and educational settings, human activity underpinning learning and work is increasingly mediated by technology. The advancement of artificial intelligence (AI) and its integration into everyday technologies influences how people are exposed to information, interact, learn and make decisions. We argue that technology, data and evolving AI applications affect how humans enact and experience life and work, changing the context for learning. Hence, as this paper argues, the current notion of lifelong learning needs a revisit to embrace technology at its foundation. To bring freely chosen goals and ownership in one's learning to the fore, in the context of the coming AI age, we argue for the telos of learning to shift from human capital to human development, with the spotlight on capabilities. The paper draws on the capability approach to inform individuals and organisations of how they can support human development throughout lifelong learning. We then move to provide examples of how technologies underpinning workplace practices can be seen with the focus on capabilities as individuals learn to create value.
Practitioner notes
What is known about the topic? The primary notion of lifelong learning refers to adult learning processes. The policy perspective that dominates the organisation of lifelong learning opportunities focuses on human capital development. Technologies mediate learning and work.
What this paper adds: Technology is not explicitly addressed in meanings associated with lifelong learning. AI‐based technologies dynamically interact with human cognitive and social practices. The paper argues for a stronger focus on human development instead of human capital in the telos of lifelong learning opportunities. The capability approach is a viable alternative to the human capital perspective on LLL. Data used to support learning can focus on learner agency and systemic factors that enable and constrain lifelong learning.
Implications for practice and/or policy: LLL interventions should promote systemic support for learner agency and ownership. LLL interventions should focus on negotiated value creation. Workplaces should embrace human‐machine integration, but in ways that support capability and human development, not human capital.
Conference Paper
Full-text available
The potential for data-driven algorithmic systems to amplify existing social inequities, or create new ones, is receiving increasing popular and academic attention. A surge of recent work, across multiple researcher and practitioner communities, has focused on the development of design strategies and algorithmic methods to monitor and mitigate bias in such systems. Yet relatively little of this work has addressed the unique challenges raised in the design, development, and real-world deployment of learning analytics systems. This interactive workshop aims to provide a venue for researchers and practitioners to share work-in-progress related to fairness and equity in the design of learning analytics and to develop new research and design collaborations around these topics. The workshop will begin with a brief overview of research in fair AI and machine learning, followed by presentations of accepted and invited contributions. In addition, a key outcome of the workshop will be a research agenda for the LAK community, around fairness and equity. Workshop participants will collaboratively construct this agenda through a sequence of small-and whole-group design activities. At the end of the workshop, participating researchers and practitioners will then explore opportunities for collaboration around specific research and design thrusts within this agenda.
Article
Full-text available
In response to public scrutiny of data-driven algorithms, the field of data science has adopted ethics training and principles. Although ethics can help data scientists reflect on certain normative aspects of their work, such efforts are ill-equipped to generate a data science that avoids social harms and promotes social justice. In this article, I argue that data science must embrace a political orientation. Data scientists must recognize themselves as political actors engaged in normative constructions of society and evaluate their work according to its downstream impacts on people's lives. I first articulate why data scientists must recognize themselves as political actors. In this section, I respond to three arguments that data scientists commonly invoke when challenged to take political positions regarding their work. In confronting these arguments, I describe why attempting to remain apolitical is itself a political stance—a fundamentally conservative one—and why data science's attempts to promote “social good” dangerously rely on unarticulated and incrementalist political assumptions. I then propose a framework for how data science can evolve toward a deliberative and rigorous politics of social justice. I conceptualize the process of developing a politically engaged data science as a sequence of four stages. Pursuing these new approaches will empower data scientists with new methods for thoughtfully and rigorously contributing to social justice.
Article
Full-text available
Our 2019 editorial opened a dialogue about what is needed to foster an impactful field of learning analytics (Knight, Wise, & Ochoa, 2019). As we head toward the close of a tumultuous year that has raised profound questions about the structure and processes of formal education and its role in society, this conversation is more relevant than ever. That editorial, and a recent online community event, focused on one component of the impact: standards for scientific rigour and the criteria by which knowledge claims in an interdisciplinary, multi-methodology field should be judged. These initial conversations revealed important commonalities across statistical, computational, and qualitative approaches in terms of a need for greater explanation and justification of choices in using appropriate data, models, or other methodological approaches, as well as the many micro-decisions made in applying specific methodologies to specific studies. The conversations also emphasize the need to perform different checks (for overfitting, for bias, for replicability, for the contextual bounds of applicability, for disconfirming cases) and the importance of learning analytics research being relevant by situating itself within a set of educational values, making tighter connections to theory, and considering its practical mobilization to affect learning. These ideas will serve as the starting point for a series of detailed follow-up conversations across the community, with the goal of generating updated standards and guidance for JLA articles.
Article
Full-text available
In this article, we argue that dominant norms of demographic data are insufficient for accounting for the complexities that characterize many lesbian, gay, bisexual, transgender, and queer (LGBTQ, or broadly “queer”) lives. Here, we draw from the responses of 178 people who identified as non-heterosexual or non-cisgender to demographic questions we developed regarding gender and sexual orientation. Demographic data commonly imagines identity as fixed, singular, and discrete. However, our findings suggest that, for LGBTQ people, gender and sexual identities are often multiple and in flux. An overwhelming majority of our respondents reported shifting in their understandings of their sexual identities over time. In addition, for many of our respondents, gender identity was made up of overlapping factors, including the relationship between gender and transgender identities. These findings challenge researchers to reconsider how identity is understood as and through data. Drawing from critical data studies, feminist and queer digital media studies, and social justice initiatives like Data for Black Lives, we call for a reimagining of identity-based data as “queer data” or “data for queer lives.” We also offer recommendations for researchers to develop more inclusive survey questions. At the same time, we address the ways that queer perspectives destabilize the underlying logics of data by resisting classification and “capture.” For marginalized people, the stakes of this work extend beyond academia, especially in the era of algorithms and big data when the issue of who is or is not “counted” profoundly affects visibility, access, and power in the digital realm.
Article
Full-text available
Education is a particularly important site for the study of data and its consequences. The scale and diversity of education systems and practices means that datafication in education takes many forms, and has potential to exert significant effects on the lives of millions. Datafication of education needs to be understood and analyzed for its distinctive forms, practices and consequences. Enhanced data collection during mass university closures and online teaching as a result of the 2020 COVID-19 crisis makes this all the more urgent. In this brief editorial introduction to the special issue on ‘The datafication of teaching in higher education’, we situate the papers in wider debates and scholarship, and outline some key cross-cutting themes.
Conference Paper
Full-text available
As Learning Analytics (LA) moves from theory into practice, researchers have called for increased participation of stakeholders in design processes. The implementation of such methods, however, still requires attention and specification. In this report, we share strategies and insights from a co-design process that involved university students in the development of a LA tool. We describe the participatory design workshops and highlight three strategies for engaging students in the co-design of learning analytics tools.
Conference Paper
Full-text available
Empirical evidence of how background music benefits or hinders learning is the crux of optimizing music recommendation in educational settings. This study aims to further probe the underlying mechanism through an experiment in a naturalistic setting. Thirty participants were recruited for a field experiment conducted in their own study places over one week. During the experiment, participants were asked to conduct learning sessions with music in the background and to collect music tracks they deemed suitable for learning using a novel mobile-based music discovery application. Participant-related, context-related, and music-related data were collected via a pre-experiment questionnaire, surveys delivered within the music app, and the app's logging system. Preliminary results reveal correlations between certain music characteristics and learners' task engagement and perceived task performance. This study is expected to provide evidence for understanding the cognitive and emotional dimensions of background music during learning, as well as implications for the role of personalization in selecting background music to facilitate learning.
Conference Paper
Full-text available
Educational recommender systems (ERSs) aim to adaptively recommend a broad range of personalised resources and activities to students that will most meet their learning needs. Commonly, ERSs operate as a "black box" and give students no insight into the rationale of their choice. Recent contributions from the learning analytics and educational data mining communities have emphasised the importance of transparent, understandable and open learner models (OLMs) that provide insight and enhance learners' understanding of interactions with learning environments. In this paper, we aim to investigate the impact of complementing ERSs with transparent and understandable OLMs that provide justification for their recommendations. We conduct a randomised control trial experiment using an ERS with two interfaces ("Non-Complemented Interface" and "Complemented Interface") to determine the effect of our approach on student engagement and their perception of the effectiveness of the ERS. Overall, our results suggest that complementing an ERS with an OLM can have a positive effect on student engagement and their perception about the effectiveness of the system despite potentially making the system harder to navigate. In some cases, complementing an ERS with an OLM has the negative consequence of decreasing engagement, understandability and sense of fairness.
Conference Paper
Full-text available
This paper describes the design and evaluation of personalized visualizations to support young learners' Self-Regulated Learning (SRL) in Adaptive Learning Technologies (ALTs). Our learning path app combines three Personalized Visualizations (PVs) designed as an external reference to support learners' internal regulation process. The personalized visualizations rest on three pillars: grounding in SRL theory, the use of trace data, and the provision of clear, actionable recommendations for learners to improve regulation. This quasi-experimental pre-post-test study finds that learners in the personalized visualization condition improved the regulation of their practice behavior, as indicated by higher accuracy and less complex moment-by-moment learning curves compared to learners in the control group. Learners in the PV condition also showed better transfer of learning. Finally, students in the personalized visualizations condition were more likely to underestimate rather than overestimate their performance. Overall, these findings indicate that the personalized visualizations improved regulation of practice behavior and transfer of learning, and changed the bias in relative monitoring accuracy.
Conference Paper
Full-text available
By devising a conceptual framework of the learning analytics ecosystem, we identify two types of bias that may stymie the efforts of leveraging learning analytics to produce fair and equitable virtual learning environments. First, Early-adopter Iteration Bias may lead learning analytics to derive insights about optimal course design based on preferences and behavior patterns of more prepared, lower-need learners. Second, Research-praxis Bias prevents practitioners from properly utilizing insights derived from learning analytics and research.
Article
Full-text available
This paper examines visions of ‘learning’ across humans and machines in a near-future of intensive data analytics. Building upon the concept of ‘learnification’, practices of ‘learning’ in emerging big data-driven environments are discussed in two significant ways: the training of machines, and the nudging of human decisions through digital choice architectures. Firstly, ‘machine learning’ is discussed as an important example of how data-driven technologies are beginning to influence educational activity, both through sophisticated technical expertise and a grounding in behavioural psychology. Secondly, we explore how educational software design informed by behavioural economics is increasingly intended to frame learner choices to influence and ‘nudge’ decisions towards optimal outcomes. Through the growing influence of ‘data science’ on education, behaviourist psychology is increasingly and powerfully invested in future educational practices. Finally, it is argued that future education may tend toward very specific forms of behavioural governance – a ‘machine behaviourism’ – entailing combinations of radical behaviourist theories and machine learning systems, that appear to work against notions of student autonomy and participation, seeking to intervene in educational conduct and shaping learner behaviour towards predefined aims.
Conference Paper
Full-text available
Predictive modeling has been a core area of learning analytics research over the past decade, with such models currently deployed in a variety of educational contexts from MOOCs to K-12. However, analyses of the differential effectiveness of these models across demographic, identity, or other groups have been scarce. In this paper, we present a method for evaluating unfairness in predictive student models. We define this in terms of differential accuracy between subgroups and measure it using a new metric we term the Absolute Between-ROC Area (ABROCA). We demonstrate the proposed method through a gender-based “slicing analysis” using five different models replicated from other works and a dataset of 44 unique MOOCs and over four million learners. Our results demonstrate (1) significant differences in model fairness according to (a) statistical algorithm and (b) feature set used; (2) that the gender imbalance ratio, curricular area, and specific course used for a model all display significant association with the value of the ABROCA statistic; and (3) that there is no evidence of a strict tradeoff between performance and fairness. This work provides a framework for quantifying and understanding how predictive models might inadvertently privilege, or disparately impact, different student subgroups. Furthermore, our results suggest that learning analytics researchers and practitioners can use slicing analysis to improve model fairness without necessarily sacrificing performance.
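The ABROCA statistic described in this abstract is the area between the ROC curves of two subgroups, integrated along the false-positive-rate axis: zero when a model discriminates equally well for both groups, larger as the curves diverge. A minimal sketch follows; the helper names (`roc_curve`, `abroca`) and the resampling grid are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def roc_curve(y_true, y_score):
    """Trace an ROC curve by sweeping thresholds over descending scores."""
    order = np.argsort(-np.asarray(y_score, dtype=float))
    y = np.asarray(y_true)[order]
    tps = np.cumsum(y)        # true positives accumulated at each threshold
    fps = np.cumsum(1 - y)    # false positives accumulated at each threshold
    tpr = np.concatenate(([0.0], tps / tps[-1]))
    fpr = np.concatenate(([0.0], fps / fps[-1]))
    return fpr, tpr

def abroca(y_true, y_score, group, n_grid=1001):
    """Absolute area between the ROC curves of two subgroups."""
    y_true, y_score, group = map(np.asarray, (y_true, y_score, group))
    grid = np.linspace(0.0, 1.0, n_grid)
    curves = []
    for g in np.unique(group)[:2]:
        fpr, tpr = roc_curve(y_true[group == g], y_score[group == g])
        curves.append(np.interp(grid, fpr, tpr))  # resample onto a shared FPR grid
    diff = np.abs(curves[0] - curves[1])
    # Trapezoidal integration of the absolute difference over the FPR axis.
    return float(np.sum((diff[1:] + diff[:-1]) / 2.0 * np.diff(grid)))

# A model that scores both subgroups identically yields ABROCA == 0.
labels = [0, 1, 0, 1, 0, 1, 0, 1]
scores = [0.1, 0.9, 0.2, 0.8, 0.1, 0.9, 0.2, 0.8]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(abroca(labels, scores, groups))
```

Because the two curves are compared pointwise on a common grid, the statistic penalizes any region where one group is scored less accurately, even when the overall AUCs happen to coincide.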
Article
Full-text available
This study investigates how multimodal user-generated data can be used to reinforce learner reflection, improve teaching practices, and close the learning analytics loop. In particular, the aim of the study is to utilize user gaze and action-based data to examine the role of a mirroring tool (i.e., Exercise View in Eclipse) in orchestrating basic behavioural regulation during debugging. The results demonstrated that students who processed the information presented in the Exercise View and acted upon it, improved their performance and achieved a higher level of success than those who failed to do so. The findings shed light on what constitutes relevant data within a particular learning context in programming using gaze patterns. Moreover, these findings could guide the collection of essential learner-centred analytics for designing usable, modular learning environments based on data-driven approaches.
Conference Paper
Full-text available
Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built with the purpose of empowering teachers and learners to make informed decisions about the learning process. Related work has investigated learning dashboards, yet none have explored the theoretical foundation that should inform the design and evaluation of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our analysis revealed that very few dashboard evaluations take into account the educational concepts that were used as a theoretical foundation for their design. Furthermore, we report findings suggesting that comparison with peers, a common reference frame for contextualising information on learning analytics dashboards, was not perceived positively by all learners. We summarise the insights gathered through our literature review in a set of recommendations for the design and evaluation of learning analytics dashboards for learners.
Conference Paper
Full-text available
In order to further the field of learning analytics (LA), researchers and experts may need to look beyond themselves and their own perspectives and expertise to innovate LA platforms and interventions. We suggest that by co-creating with the users of LA, such as educators and students, researchers and experts can improve the usability and usefulness of LA interventions and draw greater understanding from them. Within this article we discuss current LA issues and barriers and how co-creation strategies can help address many of these challenges. We further outline the considerations, both before and during interventions, that support and foster a co-created strategy for learning analytics interventions.
Article
Full-text available
Big Data refers to large and disparate volumes of data generated by people, applications and machines. It is gaining increasing attention from a variety of domains, including education. What are the challenges of engaging with Big Data research in education? This paper identifies a wide range of critical issues that researchers need to consider when working with Big Data in education. The issues identified include diversity in the conception and meaning of Big Data in education; ontological and epistemological disparities; technical challenges; ethics and privacy; the digital divide and digital dividend; and a lack of expertise and academic development opportunities to prepare educational researchers to leverage the opportunities afforded by Big Data. The goal of this paper is to raise awareness of these issues and initiate a dialogue. The paper was inspired partly by insights drawn from the literature but mostly informed by experience researching Big Data in education.
Article
Full-text available
School accountability systems in the United States have been criticized on a number of fronts, mainly on grounds of completeness and fairness. This study examines an alternative school quality framework—one that seemingly responds to several core critiques of present accountability systems. Examining results from a pilot study in a diverse urban district, we find that this alternative system captures domains of school quality that are not reflected in the current state system, specifically those measuring opportunity to learn and socioemotional factors. Furthermore, we find a less deterministic relationship between school quality and poverty under the alternative system. We explore the policy implications of these findings vis-à-vis the future of accountability.
Article
Full-text available
In the socio-technical imaginary of higher education, algorithmic decision-making offers huge potential, but we also cannot deny the risks and ethical concerns. In fleeing from Frankenstein’s monster, there is a real possibility that we will meet Kafka on our path, and not find our way out of the maze of ethical considerations in the nexus between human and nonhuman agencies. In this conceptual article, I map seven dimensions of student surveillance on an experimental matrix of human-algorithmic interaction to consider some of the ethical implications of algorithmic decision-making in higher education. The experimental matrix of human-algorithmic decision-making uses the four tasks of ‘sensing’, ‘processing’, ‘acting’ and ‘learning’ to open up algorithmic-human agency as comprising a number of possibilities such as (1) where only humans perform the task; (2) where the task is shared between humans and algorithms; (3) where algorithms perform the task but with humans supervising; and (4) where algorithms perform the tasks with no human oversight. I use this matrix to engage with seven dimensions of how higher education institutions collect, analyse and use student data namely (1) automation; (2) visibility; (3) directionality; (4) assemblage; (5) temporality; (6) sorting; and (7) structuring. The article concludes by proposing a number of pointers to be taken into consideration when implementing algorithms in a higher education context from a position of an ethics of care.
Article
Full-text available
This paper draws on regulatory governance scholarship to argue that the analytic phenomenon currently known as ‘Big Data’ can be understood as a mode of ‘design-based’ regulation. Although Big Data decision-making technologies can take the form of automated decision-making systems, this paper focuses on algorithmic decision-guidance techniques. By highlighting correlations between data items that would not otherwise be observable, these techniques are being used to shape the informational choice context in which individual decision-making occurs, with the aim of channelling attention and decision-making in directions preferred by the ‘choice architect’. By relying upon the use of ‘nudge’ – a particular form of choice architecture that alters people’s behaviour in a predictable way without forbidding any options or significantly changing their economic incentives, these techniques constitute a ‘soft’ form of design-based control. But, unlike the static Nudges popularised by Thaler and Sunstein [(2008). Nudge. London: Penguin Books] such as placing the salad in front of the lasagne to encourage healthy eating, Big Data analytic nudges are extremely powerful and potent due to their networked, continuously updated, dynamic and pervasive nature (hence ‘hypernudge’). I adopt a liberal, rights-based critique of these techniques, contrasting liberal theoretical accounts on the one hand with selective insights from science and technology studies (STS) and surveillance studies on the other. I argue that concerns about the legitimacy of these techniques are not satisfactorily resolved through reliance on individual notice and consent, touching upon the troubling implications for democracy and human flourishing if Big Data analytic techniques driven by commercial self-interest continue their onward march unchecked by effective and legitimate constraints.
Article
Full-text available
The fairy tale is arguably one of the most important cultural and social influences on children's lives. But until the first publication of Fairy Tales and the Art of Subversion, little attention had been paid to the ways in which the writers and collectors of tales used traditional forms and genres in order to shape children's lives – their behavior, values, and relationship to society.
Article
Full-text available
The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational data mining. This paper first introduces the field of learning analytics and outlines the lessons learned from well-known case studies in the research literature. The paper then identifies the critical topics that require immediate research attention for learning analytics to make a sustainable impact on the research and practice of learning and teaching. The paper concludes by discussing a growing set of issues that if unaddressed, could impede the future maturation of the field. The paper stresses that learning analytics are about learning. As such, the computational aspects of learning analytics must be well integrated within the existing educational research.
Article
Full-text available
In the era of the Internet, mobile technologies and open education, the need for interventions to improve the efficiency and quality of higher education has become pressing. Big data and learning analytics can help drive these interventions and reshape the future of higher education. Basing decisions on data and empirical evidence seems incredibly obvious. Yet higher education, a field that gathers an enormous amount of data about its “customers,” has traditionally been inefficient in its use of data, often operating with considerable delay in analysing them even when they are immediately available. In this article, we highlight the value of data-analysis techniques for higher education and present a development model for learning-related data. Learning is, of course, a complex phenomenon, and describing it through analytic tools is not straightforward; the article therefore also presents the main ethical and pedagogical issues connected with the use of data-analysis techniques in education. Nevertheless, learning analytics can penetrate the fog of uncertainty surrounding the future of higher education and make clearer how to allocate resources, how to develop competitive advantages and, above all, how to improve the quality and value of the learning experience.
Article
Full-text available
The field of learning analytics has the potential to enable higher education institutions to increase their understanding of their students’ learning needs and to use that understanding to positively influence student learning and progression. Analysis of data relating to students and their engagement with their learning is the foundation of this process. There is an inherent assumption linked to learning analytics that knowledge of a learner’s behavior is advantageous for the individual, instructor, and educational provider. It seems intuitively obvious that a greater understanding of a student cohort and the learning designs and interventions they best respond to would benefit students and, in turn, the institution’s retention and success rate. Yet collection of data and their use face a number of ethical challenges, including location and interpretation of data; informed consent, privacy, and deidentification of data; and classification and management of data. Approaches taken to understand the opportunities and ethical challenges of learning analytics necessarily depend on many ideological assumptions and epistemologies. This article proposes a sociocritical perspective on the use of learning analytics. Such an approach highlights the role of power, the impact of surveillance, the need for transparency, and an acknowledgment that student identity is a transient, temporal, and context-bound construct. Each of these affects the scope and definition of learning analytics’ ethical use. We propose six principles as a framework for considerations to guide higher education institutions to address ethical issues in learning analytics and challenges in context-dependent and appropriate ways.
Article
This article summarizes some emerging concerns as learning analytics become implemented throughout education. The article takes a sociotechnical perspective — positioning learning analytics as shaped by a range of social, cultural, political, and economic factors. In this manner, various concerns are outlined regarding the propensity of learning analytics to entrench and deepen the status quo, disempower and disenfranchise vulnerable groups, and further subjugate public education to the profit-led machinations of the burgeoning “data economy.” In light of these charges, the article briefly considers some possible areas of change. These include the design of analytics applications that are more open and accessible, that offer genuine control and oversight to users, and that better reflect students’ lived reality. The article also considers ways of rethinking the political economy of the learning analytics industry. Above all, learning analytics researchers need to begin talking more openly about the values and politics of data-driven analytics technologies as they are implemented along mass lines throughout school and university contexts.
Article
This brief paper develops a series of provocations against the current forms of Learning Analytics that are beginning to be implemented in higher education contexts. The paper highlights a number of ways in which Learning Analytics can be experienced as discriminatory, oppressive and ultimately disadvantaging across whole student populations, and considers the limitations of current efforts within educational data science to increase awareness of ‘ethics’ and ‘social good’. This culminates in a stark choice: is it possible to substantially improve the field of Learning Analytics as it currently stands, or should we abandon it in favour of new forms of applying data science that are aligned with the experiences of non-conforming ‘learners’ and un-categorizable forms of ‘learning’?
Article
‘Data as technology’ has always been, and continues to be an essential part of the structuring of South African society and education, during and post-colonialism and post-apartheid. In the reconfiguration of South African education post-apartheid, student data constitutes a data frontier as un-mapped, under-utilised and ready for the picking. This article maps the data frontier in the nexus of higher education in the global/colonial present and the data imaginary that provides a particular vision in service of a neoliberal discursive position and ideological orientation. As such, the data frontier acts as ‘generative matrix’ for educational policy attempting to address the legacies of colonialism and apartheid. The value contribution of this article lies in its positioning of the data imaginary in the context of a neoliberal approach to education in the context of the Global South.
Article
The state of the world keeps me up at night, questioning my role as a social justice educator. I think with, through, and around what social change means. Reflecting on my practice, I have followed Western/colonial research and educational methodologies, knowing that they need to be challenged but often being unable to do so. I make present this living in contradiction in this personal narrative, a research methodology practiced for generations by people in the global south and by marginalized people in the United States. It is a reckoning of my work as a researcher, teacher, activist, and director of programs in the academic industrial complex. My desire for a decolonial option in art education requires me to interrogate its classificatory lenses. I explore social optics, drawing on examples through three lenses: art as inherently progressive; the interrelationship between visibility and invisibility; and artistic activism for organizing and building solidarity.
Article
This paper contributes a theoretical framework informed by historical, philosophical and ethnographic studies of science practice to argue that data should be considered to be actively produced, rather than passively collected. We further argue that traditional school science laboratory investigations misconstrue the nature of data and overly constrain student agency in their production. We use our “Data Production” framework to analyze the activity of, and interviews with, high school students who created data using sensors and software in a ninth-grade integrated science class. To understand the opportunities for students to develop, act with, and perceive agency in data production, we analyze in detail the case of one student as she came to use unfamiliar technologies to produce data for her own personally relevant purposes. We find that her purposes for producing data emerged as she worked, and that resistances to her purposes were opportunities to act with and perceive her own agency, and to see data in new ways. We discuss implications for designing science learning experiences in which students act as agents in producing and using data.
Article
The design of effective learning analytics extends beyond sound technical and pedagogical principles. If these analytics are to be adopted and used successfully to support learning and teaching, their design process needs to take into account a range of human factors, including why and how they will be used. In this editorial, we introduce principles of human-centred design developed in other, related fields that can be adopted and adapted to support the development of Human-Centred Learning Analytics (HCLA). We draw on the papers in this special section, together with the wider literature, to define human-centred design in the field of learning analytics and to identify the benefits and challenges that this approach offers. We conclude by suggesting that HCLA will enable the community to achieve more impact, more quickly, with tools that are fit for purpose and a pleasure to use.
Conference Paper
This exploratory study challenges current practices in cognitive load measurement by using multichannel data to investigate cognitive load affordances during online complex problem solving. Moreover, it is an attempt to investigate how cognitive load is related to strategy use. Accordingly, in the current study a well-structured and an ill-structured problem were developed in a virtual learning environment. Online support was provided. Participants were 15 students from the teacher training program. This study incorporated subjective measurements of students' cognitive load (i.e., intrinsic, extraneous and germane load, and their mental effort) combined with physiological data comprising galvanic skin response (GSR) and skin temperature (ST). A first aim was to investigate whether there was a significant difference in the subjective measurements, physiological data and consultation of support between the well- and the ill-structured problem. Secondly, this study investigated how individual differences in the subjective measurements are related to individual differences in the physiological data and consultation of support. Results reveal significant differences in intrinsic load and mental effort between the well- and the ill-structured problem. Moreover, when investigating individual differences, findings reveal that GSR might be related to mental effort. Additionally, results indicate that cognitive load influences strategy use. Future research with larger sample sizes should verify these findings in order to gain more insight into how we can measure cognitive load and how it is related to self-directed learning. These insights should allow us to provide adaptive support in virtual learning environments.
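The individual-differences analysis this abstract describes — relating self-reported mental effort to GSR across participants — can be sketched as a correlation over per-participant aggregates. The data below are hypothetical, and a plain Pearson coefficient stands in for whatever statistic the authors actually used:

```python
def pearson_r(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-participant aggregates (the study had n = 15; 5 shown here)
mental_effort = [6.0, 4.5, 7.0, 5.0, 8.0]  # self-reported mental effort ratings
mean_gsr = [4.1, 3.2, 4.8, 3.5, 5.3]       # mean galvanic skin response (microsiemens)

r = pearson_r(mental_effort, mean_gsr)
print(f"effort-GSR correlation: r = {r:.2f}")
```

A positive r here would be consistent with the study's suggestion that GSR "might be related" to mental effort; with samples this small, any such estimate is fragile, which is exactly why the abstract calls for larger replications.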
Conference Paper
We study the usage of a self-guided online tutoring platform called Algebra Nation, which is widely used by middle school and high school students who take the End-of-Course Algebra I exam at the end of the school year. This article aims to study how the platform contributes to increasing students' exam scores by examining users' logs over a three-year period. The platform under consideration grew from more than 36,000 student users in the first year to nearly 67,000 by the third year, thus enabling us to examine how usage patterns evolved and influenced students' performance at scale. We first identify which Algebra Nation usage factors, in conjunction with overall math preparation and socioeconomic factors, contribute to students' exam performance. Subsequently, we investigate the effect of increased teacher familiarity with Algebra Nation on students' scores across different grades through mediation analysis. The results show that the indirect effect of teachers' familiarity with the platform, operating through increased student usage dosage, is more significant in higher grades.
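The mediation analysis mentioned here (an indirect effect of teacher familiarity on scores via student usage) follows the standard product-of-coefficients logic: regress the mediator on the treatment (the a-path), regress the outcome on mediator and treatment together (the b-path), and multiply. The variable names and toy data below are illustrative, not the paper's:

```python
def slope(x, y):
    """OLS slope of y on a single predictor x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def two_predictor_ols(x1, x2, y):
    """Coefficients of y ~ x1 + x2 via the centred 2x2 normal equations."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return (s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det

# Illustrative data: familiarity -> usage -> exam score
familiarity = [0, 1, 2, 3]
usage = [1, 1, 3, 7]
score = [3, 4, 11, 24]

a = slope(familiarity, usage)                               # a-path
b, direct = two_predictor_ols(usage, familiarity, score)    # b-path, controlling for familiarity
indirect = a * b
print(f"indirect effect = {a:.1f} * {b:.1f} = {indirect:.1f}")  # -> 2.0 * 3.0 = 6.0
```

In practice the significance of the indirect effect would be assessed with bootstrapped confidence intervals rather than read off the point estimate.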
Conference Paper
The decision on what item to learn next in a course can be supported by a recommender system (RS), which aims at making the learning process more efficient and effective. However, learners and learning activities frequently change over time. The question is: how are timely, appropriate recommendations of learning resources actually evaluated, and how can they be compared? Researchers have found that, in addition to a standardized dataset definition, there is also a lack of standardized definitions of evaluation procedures for RS in the area of Technology Enhanced Learning. This paper argues that, in a closed-course setting, a time-dependent split into training set and test set is more appropriate than the usual cross-validation for evaluating the Top-N recommended learning resources at various points in time. Moreover, a new measure is introduced to determine the timeliness deviation between the point in time of an item recommendation and the point in time of the actual access by the user. Different recommender algorithms, including two novel ones, are evaluated with the time-dependent evaluation framework, and the results, as well as the appropriateness of the framework, are discussed.
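The two ideas in this abstract can be illustrated concretely: split interaction logs at a point in time rather than into random cross-validation folds, and score recommendations by how far the recommendation time lies from the learner's actual access time. The paper's exact metric definition may differ; this is a sketch with invented field names:

```python
def time_split(events, cutoff):
    """Split (user, item, t) interaction logs at a time cutoff instead of random folds."""
    train = [e for e in events if e["t"] < cutoff]
    test = [e for e in events if e["t"] >= cutoff]
    return train, test

def mean_timeliness_deviation(recommended_at, accessed_at):
    """Mean absolute gap between when each (user, item) pair was recommended
    and when the learner actually accessed it; never-accessed items are ignored."""
    gaps = [abs(accessed_at[k] - t_rec)
            for k, t_rec in recommended_at.items() if k in accessed_at]
    return sum(gaps) / len(gaps) if gaps else None

# Toy logs with integer timestamps (e.g., days since course start)
events = [{"user": "u1", "item": "a", "t": 1},
          {"user": "u1", "item": "b", "t": 5},
          {"user": "u2", "item": "a", "t": 8}]
train, test = time_split(events, cutoff=4)

recs = {("u1", "b"): 3, ("u2", "a"): 9}      # when the RS surfaced each item
accesses = {("u1", "b"): 5, ("u2", "a"): 8}  # when the learner actually opened it
print(mean_timeliness_deviation(recs, accesses))  # -> 1.5
```

The point of the temporal split is that the model never sees the future: a random fold would let interactions from week 10 inform recommendations evaluated against week 2, which a deployed course RS could never do.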
Book
A guide to understanding the inner workings and outer limits of technology and why we should never assume that computers always get it right. In Artificial Unintelligence, Meredith Broussard argues that our collective enthusiasm for applying computer technology to every aspect of life has resulted in a tremendous amount of poorly designed systems. We are so eager to do everything digitally—hiring, driving, paying bills, even choosing romantic partners—that we have stopped demanding that our technology actually work. Broussard, a software developer and journalist, reminds us that there are fundamental limits to what we can (and should) do with technology. With this book, she offers a guide to understanding the inner workings and outer limits of technology—and issues a warning that we should never assume that computers always get things right. Making a case against technochauvinism—the belief that technology is always the solution—Broussard argues that it's just not true that social problems would inevitably retreat before a digitally enabled Utopia. To prove her point, she undertakes a series of adventures in computer programming. She goes for an alarming ride in a driverless car, concluding “the cyborg future is not coming any time soon”; uses artificial intelligence to investigate why students can't pass standardized tests; deploys machine learning to predict which passengers survived the Titanic disaster; and attempts to repair the U.S. campaign finance system by building AI software. If we understand the limits of what we can do with technology, Broussard tells us, we can make better choices about what we should do with it to make the world better for everyone.
Book
As seen in Wired and Time A revealing look at how negative biases against women of color are embedded in search engine results and algorithms Run a Google search for “black girls”—what will you find? “Big Booty” and other sexually explicit terms are likely to come up as top search terms. But, if you type in “white girls,” the results are radically different. The suggested porn sites and un-moderated discussions about “why black women are so sassy” or “why black women are so angry” present a disturbing portrait of black womanhood in modern society. In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance—operating as a source for email, a major vehicle for primary and secondary school learning, and beyond—understanding and reversing these disquieting trends and discriminatory practices is of utmost importance. An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
Conference Paper
Developing communication skills in higher education students can be a challenge for professors due to the time needed to provide formative feedback. This work presents RAP, a scalable system to provide automatic feedback to entry-level students to develop basic oral presentation skills. The system improves on the state of the art by analyzing the posture, gaze, volume, filled pauses and slides of presenters through data captured by very low-cost sensors. The system also provides an off-line feedback report with multimodal recordings of their performance. An initial evaluation indicates that the system's feedback agrees highly with human feedback and that students considered the feedback useful for developing their oral presentation skills.
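Of the low-cost signals such a system analyzes, volume is the simplest to illustrate: frame-wise RMS energy over microphone samples, with low-energy frames flagged as pauses (detecting *filled* pauses such as "um" would additionally require pitch or spectral features). This is a generic sketch, not the RAP implementation:

```python
def frame_rms(samples, frame_len):
    """RMS energy per non-overlapping frame of audio samples."""
    return [(sum(s * s for s in samples[i:i + frame_len]) / frame_len) ** 0.5
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def pause_frames(samples, frame_len, threshold):
    """Indices of frames whose RMS falls below a silence threshold."""
    return [i for i, rms in enumerate(frame_rms(samples, frame_len))
            if rms < threshold]

# Toy signal: loud speech, near-silence, loud speech (4 samples per frame)
signal = [0.8, -0.7, 0.9, -0.8,
          0.01, -0.02, 0.01, 0.0,
          0.7, -0.9, 0.8, -0.6]
print(frame_rms(signal, 4))
print(pause_frames(signal, 4, threshold=0.1))  # -> [1]
```

A feedback report could then summarize, for example, mean volume and the proportion of pause frames, which is the kind of aggregate a presenter can act on.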
Article
Knowledges from academic and professional research-based institutions have long been valued over the organic intellectualism of those who are most affected by educational and social inequities. In contrast, participatory action research (PAR) projects are collective investigations that rely on indigenous knowledge, combined with the desire to take individual and/or collective action. PAR with youth (YPAR) engages in rigorous research inquiries and represents a radical effort in education research to value the inquiry-based knowledge production of the youth who directly experience the educational contexts that scholars endeavor to understand. In this chapter, we outline the foundations of YPAR and examine the distinct epistemological, methodological, and pedagogical contributions of an interdisciplinary corpus of YPAR studies and scholarship. We outline the origins and disciplines of YPAR and make a case for its role in education research, discuss its contributions to the field and the tensions and possibilities of YPAR across disciplines, and close by proposing a YPAR critical-epistemological framework that centers youth and their communities, alongside practitioners, scholars, and researchers, as knowledge producers and change agents for social justice.
Article
Pro-market and business approaches to management in the public sector (new public management – NPM) have created an audit culture in schools driven by top-down, high stakes accountability, and the fetishization of data. Within this context, authentic, qualitative, and democratic forms of inquiry, both in universities and schools, become easily co-opted. I argue in this article that the use of a community-based, participatory action research (PAR) stance has the potential to disrupt NPM and open up authentic and democratic spaces in which to engage in inquiry. The goal of democratization through a PAR stance is not an attempt to return to a pre-data driven past nor to make current neoliberal reforms more palatable, but rather to create more horizontal relationships among professionals, colleges of education, public schools, and low-income communities.
Conference Paper
As higher education increasingly moves to online and digital learning spaces, we have access not only to greater volumes of student data, but also to increasingly fine-grained and nuanced data. A significant body of research and existing practice are used to convince key stakeholders within higher education of the potential of the collection, analysis and use of student data to positively impact student experiences in these environments. Much of the recent focus in learning analytics is on predictive modeling and uses of artificial intelligence both to identify learners at risk and to personalize interventions to increase the chance of success. In this paper we explore the moral and legal basis for the obligation to act on our analyses of student data. The obligation to act entails not only the protection of student privacy and the ethical collection, analysis and use of student data, but also the effective allocation of resources to ensure appropriate and effective interventions to increase effective teaching and learning. The obligation to act is, however, tempered by a number of factors, including inter- and intra-departmental operational fragmentation and the constraints imposed by changing funding regimes. Increasingly, higher education institutions allocate resources in areas that promise the greatest return. Choosing (not) to respond to the needs of specific student populations then raises questions regarding the scope and nature of the moral and legal obligation to act. There is also evidence that students who are at risk of failing often do not respond to institutional interventions to assist them. In this paper we build and expand on recent research by, for example, the LACE and EP4LA workshops to conceptually map the obligation to act, which flows both from higher education's mandate to ensure effective and appropriate teaching and learning and from its fiduciary duty to provide an ethical and enabling environment for students to achieve success.
We examine how the collection and analysis of student data links to both the availability of resources and the will to act and also to the obligation to act. Further, we examine how that obligation unfolds in two open distance education providers from the perspective of a key set of stakeholders – those in immediate contact with students and their learning journeys – the tutors or adjunct faculty.
Article
There are good reasons for higher education institutions to use learning analytics to risk-screen students. Institutions can use learning analytics to better predict which students are at greater risk of dropping out or failing, and use the statistics to treat ‘risky’ students differently. This paper analyses this practice using normative theories of discrimination. The analysis suggests the principal ethical concern with the differing treatment is the failure to recognize students as individuals, which may impact on students as agents. This concern is cross-examined drawing on a philosophical argument that suggests there is little or no distinctive difference between assessing individuals on group risk statistics and using more ‘individualized’ evidence. This paper applies this argument to the use of learning analytics to risk-screen students in higher education. The paper offers reasons to conclude that judgment based on group risk statistics does involve a distinctive failure in terms of assessing persons as individuals. However, instructional design offers ways to mitigate this ethical concern with respect to learning analytics. These include designing features into courses that promote greater use of effort-based factors and dynamic rather than static risk factors, and greater use of sets of statistics specific to individuals.
Article
In this position paper we contrast a Dystopian view of the future of adaptive collaborative learning support (ACLS) with a Utopian scenario that – due to better-designed technology, grounded in research – avoids the pitfalls of the Dystopian version and paints a positive picture of the practice of computer-supported collaborative learning 25 years from now. We discuss research that we see as important in working towards a Utopian future in the next 25 years. In particular, we see a need to work towards a comprehensive instructional framework building on educational theory. This framework will allow us to provide nuanced and flexible (i.e. intelligent) ACLS to collaborative learners – the type of support we sketch in our Utopian scenario.
Article
Learning analytics is a significant area of technology-enhanced learning that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.
Conference Paper
Contrary to the fairly established notion in the learning sciences that un-scaffolded processes rarely lead to meaningful learning, this study reports a hidden efficacy of such processes and a method for extracting it. Compared to scaffolded, well-structured problem-solving groups, un-scaffolded, ill-structured problem-solving groups struggled with defining and solving the problems. Their discussions were chaotic and divergent, resulting in poor group performance. However, despite failing in their problem-solving efforts, these participants outperformed their counterparts in the well-structured condition on transfer measures, suggesting a latent productivity in the failure. The study's contrasting-case design provided participants in the un-scaffolded condition with an opportunity to contrast the ill-structured problems that they had solved in groups with the well-structured problems they solved individually afterwards. This contrast facilitated a spontaneous transfer, helping them perform significantly better on the individual ill-structured problem-solving tasks subsequently. Implications of productive failure for the development of adaptive expertise are discussed.