Students’ Use of Learning Analytics Dashboards and the Impact on Self-Concepts
Carl C. Haynes1, Stuart A. Karabenick2 & Stephanie D. Teasley1
School of Information1, School of Education2
University of Michigan, 105 S. State St., Ann Arbor, MI 48103 USA
[cchaynes, skaraben, steasley]@umich.edu
Keywords: Learning Analytics, Performance Dashboards, Higher Education, Survey, Evaluation
Abstract: College is a critical time when changes in students’ attitudes, knowledge, personality
characteristics, and self-concepts are affected by their face-to-face and online interactions with
educators, peers, and the campus climate (Astin, 1977). The growing use of big data and analytics
in higher education has fostered research that supports human judgement in the analysis of
information about learning and the application of interventions that can aid students in their
development and improve retention rates (Siemens & Baker, 2012). This information is often
displayed in the form of learning analytics dashboards (LADs), which are individual displays with
multiple visualizations of indicators about learners, their learning activities, and/or features of the
learning context both at the individual and group levels (Schwendimann et al., 2017).
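To make the contrast between individual and comparative performance feedback concrete, the sketch below shows one way a dashboard could place a student’s score in the class distribution. All values are hypothetical, and `percentile_rank` is our own illustrative helper, not a function of any LAD described here:

```python
from bisect import bisect_left

def percentile_rank(score: float, class_scores: list[float]) -> float:
    """Percent of classmates scoring strictly below `score`."""
    ordered = sorted(class_scores)
    return 100.0 * bisect_left(ordered, score) / len(ordered)

# Individual feedback: the student sees only their own score.
my_score = 82.0

# Comparative feedback: the same score shown relative to the class.
class_scores = [55.0, 70.0, 82.0, 90.0, 95.0]
print(percentile_rank(my_score, class_scores))  # → 40.0
```

A real dashboard would typically render this comparison as a visualization (e.g., a histogram with the student’s position marked) rather than as a raw percentile.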
The information presented in LADs is intended to support students’ learning competencies
that include metacognitive, cognitive, behavioral, and emotional self-regulation (Jivet et al., 2018).
We investigated the impact of a student-facing LAD on students’ self-concepts and viewing
preferences to address the following questions: What are students’ viewing preferences (i.e., for
individual vs. comparative performance feedback)? How does viewing performance information
affect the development of students’ metacognitive skills and self-concepts? And, what are
students’ perceptions about the usability of LADs?
In an end-of-term survey, 111 students at a large research university responded to 10
Likert-scale items and three open-ended questions. Overall, the students reported understanding the information
that was presented to them through the LAD and that it was useful, although some students
expressed concerns about its accuracy and wanted more detailed information. Students also
reported that they preferred to view comparisons to other students over just viewing their own
performance information, and that LAD use increased positive affect about performance. Students
also reported that dashboard use affected how much they believed they understood the course
material, the time and effort they were willing to put into the course, and that it lessened their
anxiety.
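Likert-scale items like those in the survey are commonly summarized by a mean rating and an agreement rate. The sketch below illustrates this with invented ratings, not data from this study:

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree,
# 5 = strongly agree) to one item, e.g. "The dashboard was useful."
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

item_mean = mean(responses)
# Share of respondents who agreed or strongly agreed (rating >= 4).
agreement = sum(r >= 4 for r in responses) / len(responses)

print(f"mean rating: {item_mean:.2f}, agreement: {agreement:.0%}")
# → mean rating: 3.90, agreement: 70%
```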
We concluded that course-specific or program-specific outcomes may require
different LAD design and evaluation approaches, and that nonuse of the LAD may be linked to
self-confidence, forgetfulness, and a lack of innovative dashboard features. Our study was limited
by the analysis of survey data (without trace data) and by the sample size. This research contributes
to the literature on student-facing LADs by investigating students’
reasons for interacting with dashboards, their viewing preferences, and how their interactions
affect their performance, and by tying these insights to educational concepts that were part of the
LAD design. Further research is needed to determine whether presenting students with the option
to turn the dashboard on for any or all of their courses during the semester is important,
and whether this increases cognitive load. Finally, researchers should use caution when
interpreting students’ acceptance of the dashboard features, as well as nonuse, as these may be
related to a lack of awareness of available dashboard features or the student data institutions have,
as well as students’ self-confidence.
References
Astin, A. W. (1977). Four critical years: Effects of college on beliefs, attitudes, and knowledge.
San Francisco, CA: Jossey-Bass.
Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018). License to evaluate: Preparing learning
analytics dashboards for educational practice. In Proceedings of the 8th International
Conference on Learning Analytics and Knowledge. ACM.
Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S.,
Holzer, A., ... & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic
literature review of learning dashboard research. IEEE Transactions on Learning
Technologies, 10(1), 30-41.
Siemens, G., & Baker, R. S. J. d. (2012, April). Learning analytics and educational data mining:
Towards communication and collaboration. In Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge (pp. 252-254). ACM.