Companion Proceedings 8th International Conference on Learning Analytics & Knowledge (LAK18)
Creative Commons License, Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
How am I Doing?: Student-Facing Performance Dashboards in
Higher Education
Carl C. Haynes1, Stephanie D. Teasley1, Stephanie Hayley2, Meghan Oster3 & John Whitmer4
1School of Information, 2Academic Innovation, 3School of Education
University of Michigan 105 S. State St., Ann Arbor, MI 48103 USA
4Blackboard, Inc. 58 Maiden Lane, San Francisco, CA 94108 USA
[cchaynes, steasley, swooten, omeghan]@umich.edu, john.whitmer@blackboard.com
ABSTRACT: Interest in implementing student-facing performance dashboards in higher
education is increasing, but there are few studies directly addressing how students perceive
and interpret these dashboards. We presented 47 undergraduate students with simulated
activity and performance feedback from a Learning Management System dashboard
representing three time points in an academic semester (early, mid, late). Using a 2 (Feedback
Condition: High, Low Performance) x 2 (GPA: High, Low) design, we investigated how students
interpret this information, looking specifically at whether the content of the feedback and its
consistency with students’ overall academic achievement affected students’ responses to the
dashboard. Our results showed that most students found the dashboard visualizations
informative. Students’ anticipated use of such systems, and their interpretations of the
feedback’s impact, differed based on their prior academic achievement and the specific
feedback the system provided. Low-GPA students, in particular, found the dashboard
messages more useful than high-GPA students did.
Keywords: Learning Analytics; Higher Education; Academic Technology; Dashboards;
Feedback; Student Motivation.
1 INTRODUCTION
Learning analytics dashboards provide a powerful means to present information to different
stakeholders in academia (Bodily & Verbert, 2017). Most existing educational dashboards are aimed
at academic professionals such as administrators, advisors, and instructors; however, students are
increasingly the users themselves (Teasley, 2017). Student-facing dashboards have typically included features that
display comparative feedback about students’ performance relative to course peers (Beheshitha et
al., 2016) intended to produce “actionable insight” (Broos et al., 2017). However, little is known about
the impact of student-facing dashboards. Are students able to interpret the information provided by
such systems, and do they know what to do with it? Perhaps most importantly, which students find
this information motivating versus demotivating, and under which circumstances? We designed a
study to investigate the following research questions based on those raised by the existing literature:
RQ1: Do students value dashboard notifications providing information about their academic
standing in their courses?
RQ2: Do students value visualizations that present information about their course
performance relative to their peers?
RQ3: Do students find comparative performance information motivating or demotivating, and
does this vary by the nature of the feedback delivered (high vs. low performance) and
academic standing (GPA)?
2 METHOD
Forty-seven undergraduates from a large research university in the United States participated in this
study. Students were randomly assigned to one of two groups, a “high” performance feedback
condition or a “low” performance feedback condition, in which they viewed a summary message and
a graphic presentation of their login activity and performance (grade summary) representing three
time points in a semester (early, mid, late). A survey administered after the session captured students’
views (on a scale from 1 = Strongly Disagree to 5 = Strongly Agree) about the dashboard content and students’
general preferences for future use of this system.
3 RESULTS
Using a 2 (Performance condition) x 2 (GPA) design, we ran ANOVAs to test for significant differences
between experimental conditions. The results are shown in Table 2 on the poster.
RQ1: Do students value dashboard notifications providing information about their academic standing
in their courses? Students’ opinions about the system were generally high, with no survey
questions rated below M = 3.64. Two of the three highest overall ratings were for questions
related to future use (Q10) M = 4.55 (range 4.20-4.70) and finding the graphs informative (Q7) M
= 4.55 (range 4.43-4.80). Overall, the students preferred the graphs (Qs5-8) to the activity stream
messages (Qs1-4).
RQ2: Do students value visualizations that present information about their course performance
relative to their peers? Question 5, which asked students whether the graphs helped them
understand their position in the course, received the highest overall rating of all the survey
questions M = 4.68 (range 4.28-4.93).
RQ3: Do students find comparative performance information (de)motivating, and does this vary by the
nature of the feedback delivered (high vs. low performance) and academic standing (GPA)? The
only statistically significant differences were due to GPA for the two questions regarding activity
messages. High GPA students in both feedback conditions were less motivated to take immediate
action (Q1: F (2, 43) = 6.12, p < .02), less likely to turn on the summary messages (Q2: F (2, 44) =
5.31, p < .03) and would check them less often (Q3: F (2, 42) = 4.94, p < .03) than those with a Low
GPA. Students in the Low-Performance condition were significantly more likely than students in
the High-Performance conditions to find the follow-up actions useful (Q4: F (2, 44) = 6.07, p < .02).
REFERENCES
Beheshitha, S. S., Hatala, M., Gašević, D., & Joksimović, S. (2016, April). The role of achievement goal
orientations when studying effect of learning analytics visualizations. In Proceedings of the
Sixth International Conference on Learning Analytics & Knowledge (pp. 54-63). ACM.
Bodily, R., & Verbert, K. (2017, March). Trends and issues in student-facing learning analytics reporting
systems research. In Proceedings of the Seventh International Learning Analytics &
Knowledge Conference (pp. 309-318). ACM.
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for
Actionable Feedback on Learning Skills: Scalability and Usefulness. In International
Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology, Knowledge and
Learning, 1-8.