How am I Doing?: Student-Facing Performance Dashboards in Higher Education
Carl C. Haynes1, Stephanie D. Teasley1, Stephanie Hayley2, Meghan Oster3 & John Whitmer4
1School of Information, 2Academic Innovation, 3School of Education,
University of Michigan, 105 S. State St., Ann Arbor, MI 48103 USA
4Blackboard, Inc., 58 Maiden Lane, San Francisco, CA 94108 USA
[cchaynes, steasley, swooten, omeghan]@umich.edu, john.whitmer@blackboard.com
ABSTRACT: Interest in implementing student-facing performance dashboards in higher
education is increasing, but there are few studies directly addressing how students perceive
and interpret these dashboards. We presented 47 undergraduate students with simulated
activity and performance feedback from a Learning Management System dashboard
representing 3 time points in an academic semester (early, mid, late). Using a 2 (Feedback
Condition: High, Low Performance) x 2 (GPA: High, Low) design, we investigated how students
interpret this information, looking specifically at whether the content of the feedback and its
consistency with students’ overall academic achievement affected students’ responses to the
dashboard. Our results showed that most students found the dashboard visualizations
informative. However, students’ potential use of such systems, and how they interpreted
the impact of the information provided, varied with their prior academic achievement and
with the specific feedback provided by the system. Low-GPA students, in particular, found
the dashboard messages more useful than high-GPA students.
Keywords: Learning Analytics; Higher Education; Academic Technology; Dashboards;
Feedback; Student Motivation.
1 INTRODUCTION
Learning analytics dashboards provide a powerful means to present information to different
stakeholders in academia (Bodily & Verbert, 2017). Most existing educational dashboards are aimed
at academic professionals such as administrators, advisors, and instructors; however, students
themselves are increasingly the users (Teasley, 2017). Student-facing dashboards have typically
included features that
display comparative feedback about students’ performance relative to course peers (Beheshitha et
al., 2016) intended to produce “actionable insight” (Broos et al., 2017). However, little is known about
the impact of student-facing dashboards. Are students able to interpret the information provided by
such systems, and do they know what to do with it? Perhaps most importantly, which students find
this information motivating versus demotivating, and under which circumstances? We designed a
study to investigate the following research questions based on those raised by the existing literature:
RQ1: Do students value dashboard notifications providing information about their academic
standing in their courses?
RQ2: Do students value visualizations that present information about their course
performance relative to their peers?
RQ3: Do students find comparative performance information motivating or demotivating, and
does this vary by the nature of the feedback delivered (high vs. low performance) and
academic standing (GPA)?
2 METHOD
Forty-seven undergraduates from a large research university in the United States participated in this
study. Students were randomly assigned to one of two groups, a “high” performance feedback
condition or a “low” performance feedback condition, in which they viewed a summary message and
a graphic presentation of their login activity and performance (grade summary) representing three
time points in a semester (early, mid, late). A survey administered after the session captured students’
views (scale from 1 = Strongly Disagree to 5 = Strongly Agree) about the dashboard content and students’
general preferences for future use of this system.
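To make the resulting design concrete, the sketch below (ours, not part of the authors’ materials) builds participant records that combine the randomly assigned feedback condition with the observed GPA group; the 3.0 cutoff, the simulated GPAs, and the field names are assumptions for illustration only.

```python
# Illustrative sketch only: assignment in a 2 (Feedback: High/Low) x
# 2 (GPA: High/Low) design. GPA is an observed factor, not assigned;
# the 3.0 split, simulated GPAs, and field names are hypothetical.
import random

random.seed(0)  # reproducible illustration

participants = [{"id": i, "gpa": round(random.uniform(2.0, 4.0), 2)}
                for i in range(1, 48)]  # 47 participants, as in the study

random.shuffle(participants)
for idx, p in enumerate(participants):
    p["feedback"] = "High" if idx % 2 == 0 else "Low"      # random assignment
    p["gpa_group"] = "High" if p["gpa"] >= 3.0 else "Low"  # assumed cutoff

print(participants[0])
```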
3 RESULTS
Using a 2 (Feedback Condition) x 2 (GPA) design, we ran ANOVAs to test for significant differences
between experimental conditions. The results are shown in Table 2 on the poster.
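For readers who want to run this style of analysis themselves, the sketch below (our illustration, not the authors’ analysis code) fits the corresponding two-way ANOVA with statsmodels on invented ratings; the data and column names are assumptions.

```python
# Illustrative sketch: a 2 (feedback) x 2 (GPA) between-subjects ANOVA on a
# single Likert item, using invented data. Not the authors' analysis code.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical 1-5 ratings for one survey question, 4 students per cell.
data = pd.DataFrame({
    "rating":   [5, 4, 5, 3, 4, 5, 2, 4, 3, 2, 4, 3, 5, 4, 3, 4],
    "feedback": ["High"] * 8 + ["Low"] * 8,
    "gpa":      (["High"] * 4 + ["Low"] * 4) * 2,
})

# Factorial model with both main effects and their interaction.
model = ols("rating ~ C(feedback) * C(gpa)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p for each effect
```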
RQ1: Do students value dashboard notifications providing information about their academic standing
in their courses? Students’ ratings of the system were generally high, with no survey
question rated below M = 3.64. Two of the three highest overall ratings were for questions
related to future use (Q10) M = 4.55 (range 4.20-4.70) and finding the graphs informative (Q7) M
= 4.55 (range 4.43-4.80). Overall, the students preferred the graphs (Qs5-8) to the activity stream
messages (Qs1-4).
RQ2: Do students value visualizations that present information about their course performance
relative to their peers? Question 5, which asked students whether the graphs helped them
understand their position in the course, received the highest overall rating of all the survey
questions, M = 4.68 (range 4.28-4.93).
RQ3: Do students find comparative performance information (de)motivating, and does this vary by the
nature of the feedback delivered (high vs. low performance) and academic standing (GPA)? The
only statistically significant differences concerned the questions about the activity stream
messages (Qs1-4). High-GPA students in both feedback conditions were less motivated to take immediate
action (Q1: F(2, 43) = 6.12, p < .02), were less likely to turn on the summary messages (Q2: F(2, 44) =
5.31, p < .03), and would check them less often (Q3: F(2, 42) = 4.94, p < .03) than students with low
GPAs. Students in the Low-Performance condition were significantly more likely than students in
the High-Performance condition to find the follow-up actions useful (Q4: F(2, 44) = 6.07, p < .02).
REFERENCES
Beheshitha, S. S., Hatala, M., Gašević, D., & Joksimović, S. (2016, April). The role of achievement goal
orientations when studying effect of learning analytics visualizations. In Proceedings of the
Sixth International Conference on Learning Analytics & Knowledge (pp. 54-63). ACM.
Bodily, R., & Verbert, K. (2017, March). Trends and issues in student-facing learning analytics reporting
systems research. In Proceedings of the Seventh International Learning Analytics &
Knowledge Conference (pp. 309-318). ACM.
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for
Actionable Feedback on Learning Skills: Scalability and Usefulness. In International
Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
Teasley, S. D. (2017). Student Facing Dashboards: One Size Fits All? Technology, Knowledge and
Learning, 1-8.