Preliminary version / draft; original published in: Leitner P., Ebner M., Geisswinkler H., Schön S. (2021) Visualization of
Learning for Students: A Dashboard for Study Progress – Development, Design Details, Implementation, and User Feedback. In:
Sahin M., Ifenthaler D. (eds) Visualizations and Dashboards for Learning Analytics. Advances in Analytics for Learning and
Teaching. Springer, Cham. https://doi.org/10.1007/978-3-030-81222-5_19
VISUALIZATION OF LEARNING FOR STUDENTS: A DASHBOARD
FOR STUDY PROGRESS
Development, design details, implementation, and user feedback
Philipp Leitner, Graz University of Technology, philipp.leitner@tugraz.at
Martin Ebner, Graz University of Technology, martin.ebner@tugraz.at
Hanna Geisswinkler, Graz University of Technology, hanna.geisswinkler@tugraz.at
Sandra Schön, Graz University of Technology, sandra.schoen@tugraz.at
Abstract: At Graz University of Technology (TU Graz, Austria), the learning management system based on Moodle¹ is called
TeachCenter. Together with a campus management system called TUGRAZonline, it is the main infrastructure
for digital teaching and general study issues. As central instances for both teachers and students, various services
and support for them are offered. The latest developments include the design and implementation of a study
progress dashboard for students. This dashboard is intended to provide students a helpful overview of their
activities: It shows their academic performance in ECTS compared to the average of their peers, their own study
progress and the official study recommendation as well as the progress in the various compulsory and optional
courses. The first dashboard prototype was introduced to all computer science students in May 2020, and a
university-wide rollout started in December 2020. The chapter describes design considerations and design
development work, implementation, as well as the user feedback on the implementation. Finally, the authors present
recommendations as guidelines for similar projects based on their experience and students’ feedback and give an
outlook for future development and research.
Key words: Learner support, feedback, dashboard, study progress, university
1. INTRODUCTION
At Graz University of Technology (TU Graz), the organizational unit Educational Technology has extensive experience in learning analytics and visualizations, including for our Austria-wide MOOC platform iMooX.at (Maier, Leitner & Ebner, 2019; Leitner, Maier & Ebner, 2020), for the university-wide learning management system TeachCenter (Leitner, Ebner & Ebner, 2019), and through numerous international research cooperations (De Laet et al., 2018a; De Laet et al., 2018b). When students expressed the wish to get a better and easier overview of their study progress, we were happy to comply.
TU Graz students are not alone in this wish. Reimers and Neovesky (2015), for example, asked German students (N=194) what they want from their dashboards. Their findings: “Nearly all questioned students stated that they would like to see all information relevant to their studies in one central place. 93% expressed agreement by selecting 1 or 2 on the scale. Almost as many (85%) agreed (selecting 1 or 2) on wanting an overview of deadlines to better organize their studies.” Building on their investigation of typical LA dashboards available to students, which very often use data from the learning management system but not from a student information system with grades, the authors complain: “Yet, none of the online platforms discussed above provides these two features” (p. 403).
In this article, we trace the development of the TU Graz students’ study progress dashboard and present its design aspects in detail. We also refer to our experiences and lessons learned so that practitioners can take our approach as a blueprint and gain helpful insights into our challenges.
¹ https://moodle.org/ – last accessed February 10, 2021
2. INITIAL SITUATION AND ADDRESSED OBJECTIVES
At Graz University of Technology (TU Graz) the learning management system (Moodle) is called
TeachCenter and together with the campus management system called TUGRAZonline it is the main
infrastructure for teaching. As central instances, they offer various services and support for both teachers and students. In 2019, the information students could find there concerning their study progress consisted of the courses successfully completed to date and the ECTS achieved, including data on other examination candidates (for example, the failure rate for a particular examination). The design of this
information was largely based on textual information spread across several pages (see Figure 1).
[Add Fig. 1 here]
Figure 1. A screenshot of students’ information from the campus management system. Source: TU Graz.
The TU Graz study progress dashboard for students is intended to provide a helpful overview of students'
activities, for example their academic performance in ECTS compared to the average of their peers, their
own study progress and the official study recommendation as well as the progress in the various
compulsory and optional courses. By visualizing learning data, students can keep an eye on their own learning process, which can ultimately lead to an improvement in their learning success. In detail, the advantages and objectives for students are (TU Graz, 2021):
- Students can see their learning achievements graphically.
- Students can regularly check their own learning progress.
- Students can determine their individual learning status on the basis of a comparison group.
- Students can optimize their learning process.
- Each student only sees his/her own results.
Teachers as well as the university administration of Graz University of Technology should also benefit from the use of learning analytics (TU Graz, 2021a): The visualizations and evaluations help to better understand teaching and learning processes, reduce dropouts, lead to more transparency, result in higher examination activity, make an important contribution to optimizing study departments and student counseling, and are the subject of research with careful handling of data.
3. STUDENTS’ DASHBOARDS AND DESIGN CONSIDERATIONS
As described above, we could build upon several years of experience and research in the field of LA in
higher education. Basically, we are aware of potential challenges: Greller and Drachsler (2016) have
developed a well-known framework model for the development and deployment of LA applications. From
our perspective for the higher education context (in particular building upon Ferguson et al., 2016) we see
seven challenges as crucial (Leitner, Ebner & Ebner, 2019): purpose and benefits of learning analytics,
privacy protection, development of a clear procedure with regard to desired and undesired scenarios (ethics),
the data, infrastructure, development and operation (incl. estimation of foreseeable costs), as well as other activities, e.g., the training of supervisors.
Building upon this, we decided, for example, to only use data which is practically already available to students, for example the results of lectures from more than 5 years. This way, we could be sure that no new objections could arise concerning the use of the data.
At the same time, we had to make sure that only students could receive the information on their grades
and study status, as nothing else has been provided for so far. We did this while also being aware that students do not always appreciate their data being used for student counseling purposes: West et al. (2020a) asked more than 2,000 Australian students about their views on learning analytics, and more than half of them had concerns that their data could be used “to trigger support services to contact you” or “trigger academic staff to contact you”, or about “your data being used by the university for research” (amongst other options, p. 80).
With regard to the design and visualizations, we were able to draw on corresponding previous work and
developments: A literature review of the state-of-the-art of research on learning dashboards by
Schwendimann et al. (2017) included 55 papers, primarily focusing on learning dashboards in learning
management systems and MOOC platforms. According to their analysis, 14 papers (25%) describe dashboards which build upon data from several platforms, and 28 papers (51%) are about dashboards (also) for students. Further, the literature review gives insights into the visualization types used. We have selected the results for papers that see students as dashboard users and present them in Figure 2. However, we do not know how many of these papers deal with study dashboards.
Figure 2. Visualization types in dashboard addressing students in papers according to a literature review of 55 papers
by Schwendimann et al. (2017). Source: Own visualization of the results presented in Schwendimann et al. (2017), fig.
7, using the number of papers for this target group as a base (n=28)
The study shows that bar charts, line charts, tables, and pie charts are very often used in dashboards for students. Study dashboards focus on study results rather than on learners’ activities, for example their interaction with other learners or procrastination time. Therefore, a study dashboard might use different or special visualization schemes. The type of visualization can be related to the aim of the dashboard and the addressed learning support (see Sedrakyan, Mannens & Verbert, 2018).
In our case, it is of particular interest which kinds of visualizations are used for which data in existing dashboards. For example, Charleer et al. (2018) developed “LISSA”. With this tool, students and student counselors receive visualizations of students’ grades and study status, also in comparison with their peers. Further, the LALA project describes several study dashboards, partly inspired by LISSA (Henriquez, 2019).
In Figure 3, we show the abstracted visualization approach of three different dashboards implemented at LALA partner universities: all of them visualize students’ progress using colors to mark the student’s status (passed, failed, running), and they differ in whether they show marks or the format of the lectures.
[Add Fig. 3 here]
Figure 3. Three abstract versions of study dashboard visualizations from partner universities of the LALA project
(from left to right: AvAc, TrAC and SiCa). Source: Own visualization based on screenshots in Henriquez (2019).
In all three cases, clicking on a particular lecture gives an overview of the grades for all students,
including previous years. These are typically provided as bar charts of cumulative results with lines or dots
for the individual student. It is important to emphasize that none of the dashboards described is intended for the students themselves; they are all aimed at counselors.
Nevertheless, in our case the students were a central partner in the development, and we planned from the outset to involve them strongly (see De Laet et al., 2018b). With this, we wanted to create a helpful tool that meets with as little rejection as possible (Schumacher & Ifenthaler, 2018). Generally, this focus is demanded again and again, but not so often implemented: Although the learning analytics field “acknowledges the central role of students”, West et al. (2020b) state that “much of the literature reflects an academic, teacher-centric or institutional view” (from the abstract).
With regard to the visualization to be developed, we were aware that visualizations are not understood by everyone per se; legends and assistance must be provided (e.g., Stofer, 2016 for maps). So we always had to question whether what we were visualizing was actually understood in the way we intended. We also know that comparisons with other students in particular are not always motivating and can also give rise to feelings of superiority (Teasley, 2017). Very exact positioning, especially at the worse or the better end of the student distribution, or a clear ranking should be avoided. However, we can also imagine that visualizations with clear rankings might be less problematic in cultures where students are used to them.
4. DEVELOPMENT AND CHALLENGES
This chapter describes the development and implementation of the students’ dashboard at TU Graz over time, as well as the challenges we have encountered along the way.
4.1 Development and implementation over time
Our development was planned and implemented within a time frame of two years. In the following, we describe the process from the first idea to the current status.
Figure 4. Students’ dashboard development to university-wide implementation over time.
As can be seen in Figure 4, stakeholder involvement, especially student involvement, was a core feature of the development process. Design and technical development were carried out by the “Educational Technology” team together with colleagues from the IT Services at TU Graz. Information and communication issues for students were handled by the “Higher Education and Program Development” team.
03/2019: The Educational Technology team organized workshops with a total of 7 students and faculty representatives from 5 disciplines on general creative ideas for enhancing study support: Amongst many other ideas, students highlighted that a better overview of their study progress would be helpful.
03/2019–10/2019: We did a first analysis of data structures and origins and developed visualizations.
07/2019–09/2019: We organized co-design workshops for the intermediate dashboards with student representatives.
11/2019–12/2019: We ran several tests in the internal testing phase with small groups of students.
02/2020–03/2020: Several meetings were organized with stakeholders, including student representatives, teachers, the vice-rector for academic affairs, the works council, and the legal department. In parallel, information materials for students were developed (TU Graz, 2021a; TU Graz, 2021b).
06/2020: 18 months after the first vague idea, we implemented the new dashboard for study progress for all BA students at the Faculty of Computer Science. We collected user feedback and made small revisions.
12/2020: After a 6-month test phase, we carried out the university-wide implementation for all BA students, including several information materials, the set-up of an advisory structure, user feedback, and small revisions.
The development of the dashboard was delayed by a few weeks due to the closure of the university in March 2020, as all resources were needed at short notice to provide the technical support necessary for emergency teaching (Ebner et al., 2020). Overall, however, in retrospect, the implementation proceeded quickly and smoothly, probably also due to existing experience with similar projects.
4.2 Development of design and visualization
As shown, our visualization and design team co-designed the dashboard within several meetings together with the key (and only) user group: the students. The design decisions aim at making it as easy as possible for users to get started and to understand the visualizations correctly. Design considerations concerned the needed information, the available data, the main type of data chart, colors, as well as additional information such as labels and legends. Within the students’ co-design workshop, we worked in smaller groups and brainstormed about what information and visualizations a perfect students’ dashboard would need (see Fig. 5, left). Afterward, we asked the groups to make a first sketch (see Fig. 5, middle). In a later phase, we asked students for feedback with the help of paper prototypes (see Fig. 5, right).
[Add Fig. 5 here]
Figure 5. Artifacts from the co-design with students’ sessions and feedback rounds. Left: Students’ needs and first
ideas, middle: a first sketch, right: annotated paper prototype.
Concerning the design, we checked for familiar color schemes and chart styles that are already used in other parts of the TU Graz pages for students. The color scheme of red, orange, yellow, and green as signal colors, related to the traffic light scheme, was one of the first options that made it through the development process. These colors help to identify difficult courses, conflicts, and the need for action at a glance. Grey and blues serve as non-signal colors.
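To illustrate the palette logic described above, the following sketch maps a course indicator to a traffic-light color. The hex values, the failure-rate input, and the thresholds are our own illustrative assumptions, not taken from the TU Graz implementation:

```python
# Hypothetical sketch of the traffic-light palette; hex values and
# thresholds are assumptions for illustration, not the TU Graz code.
SIGNAL_COLORS = {
    "green": "#2e7d32",   # on track / no action needed
    "yellow": "#f9a825",  # minor issues
    "orange": "#ef6c00",  # needs attention
    "red": "#c62828",     # difficult course / action required
}
NEUTRAL_COLORS = {"grey": "#9e9e9e", "blue": "#1565c0"}  # non-signal states

def signal_color(failure_rate: float) -> str:
    """Map a course's failure rate (0.0-1.0) to a traffic-light color name."""
    if failure_rate < 0.15:
        return "green"
    if failure_rate < 0.30:
        return "yellow"
    if failure_rate < 0.50:
        return "orange"
    return "red"
```

Keeping the palette in one place like this makes it easy to reuse the same signal colors consistently across all parts of the dashboard.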
Although there are several types of visualizations of the concept “parts-to-a-whole” (Ribecca, 2021; see Fig. 6), only some are well-known and easy to understand according to the students’ feedback within our workshops: Bar and pie charts are the two best-known and easiest to understand chart types. The donut chart, a variant of the pie chart, can also be used as an optical boundary.
[Add Fig. 6 here]
Figure 6. Selection of a donut chart from several possible visualizations of parts-to-a-whole. Own illustration of the
collection by Ribecca (2021).
As will be shown later, the donut chart serves in our dashboard as a good delineation of the different courses: Important key figures are arranged radially in the middle. This makes it easier to recognize and understand related data. A cluttered user interface is prevented by displaying labels and legends primarily via ‘mouse-over’ (a graphical element that is activated when the user moves or hovers the pointer over a trigger area). This minimalism contributes to clarity and is intended to reveal the functions to the user step by step.
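At its core, the donut chart described above is a parts-to-a-whole mapping of a course’s grade distribution onto angular segments. A minimal sketch of that computation, with function and parameter names that are hypothetical rather than taken from the TU Graz code, could look like this:

```python
# Sketch: turn a course's grade distribution into donut-chart segments.
# Names and the 0-degree start angle are illustrative assumptions.
from typing import Dict, List, Tuple

def donut_segments(grade_counts: Dict[int, int]) -> List[Tuple[int, float, float]]:
    """Return (grade, start_angle, sweep_angle) triples in degrees,
    proportional to how many students received each grade (1 = best, 5 = fail)."""
    total = sum(grade_counts.values())
    if total == 0:
        return []  # nothing to draw for an empty course
    segments, angle = [], 0.0
    for grade in sorted(grade_counts):
        sweep = 360.0 * grade_counts[grade] / total
        segments.append((grade, angle, sweep))
        angle += sweep
    return segments
```

For example, a course where half the students received grade 1 and half grade 2 yields two 180-degree segments; a rendering layer would then draw each segment in its signal color and place the key figures in the free center of the ring.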
4.3 Challenges in implementation
Again and again, smaller challenges arose. With regard to the available data and its quality, or challenges in visualization, the following are among them:
- It is not clear whether students study according to the current study regulations or according to older ones that may still be valid.
- If students are registered for several study programs (up to 4 are possible), it is unclear which study program the representations should refer to.
- Some degree programs are carried out partly at Graz University of Technology and partly at the University of Graz. In some cases, students are free to choose their courses. A data comparison with the University of Graz is not directly possible, also because different systems are used.
- Finally, we found several data issues, such as flaws in the course lists of study programs or in grading entries, which normally do not interfere with perception, as they tend to be hidden, but are presented more clearly in the new overview.
Within the user involvement and discussion, the following issues were part of our discussions:
- Whether students need to compare themselves with other students, and how this comparison might affect student motivation and study behavior. We decided to include this information, as it is already available to all students for transparency reasons.
- Whether students can “so easily” see information about the pass rates of previous exams of a course when they take it. We decided again that these data are available anyhow, so we see no reason not to show them.
- Whether teachers and counselors should or should not be allowed to view students’ dashboards, with or without the consent of the student.
However, despite recurring discussions with students and teachers, the agreed objectives were not changed, nor were, for example, insights into professors’ examination data restricted.
5. DASHBOARD DESIGN AND VISUALIZATION DETAILS
The students’ dashboard design and its several features are described in the following paragraphs.
5.1 Dashboard overview
The dashboard is displayed to all students of Graz University of Technology who are enrolled in a current Bachelor's program (Fig. 7). It is also accessible for some individually selected discontinuing Bachelor's degree programs. The dashboard was designed to provide students with visualizations that assist with the following:
- Students can see their success in their studies: The dashboard shows which courses have been completed and how the student has performed in detail. At the same time, it shows which achievements are still missing for graduation.
- Students can plan their study progress: Students can better assess their performance status and plan further learning steps with the help of the recommended semester target.
- Students can focus on courses: Students can see the grade distribution of all courses in their degree program so far in the dashboard. The grade distribution can help them focus on courses that other students have found particularly challenging.
- Students can compare their performance: The dashboard allows students to compare their performance with their peers. They can find out whether they rank in the top 10 for individual courses or how their performance compares to that of students in their year.
[Add Fig. 7 here]
Figure 7. A screen shot of a TU Graz students’ dashboard. Source: TU Graz.
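The peer comparison listed above boils down to locating a student’s value within the cohort’s distribution. A minimal sketch, assuming the cohort’s ECTS values are available as a plain list (which is an assumption for illustration, not the actual data model):

```python
# Illustrative sketch (not the TU Graz implementation) of the peer
# comparison: report the share of the cohort the student is ahead of.
from typing import List

def ahead_of_share(own_ects: float, cohort_ects: List[float]) -> float:
    """Fraction of the cohort with strictly fewer ECTS than the student."""
    if not cohort_ects:
        return 0.0  # no peers to compare against
    return sum(1 for e in cohort_ects if e < own_ects) / len(cohort_ects)
```

As discussed in the design considerations above, a dashboard would show such a value as a coarse position on a scale rather than as an exact rank, to avoid demotivating very precise placements.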
5.2 Details of the dashboard visualization
The dashboard is divided into the following functional areas (see Fig. 8): study selection (1), academic year ECTS (2), semester recommendation (3), course of studies (4), courses (5), history (6), legend (7), support (8), log-out (9), and feedback (10).
Figure 8. Different parts of the TU Graz students’ dashboard. Source: TU Graz.
By clicking on different components, further detailed evaluations can be displayed. The selection of the
degree program (1) is only possible for students who are enrolled in several Bachelor's degree programs.
The section “Academic Year ECTS” (2) shows the achievements in the current academic year in the form of ECTS credits (Fig. 9). The displayed total number of ECTS (a) is positioned on the scale of study progress (b). The values 16, 30, and 60 describe the ECTS points to be achieved: From 16 ECTS onwards, students are considered to be exam-active; achieving 16 ECTS is also linked to the entitlement to the Austrian family allowance for students. 30 ECTS are the extent of a semester; 60 ECTS are the extent of an academic year. In addition, students can see where they stand in the context of a semester or the entire academic year, but also in comparison to the performance of fellow students (cohort) (c).
[Add Fig. 9 here]
Figure 9. The “Academic Year ECTS” part of the TU Graz students’ dashboard. Source: TU Graz.
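The 16/30/60 scale described above can be sketched as a simple classification. The thresholds come from the text; the band labels and the function name are illustrative assumptions:

```python
# Sketch of the dashboard's ECTS progress scale. Thresholds (16/30/60)
# are from the text; the labels are illustrative, not official wording.
def progress_band(ects: float) -> str:
    """Place a student's academic-year ECTS on the progress scale."""
    if ects < 16:
        return "below exam-active threshold"
    if ects < 30:
        return "exam-active (>= 16 ECTS)"
    if ects < 60:
        return "at least one semester's workload (>= 30 ECTS)"
    return "full academic year's workload (>= 60 ECTS)"
```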
Under “Semester Recommendation” (3), the recommended courses per semester are displayed (see Fig. 10). The section is structured as follows: total number of ECTS of the degree program (a), all semesters of the degree program (b), and electives and optional subjects of the degree program (c).
[Add Fig. 10 here]
Figure 10. The “semester recommendation” part of the TU Graz students’ dashboard. Source: TU Graz.
The course of studies (4) shows the courses completed per semester (see Fig. 11). It is structured as follows: The tile shows how many ECTS have been completed in the course of the Bachelor's degree; in the example it is 82 ECTS (a). Under “Recognitions” (b), students find all courses that have been completed at Graz University of Technology or at other educational institutions and have been recognized for the degree program at TU Graz. The completed semesters are mapped according to the course of study (c). To display the completed courses per semester, the student has to click on the corresponding semester. The last tab (d) in the course of studies is always the current semester. Since the current semester is selected by default, courses completed in this semester are automatically displayed in the Courses section.
Figure 11. The “course of studies” part of the TU Graz students’ dashboard. Source: TU Graz.
Courses are marked with either a colored, a blue, or a grey circle. Colored circles indicate the distribution of grades from 1 (“very good”) to 5 (“unsatisfactory”; see Fig. 12, left). Blue circles refer to completed courses with a group size of fewer than 5 persons (see Fig. 12, middle). For data protection reasons, the breakdown of grades and the number of participants must not be shown for such small groups. Courses that have not been completed are marked with a grey circle (see Fig. 12, right). Within the circle, students find details of their grade (colored small circle), the number of ECTS, the number of entries, the number of participants, an indication of whether the student is in the top 10, 20 or 50% (not shown here), and an indication of the format of the course. Clicking on a course opens its history and displays the distribution of grades from all previous runs of the course. This function is intended above all to help students better assess challenges when planning the semester.
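The circle-marking rules in this paragraph amount to a small decision function. The minimum group size of 5 is taken from the text; the function name and the returned labels are illustrative assumptions:

```python
# Sketch of the circle-style rule described above. The minimum group
# size of 5 is from the text; names and labels are illustrative.
MIN_GROUP_SIZE = 5  # below this, grade breakdowns are withheld for privacy

def circle_style(completed: bool, participants: int) -> str:
    """Choose how a course circle is rendered on the dashboard."""
    if not completed:
        return "grey"    # recommended but not yet completed
    if participants < MIN_GROUP_SIZE:
        return "blue"    # completed, but distribution hidden (small group)
    return "colored"     # completed, grade distribution shown
```

Centralizing the privacy threshold in one constant makes it easy to audit that no view of the dashboard ever breaks the small-group rule.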
Figure 12. Different ways to show lectures on the TU Graz students’ dashboard: completed course with more than 5 participants (left), completed course with fewer than 5 participants (middle), recommended but not yet completed course (right). Source: TU Graz.
A detailed guide helps students use all the functions and understand the information presented (TU Graz, 2020). In addition, it points out further support and counseling services. The guide also addresses the fact that the visualization cannot take account of personal circumstances, “such as caring responsibilities, employment or other studies you are pursuing. Study progress is very individual, so you should interpret this information accordingly” (TU Graz, 2020, p. 12, own translation).
6. IMPLEMENTATION AND USER FEEDBACK
In the summer term of 2020, the dashboard was made accessible for all Bachelor's programs of the Faculty of Computer Science and Biomedical Engineering at TU Graz. The introduction was accompanied by further information on the dashboard, communicated through emails and teachers. By June 9, 2020, 743 students had access to the dashboard, and 27 students had taken the opportunity to provide feedback on it. The feedback was requested as free text. An analysis showed that more than half of the responses, namely 54%, were clearly positive and reported minor errors or made suggestions for improvement; another 38% did not comment positively or negatively on the dashboard and only made suggestions for improvement. The suggestions ranged from improvements to the legend to the optimization of the mobile version. Only 8% of the responses were negative, with students saying the dashboard was unnecessary or that they did not understand its point because the information already existed; the comparison between students was also rejected. Among the explicitly positive feedback were comments emphasizing that the dashboard helps to better assess one's own performance and to get a good overview. After this successful start, the dashboard was made available university-wide in December 2020.
7. RECOMMENDATIONS AND OUTLOOK
Finally, the authors present recommendations as guidelines for similar projects based on their experience and students’ feedback, and give an outlook for future development and research.
1. Limit yourself to data and information that is already available. When developing the dashboard, focus on data and information that are already available to students but, for example, spread across different systems or several pages, or presented mainly in text form. This avoids many discussions about "what students should see in the first place", because the information is already available to them. At least we were able to focus more on visualization and technical aspects in all the discussions.
2. Limit access to those who already had it. In our case at TU Graz, teachers and the students’ services have no full access to students’ study progress. Although we had some discussions about this issue, we stayed focused on a service and visualization only for the individual student. In this way, we were able to take a clear position on this issue and did not open up a discussion space that could distract from the actual plan and implementation of the student progress dashboard.
3. Use existing patterns and colors. Concerning visualization, all organizations as well as cultures have more or less formally (corporate identity) or informally established specifications and patterns. Checking existing tools, pages, and online services, in our case especially those for students, makes it easier to decide on colors and more. Practically, we also use the traffic light colors as they are used in other existing study dashboards (see Fig. 3).
4. Try unusual visualization types. As the literature review by Schwendimann et al. (2017) showed, the donut visualization is not very common in learning dashboards for students (see Fig. 2). Nevertheless, the donut was highly supported by our students. We saw that the students not only liked our rather unusual solution but also understood it well.
5. Co-design with students. This recommendation is based not only on the insight that key users can provide valuable feedback, but also on the experience that they are highly engaged and sometimes surprisingly competent (visually and technically; we are, after all, a technical university). We have the impression that the strong involvement of the students made the process not only more effective but also significantly more efficient.
6. Do not underestimate data management and the integration of stakeholders. In our case, we were able to carry out the described implementation relatively smoothly, but we already knew the data and its origin, as well as the systems and possible challenges, and we were also aware of which organizational units absolutely had to be involved and at which stage of the process. If a dashboard development is the first measure in this field, significantly greater efforts should be expected at a large university.
From the perspective of the student dashboard, there are still some limitations that we have already described above but for which no solutions are currently foreseeable, for example the necessary data exchange with the university with which we offer joint degree programs. We have therefore not planned any major further developments, also in view of the positive feedback on the dashboard. The extent to which other forms of support for studying and learning can be used and have an effect for students is something we will initially investigate primarily in our learning management system at the level of individual courses, or in individual MOOCs on the MOOC platform.
ACKNOWLEDGEMENTS
Contributions and development were partly delivered within the project “Learning Analytics: Effects of data analysis on learning success” (01/2020–12/2021), with Graz University of Technology and the University of Graz as partners and the Province of Styria as funding body (12. Zukunftsfonds Steiermark).