Preliminary version / draft – original published in: Leitner P., Ebner M., Geisswinkler H., Schön S. (2021) Visualization of
Learning for Students: A Dashboard for Study Progress – Development, Design Details, Implementation, and User Feedback. In:
Sahin M., Ifenthaler D. (eds) Visualizations and Dashboards for Learning Analytics. Advances in Analytics for Learning and
Teaching. Springer, Cham. https://doi.org/10.1007/978-3-030-81222-5_19
VISUALIZATION OF LEARNING FOR STUDENTS: A DASHBOARD
FOR STUDY PROGRESS
Development, design details, implementation, and user feedback
Philipp Leitner, Graz University of Technology, philipp.leitner@tugraz.at
Martin Ebner, Graz University of Technology, martin.ebner@tugraz.at
Hanna Geisswinkler, Graz University of Technology, hanna.geisswinkler@tugraz.at
Sandra Schön, Graz University of Technology, sandra.schoen@tugraz.at
Abstract: At Graz University of Technology (TU Graz, Austria), the learning management system, which is based on Moodle¹, is called TeachCenter. Together with the campus management system, called TUGRAZonline, it forms the main infrastructure for digital teaching and general study matters. As central systems for both teachers and students, they offer various services and support. The latest developments include the design and implementation of a study
progress dashboard for students. This dashboard is intended to provide students with a helpful overview of their
activities: It shows their academic performance in ECTS compared to the average of their peers, their own study
progress and the official study recommendation as well as the progress in the various compulsory and optional
courses. The first dashboard prototype was introduced to all computer science students in May 2020, and a
university-wide rollout started in December 2020. The chapter describes design considerations, the design development work, and the implementation, as well as user feedback on it. Finally, the authors present
recommendations as guidelines for similar projects based on their experience and students’ feedback and give an
outlook for future development and research.
Key words: Learner support, feedback, dashboard, study progress, university
1. INTRODUCTION
At Graz University of Technology (TU Graz), the organizational unit Educational Technology has extensive experience in learning analytics and visualizations, including for our Austria-wide MOOC platform iMooX.at (Maier, Leitner & Ebner, 2019; Leitner, Maier & Ebner, 2020), for the university-wide learning management system TeachCenter (Leitner, Ebner & Ebner, 2019), and through numerous international research collaborations (De Laet et al., 2018a; De Laet et al., 2018b). When students expressed
the wish to get a better and easier overview of their study progress, we were happy to comply.
Not only TU Graz students share this wish. Reimers and Neovesky (2015), for example, asked German students (N=194) what they want from their dashboards. Their findings are: “Nearly all questioned students
stated that they would like to see all information relevant to their studies in one central place. 93% expressed
agreement by selecting 1 or 2 on the scale. Almost as many (85%) agreed (selecting 1 or 2) on wanting an
overview of deadlines to better organize their studies.” Building on their investigation of typical LA dashboards available to students, which very often draw on the learning management system but not on a student information system with grades, the authors criticize: “Yet, none of the online platforms discussed above
provides these two features” (p. 403).
In this chapter, we trace the development of the TU Graz students’ study progress dashboard and present its design aspects in detail. We also describe our experiences and lessons learned so that practitioners can take our approach as a blueprint and gain helpful insights into the challenges we faced.
¹ https://moodle.org/ – last accessed February 10th, 2021
2. INITIAL SITUATION AND ADDRESSED OBJECTIVES
At Graz University of Technology (TU Graz), the learning management system (Moodle) is called TeachCenter. Together with the campus management system, called TUGRAZonline, it is the main infrastructure for teaching. As central systems for both teachers and students, they offer various services and support. In 2019, the information students could find there concerning their study progress comprised the courses successfully completed to date and the ECTS achieved, including data on other examination candidates (for example, the failure rate for a particular examination). This information was largely presented as text spread across several pages (see Figure 1).
[Add Fig. 1 here]
Figure 1. A screenshot of students’ information from the campus management system. Source: TU Graz.
The TU Graz study progress dashboard for students is intended to provide a helpful overview of students'
activities, for example their academic performance in ECTS compared to the average of their peers, their
own study progress and the official study recommendation as well as the progress in the various
compulsory and optional courses. By visualizing learning data, students should be able to keep an eye on their own learning process, which can ultimately lead to an improvement in their learning success. The advantages and objectives for students are, in detail (TU Graz, 2021):
• Students can see their learning achievements graphically.
• Students can regularly check their own learning progress.
• Students determine their individual learning status on the basis of a comparison group.
• Students can optimize their learning process.
• Each student only sees his/her own results.
Teachers as well as the university administration of Graz University of Technology should also benefit from the use of learning analytics (TU Graz, 2021a): The visualizations and evaluations help to better understand teaching and learning processes, reduce dropouts, lead to more transparency, result in higher examination activity, and make an important contribution to the optimization of study departments and student counseling; in addition, they are the subject of research and of careful data handling.
3. STUDENTS’ DASHBOARDS AND DESIGN CONSIDERATIONS
As described above, we could build upon several years of experience and research in the field of LA in
higher education. We are aware of potential challenges: Drachsler and Greller (2016) have developed a well-known framework for the development and deployment of LA applications. From
our perspective for the higher education context (in particular building upon Ferguson et al., 2016) we see
seven challenges as crucial (Leitner, Ebner & Ebner, 2019): purpose and benefits of learning analytics,
privacy protection, development of a clear procedure with regard to desired and undesired scenarios (ethics),
the data, infrastructure, development and operation (incl. estimation of foreseeable costs), as well as other
activities, e. g. training of supervisors.
Building upon this, we decided, for example, to only use data that is in practice already available to students, for example lecture results reaching back more than five years. This way, we can be sure that no new concerns could arise regarding the use of the data.
At the same time, we had to make sure that only students themselves could access the information on their grades and study status, as nothing else had been provided for so far. We were also aware that students do not necessarily appreciate their data being used for student counseling purposes: West et al. (2020a) asked more than 2,000 Australian students about their view on learning analytics, and more than half of them had concerns
that their data could be used “to trigger support services to contact you” or “trigger academic staff to contact
you” or “your data being used by the university for research” (amongst other options, p. 80).
With regard to the design and visualizations, we were able to draw on corresponding previous work and
developments: A literature review of the state-of-the-art of research on learning dashboards by
Schwendimann et al. (2017) included 55 papers, primarily focusing on learning dashboards in learning
management systems and MOOC platforms. According to their analysis, 14 papers (25%) describe dashboards that build upon data from several platforms, and 28 papers (51%) are about dashboards (also) intended for students. Further, the literature review gives insights into the visualization types used. We have selected the results for papers that see students as dashboard users and present them in Figure 2. However, we do not know how many of these papers deal with study dashboards.
Figure 2. Visualization types in dashboards addressing students, according to a literature review of 55 papers by Schwendimann et al. (2017). Source: Own visualization of the results presented in Schwendimann et al. (2017), Fig. 7, using the number of papers for this target group as a base (n=28).
The review shows that bar charts, line charts, tables, and pie charts are very often used in dashboards for students. Study dashboards focus on study results rather than on learners’ activities, for example their interaction with other learners or procrastination time. Therefore, visualizations in a study dashboard might use different or special visualization schemes. The type of visualization can be related to the aim of the dashboard and the addressed learning support (see Sedrakyan, Mannens & Verbert, 2018).
In our case, it is of particular interest which kind of visualization is used for which data in existing dashboards. For example, Charleer et al. (2018) developed “LISSA”. With this tool, students and student counselors receive visualizations of students’ grades and study status, also in comparison with their peers. Further, the LALA project describes several study dashboards, partly inspired by LISSA (Henriquez et al., 2020). As shown in Figure 3, the abstract visualization approaches of three dashboards implemented at LALA partner universities use colors to mark the students’ status (passed, failed, running) and differ in whether they show marks or the format of the lectures.
[Add Fig. 3 here]
Figure 3. Three abstract versions of study dashboard visualizations from partner universities of the LALA project (from left to right: AvAc, TrAC and SiCa). Source: Own visualization based on screenshots in Henriquez et al. (2020).
In all three cases, clicking on a particular lecture gives an overview of the grades for all students,
including previous years. These are typically provided as bar charts of cumulative results with lines or dots
for the individual student. It is important to emphasize that the dashboards described here are intended not for the students themselves, but for counselors.
In our case, however, the students were a central partner in the development, and we planned from the outset to involve them strongly (see De Laet et al., 2018b). With this, we wanted to create a helpful tool that meets with as little rejection as possible (Schumacher & Ifenthaler, 2018). Generally, this focus is demanded again and again, but not so often implemented: Although the learning analytics field “acknowledges the central role of students”, West et al. (2020b) state that “much of the literature reflects an academic, teacher-centric or institutional view” (abstract).
With regard to the visualization to be developed, we were aware that visualizations are not per se understood by everyone; legends and assistance must be provided (e.g., Stofer, 2016 for maps). So we always had to question whether what we were visualizing was actually understood in the way we intended. We also know that comparisons with other students in particular are not always motivating and can also give rise to feelings of superiority (Teasley, 2017). Very exact positioning, especially at the worse or the better end of the student distribution, as well as a clear ranking, should be avoided. However, we can also imagine that visualizations with clear rankings might be less problematic in cultures where students are used to them.
4. DEVELOPMENT AND CHALLENGES
This section describes the development and implementation of the students’ dashboard at TU Graz over time, as well as the challenges we encountered along the way.
4.1 Development and implementation over time
Our development was planned and implemented within a time frame of two years. In the following, we describe the process from the first idea to the current status.
Figure 4. Students’ dashboard development up to the university-wide implementation over time.
As can be seen in Figure 4, stakeholder involvement, especially student involvement, was a core feature of the development process. Design and technical development were carried out by the “Educational Technology” team together with colleagues from the IT Services at TU Graz. Information and communication for students was handled by the “Higher Education and Program Development” team.
• 03/2019: The Educational Technology team organized workshops with a total of 7 students and faculty representatives from 5 disciplines on general creative ideas for enhancing study support and the study situation: Amongst many other ideas, students highlighted that a better overview of their study progress would be helpful.
• 03/2019-10/2019: We carried out a first analysis of data structures and their origins and developed first visualizations.
• 07/2019-09/2019: We organized co-design workshops on the intermediate dashboards with student representatives.
• 11/2019-12/2019: We ran several tests in an internal testing phase with small groups of students.
• 02/2020-03/2020: Several meetings were organized with stakeholders, including student representatives, teachers, the vice-rector for academic affairs, the works council, and the legal department. In parallel, information materials for students were developed (TU Graz, 2021a; TU Graz, 2021b).
• 06/2020: 18 months after the first vague idea, we implemented the new dashboard for study progress
for all BA-students at the Faculty of Computer Science. We collected user feedback and made small
revisions.
• 12/2020: After the six-month test phase, we carried out the university-wide implementation for all BA-students, including several information materials, the set-up of an advisory structure, user feedback, and small revisions.
The development of the dashboard was delayed by a few weeks due to the closure of the
university in March 2020, as all resources were needed at short notice to provide the necessary technical
support for emergency teaching (Ebner et al., 2020). Overall, however, in retrospect, the implementation
took place quickly and smoothly, probably also due to the existing experience with similar projects.
4.2 Development of design and visualization
As shown, our visualization and design team co-designed the dashboard in several meetings with the key (and only) user group, the students. The design decisions aim at making it as easy as possible for users to get started and to understand the visualizations correctly. Design considerations covered the information needed, the available data, the main type of data chart, colors, as well as additional information such as labels and legends. Within the students’ co-design workshop, we worked in smaller groups and asked for a brainstorming on which information and visualizations a perfect students’ dashboard would need (see Fig. 5 left). Afterward, we asked the
groups to make a first sketch (see Fig. 5 middle). In a later phase, we asked students for feedback with the
help of paper prototypes (see Fig. 5 on the right).
[Add Fig. 5 here]
Figure 5. Artifacts from the co-design sessions with students and the feedback rounds. Left: Students’ needs and first
ideas, middle: a first sketch, right: annotated paper prototype.
Concerning the design, we checked for familiar color schemes and chart styles that are already used in other parts of the TU Graz pages for students. The color scheme of red, orange, yellow, and green as signal colors, related to the traffic light scheme, was one of the first options that made it through the development process. These colors help to identify difficult courses, conflicts, and the need for action at a glance. Grey and blue tones serve as non-signal colors.
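To make this mapping concrete, the following TypeScript sketch shows one possible assignment of grades to the signal colors described above; the exact assignment and the naming are illustrative assumptions, not the documented TU Graz implementation.

```typescript
// Hypothetical mapping of grades to the traffic-light signal colors described
// above; the exact assignment is an illustrative assumption, not the
// documented TU Graz implementation. Grey and blue remain non-signal colors.

type SignalColor = "green" | "yellow" | "orange" | "red";
type Grade = 1 | 2 | 3 | 4 | 5; // Austrian grading: 1 = "very good", 5 = "unsatisfactory"

const gradeColor: Record<Grade, SignalColor> = {
  1: "green",  // no need for action
  2: "green",
  3: "yellow",
  4: "orange",
  5: "red",    // signals a difficult course and the need for action
};

console.log(gradeColor[4]); // "orange"
```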
Although there are several types of visualizations of the concept “parts-to-a-whole” (Ribecca, 2021;
see Fig. 6), only some are well known and easy to understand according to the students’ feedback within our workshops: Bar and pie charts are the two best-known and easiest-to-understand chart types. The donut chart, a variant of the pie chart, can also be used as an optical boundary.
[Add Fig. 6 here]
Figure 6. Selection of a donut chart from several possible visualizations of parts-to-a-whole. Source: Own illustration of the collection by Ribecca (2021).
As will be shown later, the donut chart serves in our dashboard as a good delineation of the different courses: Important key figures are arranged radially around the center. This makes it easier to recognize and
understand related data. A cluttered user interface is prevented by displaying labels and legends primarily
via 'mouse-over' (a graphical element that is activated when the user moves or hovers the pointer over a
trigger area). This minimalism contributes to clarity and is intended to reveal the functions to the user step
by step.
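As an illustration of this parts-to-a-whole layout, the following TypeScript sketch computes the start and end angles of donut segments from a set of parts; the data shape, function name, and example values are assumptions made for this sketch and do not reflect the actual TU Graz front-end code.

```typescript
// Minimal sketch of a parts-to-a-whole donut layout: each part is turned into
// a segment with start and end angles around the circle. All names and example
// values are assumptions for illustration only.

interface Part { label: string; value: number; }
interface Segment extends Part { startAngle: number; endAngle: number; } // radians

function donutSegments(parts: Part[]): Segment[] {
  const total = parts.reduce((sum, p) => sum + p.value, 0);
  let angle = 0;
  return parts.map((p) => {
    const sweep = total > 0 ? (p.value / total) * 2 * Math.PI : 0;
    const segment: Segment = { ...p, startAngle: angle, endAngle: angle + sweep };
    angle += sweep;
    return segment;
  });
}

// Example: a course's grade distribution rendered as donut segments.
const segments = donutSegments([
  { label: "grade 1", value: 12 },
  { label: "grade 2", value: 30 },
  { label: "grade 3", value: 25 },
  { label: "grade 4", value: 10 },
  { label: "grade 5", value: 8 },
]);
console.log(segments.map((s) => `${s.label}: ${(s.endAngle - s.startAngle).toFixed(2)} rad`));
```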
4.3 Challenges in implementation
Again and again, smaller challenges arose. With regard to the available data and its quality, as well as challenges in visualization, the following were among them:
• It is not clear whether students study according to the current study regulations or according to the
older ones that may still be valid.
• If students are registered for several study programs (up to 4 are possible), it is unclear to which study
program the representations should refer.
• Some degree programs are partly carried out at Graz University of Technology, partly at the University
of Graz. In some cases, students are free to choose their courses. A data comparison with the
University of Graz is not directly possible, also because different systems are used.
• Finally, we found several data issues, such as flaws in the course lists of study programs or in grading entries, which normally do not attract attention, as they tend to be hidden, but which become more visible in the new overview.
Within the user involvement, the following issues were part of our discussions:
• Whether students need to compare themselves with other students and how this comparison might affect student motivation and study behavior. We decided to include this information, as it is already available to all students for reasons of transparency.
• Whether students should be able to see information about the pass rates of the most recent exams of a course “so easily” when they take it. Again, we decided that these data are already available anyhow, so we did not see why we should not show them.
• Whether teachers and counselors should or should not be allowed to view students’ dashboards with or
without the consent of the student.
However, despite recurring discussions with students and teachers, the agreed objectives were not changed, nor were, for example, insights into professors’ examination data restricted.
5. DASHBOARD DESIGN AND VISUALIZATION DETAILS
The students’ dashboard design and its various features are described in the following paragraphs.
5.1 Dashboard overview
The dashboard is displayed to all students of Graz University of Technology who are enrolled in a current Bachelor’s program (Fig. 7). It is also accessible for some individually selected discontinuing Bachelor’s degree programs. The dashboard was designed to provide students with visualizations that assist with the following:
• Students can see their success in their studies: the dashboard shows which courses have been
completed and how the student has performed in detail. At the same time, it shows which
achievements are still missing for graduation.
• Students can plan their study progress: Students can better assess their performance status and plan
further learning steps with the help of the recommended semester target.
• Students focus on courses: Students can see the grade distribution of all courses in their degree
program so far in the dashboard. The grade distribution can help them focus on courses that other
students have found particularly challenging.
• Students can compare their performance: The dashboard allows students to compare their performance with their peers. They can find out whether they rank in the top 10% for individual courses or how their performance compares to that of the students in their year.
[Add Fig. 7 here]
Figure 7. A screenshot of a TU Graz students’ dashboard. Source: TU Graz.
5.2 Details of the dashboard visualization
The dashboard is divided into the following functional areas (see Fig. 8): study selection (1), academic year ECTS (2), semester recommendation (3), course of studies (4), courses (5), history (6), legend (7), support (8), log-out (9), and feedback (10).
Figure 8. Different parts of the TU Graz students’ dashboard. Source: TU Graz.
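To make the ten functional areas easier to relate to one another, the following TypeScript interface sketches a hypothetical data model behind such a dashboard view; all field names and types are assumptions made for illustration and do not describe the actual TU Graz API.

```typescript
// Hypothetical data model mirroring the functional areas (1)-(10) listed
// above; every field name and type is an assumption for illustration.

interface StudyProgressDashboard {
  studySelection: string[];                                          // (1) degree programs to choose from
  academicYearEcts: { achieved: number; cohortAverage: number };     // (2)
  semesterRecommendation: { semester: number; courses: string[] }[]; // (3)
  courseOfStudies: { semester: string; ects: number }[];             // (4)
  courses: { title: string; completed: boolean }[];                  // (5)
  history: { course: string; gradeCounts: number[] }[];              // (6) grade distributions of earlier runs
  legend: string[];                                                  // (7)
  supportUrl: string;                                                // (8)
  logoutUrl: string;                                                 // (9)
  feedbackUrl: string;                                               // (10)
}
```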
By clicking on different components, further detailed evaluations can be displayed. The selection of the
degree program (1) is only possible for students who are enrolled in several Bachelor's degree programs.
The section “Academic Year ECTS” (2) shows the achievements in the current academic year in the
form of ECTS credits (Fig. 9). The displayed total number of ECTS (a) is placed on the scale of study progress (b). The values 16, 30, and 60 mark ECTS thresholds: from 16 ECTS onwards, students are considered exam-active; achieving 16 ECTS is also linked to the entitlement to the Austrian family allowance for students. 30 ECTS correspond to the extent of a semester, 60 ECTS to the extent of an academic year. In addition, students see where they stand in the context of a semester or the entire academic year, but also in comparison to the performance of their fellow students (cohort) (c).
[Add Fig. 9 here]
Figure 9. The “Academic Year ECTS” part of the TU Graz students’ dashboard. Source: TU Graz.
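The logic behind this scale can be illustrated with a short TypeScript sketch using the thresholds named above (16, 30, and 60 ECTS); the field names and the form of the cohort comparison shown here are assumptions for illustration only.

```typescript
// Sketch of the ECTS progress logic described above (thresholds 16 / 30 / 60).
// Field names and the cohort comparison are assumptions for illustration.

interface YearProgress {
  ectsAchieved: number;
  cohortAverage: number; // average ECTS of fellow students in the same year
}

function describeProgress(p: YearProgress): string {
  const examActive = p.ectsAchieved >= 16; // threshold for being considered exam-active
  const semesterTarget = 30;               // extent of one semester
  const yearTarget = 60;                   // extent of one academic year
  return [
    `ECTS achieved: ${p.ectsAchieved} of ${yearTarget} (semester target: ${semesterTarget})`,
    `exam-active: ${examActive ? "yes" : "no"}`,
    `difference to cohort average: ${(p.ectsAchieved - p.cohortAverage).toFixed(1)} ECTS`,
  ].join("\n");
}

console.log(describeProgress({ ectsAchieved: 38, cohortAverage: 31.5 }));
```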
Under “Semester Recommendation” (3), the recommended courses per semester are displayed (see Fig. 10). The section is structured as follows: the total number of ECTS of the degree program (a), all semesters of the degree program (b), and the electives and optional subjects of the degree program (c).
[Add Fig. 10 here]
Figure 10. The “semester recommendation” part of the TU Graz students’ dashboard. Source: TU Graz.
The course of studies (4) shows the courses completed per semester (see Fig. 11). It is structured as
follows: The tile shows how many ECTS have been completed in the course of the Bachelor’s degree; in the example, it is 82 ECTS (a). Under “Recognitions” (b), students find all courses that have been
completed at Graz University of Technology or at other educational institutions and have been recognized
for the degree program at TU Graz. The completed semesters are mapped according to the course of study
(c). To display the completed courses per semester, the student has to click on the corresponding semester.
The last tab (d) in the course of studies is always the current semester. Since the current semester is
selected by default, courses completed in this semester are automatically displayed in the Courses section.
Figure 11. The “course of studies” part of the TU Graz students’ dashboard. Source: TU Graz.
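A minimal TypeScript sketch of how such a per-semester view could be derived from a list of completed courses is shown below; the data shape is an assumption, and recognized achievements completed elsewhere are treated as their own group, as in the “Recognitions” area described above.

```typescript
// Minimal sketch of grouping completed courses by semester and summing the
// ECTS achieved so far; the data shape is an assumption for illustration.

interface CompletedCourse {
  title: string;
  ects: number;
  semester: string | null; // e.g. "WS 2019/20"; null = recognized achievement
}

function groupBySemester(courses: CompletedCourse[]): Map<string, CompletedCourse[]> {
  const groups = new Map<string, CompletedCourse[]>();
  for (const c of courses) {
    const key = c.semester ?? "Recognitions";
    const list = groups.get(key) ?? [];
    list.push(c);
    groups.set(key, list);
  }
  return groups;
}

const completed: CompletedCourse[] = [
  { title: "Analysis 1", ects: 7, semester: "WS 2019/20" },
  { title: "Introductory course", ects: 5, semester: null },
];
const totalEcts = completed.reduce((sum, c) => sum + c.ects, 0);
console.log(`completed so far: ${totalEcts} ECTS`, groupBySemester(completed));
```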
Courses are marked with either a colored, a blue or a grey circle. Colored circles indicate the
distribution of grades from 1 (“very good”) to 5 (“unsatisfactory”; see Fig. 12 left). Blue circles refer to
completed courses with a group size of fewer than 5 persons (see Fig. 12 in the middle). For data protection reasons, the breakdown of grades and the number of participants must not be displayed for such small groups. Courses that have not been completed are marked with a grey circle (see Fig. 12 right). Within the circle, students find details of their grade (small colored circle), the number of ECTS, the number of entries, the number of participants, an indication of whether the student is in the top 10, 20, or 50% (not shown here), and an indication of the format of the course. Clicking on a course opens its history and displays the
distribution of grades from all previous courses. This function is intended above all to help students better
assess challenges when planning the semester.
Figure 12. Different ways to show lectures in the TU Graz students’ dashboard: completed course with more than 5 participants (left), completed course with fewer than 5 participants (middle), recommended but not yet completed course (right). Source: TU Graz.
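The display rules just described (colored, blue, or grey circle; no grade breakdown for groups of fewer than five; the top 10/20/50% indicator) can be summarized in a short TypeScript sketch; the field names and the percentile-based ranking helper are illustrative assumptions, not the documented implementation.

```typescript
// Sketch of the display rules described above: grey circle for courses not yet
// completed, blue for completed courses with fewer than 5 participants (no
// grade breakdown for data-protection reasons), colored otherwise. Names and
// the percentile-based helper are assumptions for illustration.

type CircleType = "colored" | "blue" | "grey";

interface CourseResult {
  completed: boolean;
  participants: number;
  percentileFromTop?: number; // 0 = best in the comparison group (assumption)
}

function circleType(c: CourseResult): CircleType {
  if (!c.completed) return "grey";
  return c.participants < 5 ? "blue" : "colored";
}

function rankingBand(percentileFromTop: number): string | null {
  if (percentileFromTop <= 10) return "top 10%";
  if (percentileFromTop <= 20) return "top 20%";
  if (percentileFromTop <= 50) return "top 50%";
  return null; // no indicator is shown
}

console.log(circleType({ completed: true, participants: 3 })); // "blue"
console.log(rankingBand(17));                                   // "top 20%"
```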
A detailed guide helps students to use all the functions and understand the information presented (TU
Graz, 2020). In addition, further support and counseling services are pointed out. The guide also points out that the visualization cannot take personal circumstances into account, “such as caring responsibilities, employment or other studies you are pursuing. Study progress is very individual, so you should interpret this information accordingly” (TU Graz, 2020, p. 12, own translation).
6. IMPLEMENTATION AND USER FEEDBACK
In the summer term 2020, the dashboard was made accessible for all Bachelor’s programs of the Faculty of Computer Science and Biomedical Engineering at TU Graz. This introduction was accompanied by further information on the dashboard, communicated through emails and by teachers. By 9 June 2020, 743 students had access to the dashboard, and 27 students had taken the opportunity to provide feedback on it. The feedback was requested as free text. An analysis showed that more than half of the responses, namely 54%, were clearly positive and at most reported minor errors or made suggestions for improvement; a further 38% did not comment positively or negatively on the dashboard and only made suggestions for improvement. The suggestions ranged from improvements to the legend to the optimization of the mobile version. Only 8% of the responses were negative, with students saying the dashboard was unnecessary or that they did not understand the point of it because the information already existed; the comparison between students was also rejected. Among the explicitly positive feedback were comments emphasizing that the dashboard helps to better assess one’s own performance and to get a good overview.
After this successful start the dashboard was made available university-wide in December 2020.
7. RECOMMENDATIONS AND OUTLOOK
Finally, the authors present recommendations as guidelines for similar projects, based on their experience and students’ feedback, and give an outlook for future development and research.
1. Limit to data and information that is already available. When developing the dashboard, focus on data and information that are already available to students but, for example, are spread over different systems or several pages, or are mainly in text form. This avoids many discussions about “what students should see in the first place”, because the information is already available to them. At the very least, this allowed us to focus more on visualization and technical aspects in all the discussions.
2. Limit access to those who already had it. In our case at TU Graz, neither teachers nor the students’ service have full access to a student’s study progress. Although we had some discussion about this issue, we stayed focused on a service and visualization only for the individual student. In this way, we were able to take a clear position on this issue and did not open up a discussion space that could distract from the actual plan and implementation of the student progress dashboard.
3. Use existing patterns and colors. Concerning visualization, all organizations as well as cultures have more or less formally (corporate identity) or informally established specifications and patterns. Checking existing tools, pages, and online services, in our case especially those for students, makes it easier to decide on colors and more. In practice, we also use the traffic light colors as they are used in other existing study dashboards (see Fig. 3).
4. Try unusual visualization types. As the literature review by Schwendimann et al. (2017) showed, the donut visualization is not very common in learning dashboards for students (see Fig. 2). Nevertheless, the donut was highly supported by our students. We see that the students not only liked our rather unusual solution but also understood it well.
5. Co-design with students. This recommendation is not only based on the insight that key users can provide valuable feedback, but also on the experience that they are highly engaged and sometimes surprisingly competent (visually and technically; we are, after all, a technical university). We
have the impression that the strong involvement of the students made the process not only more
effective but also significantly more efficient.
6. Do not underestimate data management and integration of stakeholders. In our case, we were
able to carry out the described implementation relatively smoothly, but we already knew the data
and its origin, as well as the systems and possible challenges, and we were also aware of which organizational units absolutely had to be involved and at which stage of the process. If a dashboard
development is the first measure in the field, significantly greater efforts should be expected here
at a large university.
From the perspective of the student dashboard, there are still some limitations that we have already described above, but for which no solutions are currently foreseeable, for example regarding the necessary data exchange with the university with which we offer joint degree programs. We have therefore not planned any major further developments, also in view of the positive feedback on the dashboard. The extent to which other forms of support for studying and learning can be used and have an effect for students is something we will initially investigate primarily within our learning management system at the level of individual courses, or within individual MOOCs on the MOOC platform.
ACKNOWLEDGEMENTS
Contributions and development were partly delivered within the project “Learning Analytics: Effects of
data analysis on learning success” (01/2020-12/2021) with Graz University of Technology and University
of Graz as partners and the Province of Styria as funding body (12. Zukunftsfonds Steiermark).
REFERENCES
Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2018). Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Transactions on Learning Technologies, 11(3), 389-399.
Ebner, M., Schön, S., Braun, C., Ebner, M., Grigoriadis, Y., Haas, M., Leitner, P., & Taraghi, B. (2020). COVID-19 Epidemic as E-Learning Boost? Chronological Development and Effects at an Austrian University against the Background of the Concept of “E-Learning Readiness”. Future Internet, 12, 94. URL: https://www.mdpi.com/1999-5903/12/6/94
De Laet, T., Broos, T., van Staalduinen, J.-P., Ebner, M., & Leitner, P. (2018a). Transferring learning dashboards to new contexts: experiences from three case studies. In: Conference Proceedings, Open Education Global Conference 2018, p. 14. Delft, Netherlands.
De Laet, T., Broos, T., Verbert, K., van Staalduinen, J.-P., Ebner, M. & Leitner, P. (2018b) Involving Stakeholders in Learning
Analytics: Opportunity or Threat for Learning Analytics at Scale? Workshop. In: Companion Proceedings 8th International
Conference on Learning Analytics & Knowledge. Sydney, pp. 602-606.
Drachsler, H., & Greller, W. (2016). Privacy and analytics – it’s a DELICATE issue: A checklist to establish trusted learning analytics. Proceedings of the 6th International Conference on Learning Analytics and Knowledge, pp. 89–96. http://dx.doi.org/10.1145/2883851.2883893
Ferguson, R., Hoel, T., Scheffel, M., & Drachsler, H. (2016). Guest editorial: Ethics and privacy in learning analytics. Journal of
learning analytics, 3(1), 5–15. http://dx.doi.org/10.18608/jla.2016.31.2
Henriquez, V. et al. (2020). LALA Piloting. Project report of the LALA project. Version 3.1, 15.2.2021, URL:
https://www.lalaproject.org/wp-content/uploads/2021/03/VF4Pilots_English_-all_universities.pdf
Leitner P., Maier K., Ebner M. (2020). Web Analytics as Extension for a Learning Analytics Dashboard of a Massive Open
Online Platform. In: Ifenthaler D., Gibson D. (Eds.), Adoption of Data Analytics in Higher Education Learning and Teaching.
Advances in Analytics for Learning and Teaching. Springer, Cham; https://doi.org/10.1007/978-3-030-47392-1_19
Leitner P., Ebner M., Ebner M. (2019). Learning Analytics Challenges to Overcome in Higher Education Institutions. In:
Ifenthaler D., Mah DK., Yau JK. (eds.) Utilizing Learning Analytics to Support Study Success. Springer, Cham
Maier, K., Leitner, P., & Ebner, M. (2019). Learning Analytics Cockpit for MOOC Platforms. In Emerging Trends in Learning
Analytics. Leiden, Netherlands: Brill | Sense. doi: https://doi.org/10.1163/9789004399273_014
Reimers, G., & Neovesky, A. (2015). Student focused dashboards—An analysis of current student dashboards and what students
really want. In Proceedings of the 7th international conference on computer supported education (CSEDU) (pp. 399–404).
Ribecca, S. (2021). Chart Selection Guide. In: The Data Visualisation Catalogue, Posting from 1st of January 2021, URL:
https://datavizcatalogue.com/blog/chart-selection-guide/
Schwendimann, B. A., Rodríguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research. IEEE Transactions on Learning Technologies, 10(1), 30-41. doi: 10.1109/TLT.2016.2599522
Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior,
78, 397–407. https://doi.org/10.1016/j.chb.2017.06.030
Sedrakyan, G., Mannens, E., & Verbert, K. (2018). Guiding the choice of learning dashboard visualizations: Linking dashboard
design and data visualization concepts. Journal of Visual Languages & Computing. doi:10.1016/j.jvlc.2018.11.002
Stofer, K. A. (2016). When a Picture Isn’t Worth 1000 Words: Learners Struggle to Find Meaning in Data Visualizations. Journal of Geoscience Education, 64(3), 231-241.
Teasley, S. D. (2017). Student Facing Dashboards: One Size Fits All? Technology, Knowledge and Learning, 22(3), 377-384. URL: https://link.springer.com/article/10.1007/s10758-017-9314-3#ref-CR27
TU Graz (2021). Learning Analytics (Internal Webpage). URL: https://tu4u.tugraz.at/studierende/mein-laufendes-
studium/learning-analytics/ (2021-01-15)
TU Graz (2020). Leitfaden: Grundlagen des Studierenden-Dashboards. Internal document. URL:
https://tu4u.tugraz.at/fileadmin/Studierende_und_Bedienstete/Anleitungen/Studierenden-
Dashboard_Funktionen_Leitfaden.pdf (2021-01-15).
West, D., Luzeckyj, A., Searle, B., Toohey, D., Vanderlelie, J., & Bell, K. R. (2020a). Perspectives from the stakeholder:
Students’ views regarding learning analytics and data collection. Australasian Journal of Educational Technology, 36(6), 72-
88. https://doi.org/10.14742/ajet.5957
West, D., Luzeckyj, A., Toohey, D., Vanderlelie, J., & Searle, B. (2020b). Do academics and university administrators really
know better? The ethics of positioning student perspectives in learning analytics. Australasian Journal of Educational
Technology, 36(2), 60-70. https://doi.org/10.14742/ajet.4653