Originally published in: Wachtler, J. & Ebner, M. (2017). On Using Interactivity to Monitor the Attendance of Students at
Learning-Videos. In Proceedings of EdMedia: World Conference on Educational Media and Technology 2017 (pp. 278-288).
Association for the Advancement of Computing in Education (AACE).
On Using Interactivity to Monitor the Attendance of Students
at Learning-Videos
Josef Wachtler
Educational Technology
Graz University of Technology
Austria
josef.wachtler@tugraz.at
Martin Ebner
Educational Technology
Graz University of Technology
Austria
martin.ebner@tugraz.at
Abstract: Many studies claim that compulsory attendance benefits students because it positively
influences their performance. This means that fully attending students receive better grades than
those who skip parts of a course. Whereas there are existing technologies to monitor the attendance
of students in standard classroom situations, the possibilities to provide such functionality for
online videos are limited. Because of that, this work introduces an approach to assess the
performance of students at videos. For that, a web-platform which offers a detailed recording of the
watched parts of videos is presented. The evaluation shows the performance of the students in a
course consisting of videos with compulsory attendance, and furthermore it points out that the
approach for monitoring attendance at online videos basically works.
Introduction
At institutions of higher education there are courses with compulsory attendance. This is done because it is assumed
that students will benefit from attending classes (Rodgers 2002). This assumption is based on the results of many
studies regarding compulsory attendance and its benefits.
For instance, it is claimed by Devadoss and Foltz (1996) that students who attend all units of a course are prone
to receive a better grade: compared to students with an attendance rate of 50%, the grade is a full degree higher. A
similar finding is reported by Romer (1993), who states that full attendance leads to the second-best grade on
average, whereas students with an attendance rate of one quarter score only the third-best grade. Furthermore, it is
stated that a key factor for avoiding a negative grade is attendance at most of the in-class sessions (Park & Kerr
1990). In addition, it is revealed by Bai and Chang (2016) that students feel more supported by their teachers if
occasional attendance checks are performed.
Because students are typically passive listeners, both in standard classroom situations and at videos, such settings
have a merely consuming character. Based on this, it seems obvious that interaction as well as communication can
be considered major factors influencing learning success. To transform passive watchers into active learners, it is
important to offer different methods of interaction and to provide possibilities of communication in all forms and
directions (Carr-Chellman & Duchastel 2000; Ebner & Holzinger 2003).
Because of the evidence presented by the mentioned studies, compulsory attendance as well as interactive
possibilities of participation are used to support the performance of the students. Obviously, a mechanism to monitor
the attendance is required. In standard classroom situations some techniques are available; however, this is not
entirely true for videos. Due to that, and also encouraged by the evolving trend of so-called MOOCs (short for
Massive Open Online Courses) (Khalil & Ebner 2013; Ebner et al. 2014), this paper aims to introduce and evaluate a
possibility to monitor the attendance of students at compulsory learning-videos by using interactive components.
The research question we would like to address is: How can interactivity be used to assess the performance of
students at learning-videos?
First, some possibilities to monitor attendance in standard classroom situations are presented in the section
Related Work. After that, a web-platform implementing an approach to assess the attendance of students at videos is
introduced (see section Interactive Video Platform). This is followed by a description of the research design
used to evaluate the web-platform. Finally, the results are presented and discussed in the sections
Evaluation and Discussion.
Related Work
To evaluate the possibility of monitoring attendance at videos, it is necessary to compare this approach with
methods used in standard classroom situations. Because of that, this section presents some of these techniques.
One of the most common ways of controlling the attendance of students is the usage of a list which is handed from
student to student in the lecture theatre and has to be signed by each attending student. This approach has several
drawbacks: on the one hand, a student could leave the lecture theatre after signing the list, and on the other hand,
one student could sign the list on behalf of many others, especially in huge classes. This approach could be improved
by placing personnel at the doors of the lecture theatre to oversee the signing process; furthermore, it is also
possible to check the ID cards of the students.
A completely different approach is the usage of an ARS (short for Audience Response System) (Haintz et al. 2014).
Such a system is generally used to ask the students questions during the lecture. These questions have to be
answered by the students using a special handset or their mobile phone. If the students are required to
authenticate at the ARS, the list of answering students can be used as a proof of their attendance. Clearly,
this only works if the ARS is used quite often during the whole lecture.
In comparison to that, a very special way of attendance monitoring is planned by Aoyama Gakuin University
(APA 2009). Each student is equipped with a smartphone by the university. With the help of the built-in GPS
module, the movement of the students on the campus is tracked, which makes it possible to examine whether the
students are at the correct location in terms of compulsory attendance. It is assumed that students will not give
their smartphone to other students, who could carry multiple smartphones to falsify the attendance monitoring,
because a smartphone usually contains sensitive and personal data. However, at this time no research is available to
prove the accuracy of this approach.
Interactive Video Platform
To monitor the attendance of students at learning-videos, a web-platform first introduced by Ebner et al. (2013) was
developed. It is called LIVE (short for Live Interaction in Virtual Learning Environments), and it offers the
possibility to enrich videos or live-broadcastings with different methods of interactivity. Furthermore, it is possible to
monitor the attendance of the students in a very detailed way.
Due to the fact that the performance of individual students should be evaluated, LIVE is available for
registered and authenticated users only. A user management system offers the following user roles (Wachtler &
Ebner 2014a):
• Ordinary students are solely able to watch the videos, to participate in the interactive questions, and to
analyze their own performance.
• Users with teacher-privileges are additionally allowed to create video events with interactions. Moreover,
they have the possibility to evaluate the performance of the attendees of the events they created.
• Researchers have the clearance to download all the data generated by LIVE in the form of spreadsheets.
• Administrators manage the privileges of the users and change the settings of the web-platform.
Attendance Monitoring
An important part of assessing the attendance of students is a detailed monitoring of their behavior while watching
the video. For that, LIVE records the following values for each student who attends a video (Wachtler & Ebner
2014b):
• Absolute time of joining (e.g. 2016-12-11 13:45:27)
• Relative time of joining (e.g. 00:00)
• Absolute time of leaving (e.g. 2016-12-11 13:48:49)
• Relative time of leaving (e.g. 03:22)
With these values it is possible to determine for each student when she/he watched which part of the video.
Furthermore, this is also the basis for several evaluation functionalities provided by LIVE.
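The four recorded values can be represented by a simple data structure. The following Python sketch shows how the watched parts of a video could be derived from such records; the field and function names are assumptions for illustration, not LIVE's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WatchSession:
    """One joining/leaving record of a student (hypothetical schema)."""
    user: str
    joined_at: datetime   # absolute time of joining, e.g. 2016-12-11 13:45:27
    joined_rel: float     # relative time of joining in seconds (00:00 -> 0.0)
    left_at: datetime     # absolute time of leaving, e.g. 2016-12-11 13:48:49
    left_rel: float       # relative time of leaving in seconds (03:22 -> 202.0)

def watched_parts(sessions, user):
    """Return the (start, end) intervals of the video a student watched."""
    return [(s.joined_rel, s.left_rel) for s in sessions if s.user == user]
```

A student may produce several such sessions per video, one for each joined part.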
At first there is a so-called "Timeline Analysis" (Fig. 1). It acts as a first overview of the distribution of the number
of watchers (Wachtler et al. 2016a). The diagram shows the timeline of the video on the x-axis and the number of
attendees on the y-axis. The green line represents the number of different students who watched the video. Because
users can watch a video more than once, the red line states the number of views. With the mouse pointer it is
possible to move the vertical red crosshair along the x-axis of the video. This enables the teacher to see, for each
second of the video, the exact number of users and views in the box below the diagram.
Figure 1: The “Timeline Analysis” states the number of users and views for each second of the video.
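The two curves of the "Timeline Analysis" can be computed directly from the recorded intervals. A minimal sketch, assuming sessions are given as (user, start, end) tuples with times in whole seconds (not the actual LIVE code):

```python
def timeline(sessions, video_length):
    """Per-second view count (red line) and distinct-user count (green line)."""
    views = [0] * video_length
    users = [set() for _ in range(video_length)]
    for user, start, end in sessions:
        for t in range(int(start), min(int(end), video_length)):
            views[t] += 1          # every playback run counts as a view
            users[t].add(user)     # the same student is counted only once
    return views, [len(u) for u in users]
```

Moving the crosshair to second t then simply reads off views[t] and the corresponding user count.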
The second part of the attendance monitoring is a list of all attendees (Fig. 2) (Wachtler & Ebner 2014a). The first
column of this list prints the name as well as the username of each student who watched at least a part of the video.
The second column draws a bar which indicates the watched timespan of each joined student; below the
bar the watching time is printed. The percentage value in brackets states how much of the video the corresponding
student has watched.
Figure 2: A list of attendees shows how much of the video they have watched
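The percentage shown in brackets can be obtained by merging the possibly overlapping watched intervals of a student and relating their total length to the video length. A sketch under these assumptions:

```python
def watched_percentage(intervals, video_length):
    """Fraction of the video covered by (start, end) intervals, in percent.
    Overlapping intervals (e.g. re-watched parts) are merged first so that
    no second of the video is counted twice."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    covered = sum(end - start for start, end in merged)
    return 100.0 * covered / video_length
```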
In addition to the list of all attendees, there exists a more detailed analysis of each watcher, which is accessible by
clicking on the name in the list. A complete history of the watched parts is then displayed (Fig. 3) (Wachtler &
Ebner 2014a). In a timeline of the video, each joined part is marked by a red bar. Further details are shown by
hovering over such a bar with the mouse pointer; this includes the times of joining and leaving in absolute as well as
relative values. Moreover, a so-called attention level is calculated (Wachtler & Ebner 2014b). This is a value ranging
from 0% (very absent) to 100% (very attentive); it is mainly based on the reaction times to the interactive
components.
Figure 3: The history highlights the watched parts of a video
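The attention level is specified in detail by Wachtler & Ebner (2014b); the following is only an illustrative stand-in that maps reaction times to the 0-100% range. The 30-second cutoff and the linear scaling are assumptions, not the published algorithm:

```python
def attention_level(reaction_times, max_reaction=30.0):
    """Illustrative only: fast reactions to interactive components score
    close to 100% (very attentive); reactions slower than max_reaction
    seconds score 0% (very absent)."""
    if not reaction_times:
        return 0.0
    scores = [max(0.0, 1.0 - t / max_reaction) for t in reaction_times]
    return 100.0 * sum(scores) / len(scores)
```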
Interactive Components
The detailed recording of the watched timespans and the different possibilities of analysis presented in the previous
section are useful but possibly not accurate. This statement is motivated by the fact that a student could easily start
the video playback and do something completely different while the video plays in the background. To prevent such
abuse, LIVE offers interactive components during the videos.
As can be seen in Fig. 4, such interactive components are displayed on top of the video (Ebner et al. 2013;
Wachtler & Ebner 2014a). In this example a multiple-choice question is shown and the video is automatically
paused. To resume playing, it is required to react to this question. With this mechanism the students are transformed
from passive watchers into active participants, and due to that the attendance monitoring is more accurate, because
they are forced to do something to watch the full video.
Figure 4: A video is interrupted by a multiple-choice question
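The pause-until-answered mechanism can be sketched as a small state machine on the player side. The class and method names here are hypothetical; the paper does not show the actual LIVE client code:

```python
class InteractionGate:
    """Pauses playback at pre-defined question positions and resumes it
    only after the student has reacted (hypothetical sketch)."""

    def __init__(self, question_positions):
        self.pending = sorted(question_positions)  # seconds into the video
        self.paused = False

    def on_tick(self, position):
        """Called as playback advances; pause when a question is due."""
        if self.pending and position >= self.pending[0]:
            self.paused = True

    def on_answer(self):
        """Any recorded reaction unlocks playback again."""
        if self.paused:
            self.pending.pop(0)
            self.paused = False
```

The reaction itself (answer, timestamp) would be logged separately and feeds into the attention level mentioned above.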
LIVE offers a wide range of different types of interactions which can be displayed during the videos. The teacher
has to select at least one of the following methods of interaction when deploying a video (Wachtler & Ebner 2014a;
Wachtler et al. 2016b):
• Simple Questions
o general questions which are not related to the content of the video
o displayed randomly and automatically
o useful for bridging a longer phase without content-related questions
• Solve CAPTCHAs
o used for the same reasons as the simple questions above
• Text-Based Questions
o The teacher is enabled to ask text-based questions to the students.
o At live-broadcastings he can enter the question in a textbox and send it to the students instantly.
o At videos he has to place such a question at a specific position in the video before releasing it.
• Multiple-Choice Questions
o real multiple-choice questions or true/false questions
o Before deploying the video the teacher can add questions of this type at pre-defined positions
throughout the video.
To invite the students to become even more active, there are additional methods of interaction which can be
triggered by the watchers (Wachtler & Ebner 2014a; Wachtler et al. 2016b):
• Ask Teacher
o The attendees are able to ask the teacher a question by entering it in a textbox.
o To answer the question the teacher can use a specific dialog or he can send it by e-mail.
• Report a Technical Problem
o The students are enabled to report a technical problem to the teacher via a dialog.
o This feature is mainly used at live-broadcastings to report problems with the video feed.
• Set Attention
o With the help of a slider the watchers are able to set their current level of attention.
All of the mentioned methods of interaction provide analysis features for the teacher. The most important parts are
the evaluation of the multiple-choice questions as well as of the text-based questions (Wachtler & Ebner 2014a). As
can be seen in Fig. 5, in case of multiple-choice questions the number of answers is printed in brackets behind
each possible answer. Below that, a list with the performance of each student is shown. The first column states the
name of the student and the second one indicates whether she/he tried to answer the question. Furthermore, the last
two columns count the number of correct and wrong attempts, because it is possible to watch a video more than
once and therefore the question is displayed in every run. Based on that, the column named "More Correct" sums up
whether there are more correct, more wrong, or evenly distributed attempts.
Figure 5: The analysis of a multiple-choice question
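The aggregation behind the "More Correct" column can be sketched as follows, assuming each repeated attempt is recorded as a boolean (a sketch, not LIVE's actual code):

```python
from collections import Counter

def more_correct(attempts):
    """Classify a student's repeated attempts at one question as
    'more correct', 'more wrong' or 'even'."""
    counts = Counter(attempts)
    if counts[True] > counts[False]:
        return "more correct"
    if counts[False] > counts[True]:
        return "more wrong"
    return "even"
```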
In comparison to that, the analysis of the text-based questions looks very similar, but it is clear that in this case it is
not possible to evaluate the performance of the students automatically. Due to that, the teacher is required to label
each listed answer manually; the appropriate buttons are offered for that.
Research Design
The web-platform presented in the previous section is used in a course at XXXXXXXXXXXXXXXXXX. This
course is named "Building Materials Basics" and is located in the second semester of the bachelor program
"Civil Engineering Sciences and Construction Management". In this course the students are introduced to the basics
of the utilization of building materials as well as to the relevant features and characteristic values of building
materials for load-bearing and non-load-bearing components.
The laboratory part of this course consists of practical demonstrations, and 304 students enrolled for it. It is vital and
therefore compulsory that all students see all 17 demonstrations. It is obviously impossible to fit this large number
of students into a laboratory. Because of that, the students are divided into 17 groups. Each group is responsible for
one demonstration, and the members of the other groups are required to watch video recordings of all
demonstrations.
In Fig. 6 the flow of events in this course is illustrated. At first the students are asked to form groups on their own
accord. After that, the teacher assembles a final group list by distributing students without a group to already existing
ones. Furthermore, a unique username is assigned to each student for easy identification at the final evaluation.
Then each group starts with its demonstration: first a script for the video is written, after that the actual filming of
the demonstration takes place, and finally the raw video material is cut into a suitable learning-video of the
demonstration in the post-production phase. The finished videos have to be delivered to the teacher no later than the
submission deadline.
After the teacher has received the videos, he deploys them with LIVE. To ensure the accuracy of the attendance
monitoring, he also adds some interactive questions to the videos. All students are then required to watch every
video and to participate in the interactive questions until a final deadline. Based on the performance of the students,
which is recorded and displayed by LIVE, the teacher evaluates the attendance of the students.
Figure 6: The flow of events in the course
Evaluation
This section evaluates both the attendance of the students at the videos itself and their acceptance of the used
approach and web-platform. For the first part the analysis tools offered by LIVE (see above) are used; for the
latter, an interactive component is placed in some videos.
A diagram (Fig. 7) sums up the number of students at each video. The green bar on the left-hand side of the diagram
represents the number of students who initially registered for the course. To the right of this bar there are bars
for every video. The blue ones indicate the number of students who started watching the videos. In comparison, the
orange bars show how many of them watched the videos until the end; a threshold of 90% of the full video
is used, because most of the videos show their credits at the end and students usually quit the
video when the credits begin. Finally, the yellow bars show how many students watched the videos
more than once.
Figure 7: The number of students at each video
On examining the numbers presented by the diagram, three issues become visible:
1. Approximately 50 students did not watch the videos at all.
2. The number of students watching the full videos has a decreasing tendency at the later videos.
3. Not very many students watched the videos more than once.
The first issue is derived from the fact that 304 students initially registered for the course but only
approximately 250 started watching the videos. A deeper examination points out that the vast majority of the
missing 50 students are the same students at each video.
In comparison to the first one, the second issue addresses the fact that at the final videos the number of students who
finished the videos decreases. Up to the sixth video the number of early-leaving students is less than or equal to
seven. Beginning with the ninth video this number increases to at least twelve, and it reaches its maximum at the last
video, with 26 students who did not watch the full video.
Finally, the third issue points out that the number of students who watched the videos again is not very high. Only at
the first and at the last video is this number above 60; all the other videos were re-watched by at most 20 students.
The evaluation of the integrated interactive components revealed that nearly all students provided an answer to
them. In Fig. 8 the results of the multiple-choice questions placed in the middle of each video are printed. These
questions are related to the content of the videos, and they are used both to support the students' attention and to
evaluate their short-term learning success. It is visible that the correctness rate is quite high in general. Two
exceptions occur at videos number one and 17; in these cases the ratio of wrong answers is above 15%.
To measure the acceptance of both the used approach to monitor the attendance and the web-platform
implementing it, an interactive survey was embedded in the videos. The students were asked to state how they liked
the usage of this interactive video platform to monitor their attendance. For that, they were required to express their
acceptance using the grading system of Austrian schools, which consists of numbers ranging from 1 (best) to 5
(worst). It can be seen (Fig. 9) that the best two grades are the dominating answers (16.13% and 27.05%).
Furthermore, the middle grade (3) is also assigned quite often (23.37%). In summary this leads to a median
of 3 and a mean of 2.92 with a standard deviation of 1.26.
Figure 8: The correctness rate of multiple-choice questions at each video
Figure 9: The acceptance of the attendance monitoring
Discussion
This section discusses the results of the evaluation (see above) of the attendance monitoring as well as the used web-
platform. At first the issue that approximately 50 students did not watch the videos is tried to explain. This is done
because 50 of 304 students are approximately 16% which could be considered as a large number.
One valid explanation could be that the students had technical problems watching the videos.
However, this is very unlikely because, on the one hand, the web-platform did not have any downtime during the
watching phase, and on the other hand, there were no reports of such problems in the forum of the course.
A further explanation could be that these 50 students are simply not interested in successfully finishing the lecture.
This seems to be the most likely reason, because it had been clearly communicated that watching the videos is
compulsory. Furthermore, it was public knowledge that missing the videos would lead to a negative grade.
It is also required to analyze the fact that the number of early-leaving students increases with the number of the
videos. First, it has to be noted that most of the students watched one video directly after another. Due to that, it
is valid to consider the 17 videos as one video of approximately one hour in length. Based on that, it seems
obvious that it was not possible to maintain the attention of the students long enough, so that they started leaving
earlier to finish the task of watching sooner (Wachtler & Ebner 2015). If this is true, it was probably not clear to
them that their attendance is monitored in such a detailed way.
Furthermore, an explanation is sought for the fact that the number of students who watched the videos more
than once is not very high. The two most likely reasons might be that, on the one hand, students want to
get the task of watching done, or, on the other hand, the videos are so well made that the students can
absorb the content with one viewing only. Additionally, the larger number of re-watching students at the first as well
as at the last video could be explained by testing purposes, because the difficulty of the content is not higher at
these videos.
In comparison to that, the high formal participation at the interactive components indicates that the watching
students did so in an active way. Because of that, the attendance monitoring can be considered accurate. In addition,
the high correctness rate of the answers to the multiple-choice questions can be used as an indicator that the
attendance monitoring in conjunction with interactive components leads to success in terms of short-term
learning.
The result of the survey regarding the acceptance of the web-platform and the related approach for monitoring the
attendance points out that it is liked or at least accepted by the students. The comments of the students justifying
their assigned grade are mainly variants of the following examples:
• “I like interactive videos because with the help of the questions I am able to watch the videos actively.”
• “I don’t have to go to the lab for all demonstrations because of the videos with attendance monitoring.“
• “The interactive components are stopping me from watching the videos faster.”
An interpretation of these statements could be that, on the one hand, students who seriously work with the
videos recognize the benefits of interactive videos with attendance monitoring, while on the other hand students who
only want to get the task done perceive the interactive components as disturbances.
The approach implemented by the used web-platform seems to basically work, because it provides more or less
the same accuracy as attendance monitoring in a standard classroom situation with an ARS. With the help of the
requirement that each student uses a given username, it is possible to evaluate the performance of each individual
student. However, this leads to the problem that students could easily share their credentials, and because of that it is
not possible to determine whether a student is really watching the videos herself/himself. This is the same problem
as in a standard classroom situation where a list has to be signed to confirm the attendance (see section Related
Work).
Outlook
For a more detailed evaluation of the use and effects of compulsory attendance at videos, it is recommended to
analyze the long-term learning success. For that, there should be four groups: the first should be taught in a
standard classroom situation, and the second should additionally employ compulsory attendance; for the third
and fourth groups it is recommended to use videos without and with compulsory attendance, respectively. A final
exam could then be used to compare the long-term learning success of the groups.
As mentioned above, there are problems in ensuring that the students watch the videos themselves. To
address this problem, several solutions are possible. For instance, a mechanism based on cryptography (e.g. digital
signatures) could be implemented. Furthermore, the authentication process of the web-platform could be linked to
the official student management system of the university. In both cases it is valid to assume that students will not
share such sensitive data with others to undermine the attendance monitoring.
Conclusion
With this paper an approach to monitor the attendance of students at videos is introduced. The implementing
web-platform records the watching time of each individual student in a very detailed way. Furthermore, it adds
some interactive components to the video to improve the accuracy of the attendance monitoring. The evaluation at a
large course at Graz University of Technology shows the performance of the students. It is pointed out that most of
the students who started watching the videos also watched them until the end. Furthermore, a short-term learning
success is observed at the multiple-choice questions which are placed in the videos as interactive components. In
addition, the web-platform worked in an acceptable way, and the majority of students liked the approach.
Finally, the research question (see section Introduction) is answered, because a web-platform which implements an
approach for monitoring the attendance at videos is presented. Additionally, the evaluation points out that the
approach basically works, and some improvements are recommended.
References
APA (2009). Anwesenheitskontrolle mit dem iPhone [Attendance monitoring with the iPhone]. Der Standard.
Bai, Y., & Chang, T. S. (2016). Effects of class size and attendance policy on university classroom interaction in Taiwan.
Innovations in Education and Teaching International, 53(3), 316-328.
Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. British Journal of Educational Technology, 31(3), 229-241.
Devadoss, S., & Foltz, J. (1996). Evaluation of factors influencing student class attendance and performance. American Journal
of Agricultural Economics, 78(3), 499-507.
Ebner, M., & Holzinger, A. (2003). Instructional Use of Engineering Visualization: Interaction Design in e-Learning for Civil
Engineering. Human–computer interaction, theory and practice, 1, 926-930.
Ebner, M., Wachtler, J., & Holzinger, A. (2013). Introducing an information system for successful support of selective attention
in online courses. In Universal Access in Human-Computer Interaction. Applications and Services for Quality of Life (pp. 153-
162). Springer Berlin Heidelberg.
Ebner, M., Lackner, E., & Kopp, M. (2014, October). How to MOOC? A pedagogical guideline for practitioners. In The
International Scientific Conference eLearning and Software for Education (Vol. 4, p. 215). "Carol I" National Defence
University.
Haintz, C., Pichler, K., & Ebner, M. (2014). Developing a Web-Based Question-Driven Audience Response System Supporting
BYOD. J. UCS, 20(1), 39-56.
Khalil, H., & Ebner, M. (2013). Interaction Possibilities in MOOCs–How Do They Actually Happen. In International Conference
on Higher Education Development (pp. 1-24).
Park, K. H., & Kerr, P. M. (1990). Determinants of academic performance: A multinomial logit approach. The Journal of
Economic Education, 21(2), 101-111.
Rodgers, J. R. (2002). Encouraging tutorial attendance at university did not improve performance. Australian Economic Papers,
41(3), 255-266.
Romer, D. (1993). Do students go to class? Should they?. The Journal of Economic Perspectives, 7(3), 167-174.
Wachtler, J., & Ebner, M. (2014a, June). Support of Video-Based lectures with Interactions-Implementation of a first prototype.
In World Conference on Educational Multimedia, Hypermedia and Telecommunications (Vol. 2014, No. 1, pp. 582-591).
Wachtler, J., & Ebner, M. (2014b). Attention Profiling Algorithm for Video-Based Lectures. In Learning and Collaboration
Technologies. Designing and Developing Novel Learning Experiences (pp. 358-367). Springer International Publishing.
Wachtler, J., & Ebner, M. (2015, June). Impacts of interactions in learning-videos: A subjective and objective analysis. In
EdMedia: World Conference on Educational Media and Technology (Vol. 2015, No. 1, pp. 1611-1619).
Wachtler, J., Khalil, M., Taraghi, B., & Ebner, M. (2016a). On Using Learning Analytics to Track the Activity of Interactive
MOOC Videos. In Proceedings of the Workshop on Smart Environments and Analytics in Video-Based Learning (SE@VBL),
LAK 2016.
Wachtler, J., Hubmann, M., Zöhrer, H., & Ebner, M. (2016b). An analysis of the use and effect of questions in interactive
learning-videos. Smart Learning Environments, 3(1), 13.