Originally published in: Wachtler, J., Scherz, M. & Ebner, M. (2018). Increasing Learning Efficiency and Quality of Students' Homework by Attendance Monitoring and Polls at Interactive Learning Videos. In Proceedings of EdMedia: World Conference on Educational Media and Technology (pp. 1337-1347). Amsterdam, Netherlands: Association for the Advancement of Computing in Education (AACE).
Increasing Learning Efficiency and Quality of Students' Homework by
Attendance Monitoring and Polls at Interactive Learning Videos
Josef Wachtler
Educational Technology
Graz University of Technology
Austria
josef.wachtler@tugraz.at
Marco Scherz
Working Group Sustainable Construction
Institute of Technology and Testing of Construction Materials
Graz University of Technology
Austria
marco.scherz@tugraz.at
Martin Ebner
Educational Technology
Graz University of Technology
Austria
martin.ebner@tugraz.at
Abstract: Students are confronted with a growing amount of texts, colours, figures and shapes, while they are able to process only a limited number of such elements simultaneously. Efforts should therefore be made to increase the students' attention-levels, especially since research results have indicated that selective attention is the most valuable resource in the process of human learning. Applying interaction and communication to the process of learning is a useful strategy to direct the students' attention, and this also holds for learning videos. Therefore, this work describes how a video platform with interactive components can be used to support students and teachers. The video platform is explained and evaluated by analysing its usage in a large teaching course at an institution of higher education. The application of this strategy improved the students' performance and optimized the teacher's workload.
Introduction
It is widely known that students' attention is heavily influenced by both interaction and communication. Therefore, it is vital to offer many different forms of such methods in online courses. To achieve the highest benefit, interaction and communication should be used in many forms and in all directions. This means that, on the one hand, the methods offered should range from simple e-mail and face-to-face communication or forums to new interactive learning methods. On the other hand, interaction and communication should take place not only between the teacher and students; the students should also communicate and interact with the content itself (Carr-Chellman & Duchastel 2000; Ebner & Holzinger 2003).
Furthermore, it is important to note that students are confronted daily with many pieces of text, figures, colours and shapes, and the number of such elements keeps growing. Students are only able to process a limited amount of information at once (Shiffrin & Gardner 1972); most of this information is therefore filtered out centrally (Moran & Desimone 1985). It has been pointed out that the most crucial resource in human learning is a mechanism called selective attention (Heinze et al. 1994). This clearly indicates that supporting and analysing this attention is of high importance (Ebner et al. 2013).
Such an attention analysis can be used to evaluate many aspects of the course as well as of the attending students. First, the teacher can gather information about the depth of the students' understanding. Furthermore, such an
analysis helps to adapt the way of presenting the content to the targeted audience and to evaluate whether the presented content is suitable for the attending students (Helmerich & Scherer 2007).
Based on these insights into interaction, communication and attention, it seems necessary to apply such techniques to learning videos. This is also motivated by the growing number of MOOCs (Massive Open Online Courses), which are mainly based on videos (Khalil & Ebner 2013; Ebner et al. 2014). For this reason, this publication aims to point out a possibility of increasing the efficiency and quality of the students' homework using learning videos. In other words, the investigated research question is: “How can interactive components added to learning videos be used to support the work of students and analyse this work?”
In the first section of this paper, the interactive video platform used for the course is explained. This is followed by a description of the course itself (see Section Course Design). Subsequently, the students' work as well as the results of the evaluation of their attention-levels are discussed.
Interactive Video Platform
A video platform is used that offers the possibility to enrich videos with different methods of interactivity. This platform was first introduced by Ebner et al. (2013) and is named LIVE (Live Interaction in Virtual learning Environments). From the students' point of view, the major benefit of this platform is that it supports their attention through the use of interactive components during the video. Additionally, the teachers are able to monitor as well as to evaluate the students' performance in great detail.
For this reason, LIVE is only available to registered and authenticated users. To manage the privileges of the users, there are four different types of roles (Wachtler & Ebner 2014a; Wachtler & Ebner 2017):
• Ordinary students are only able to watch the videos, participate by answering the interactive questions and
analyse their own performance.
• Users with teacher-privileges are additionally allowed to create video events with interactions. Moreover,
they can evaluate the performance of those who attend events they create.
• Researchers have the clearance to download all the data generated by LIVE in the form of spreadsheets.
• To manage the privileges of the users as well as to change the settings of the web-platform, there are also
Administrators.
The primary functionality of LIVE is the addition of interactive features to videos or live broadcasts. Such interactive components are displayed on top of the video (Fig. 1); in this example, a multiple-choice question is shown to the students. When an interaction is presented, the video automatically pauses, and it is not possible to continue watching until the student reacts to it. In the case of this example, the question must either be answered, or the student refuses to answer by clicking the corresponding button. Furthermore, students can trigger interactions by themselves. The control elements of such interactions are shown on the right-hand side of the video. Above these control elements, the students are presented with feedback regarding their level of attention (see Section Attention-Profiling Algorithm) along with a button that allows them to stop watching the video.
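As a rough illustration of this pause-until-response behaviour, the following minimal sketch models a playback loop that halts at each interaction until the student reacts. It is an illustration only, not LIVE's implementation, and all names (Interaction, play_with_interactions, ask) are hypothetical.

from dataclasses import dataclass

@dataclass
class Interaction:
    """A hypothetical interactive component placed at a position in the video."""
    position: int  # playback position in seconds at which the video pauses
    prompt: str

def play_with_interactions(duration, interactions, ask):
    """Advance through the video second by second; pause at each interaction
    until the student answers or explicitly refuses (ask returns None)."""
    pending = sorted(interactions, key=lambda i: i.position)
    for second in range(duration + 1):
        while pending and pending[0].position == second:
            interaction = pending.pop(0)
            response = ask(interaction.prompt)  # blocks until a reaction arrives
            print(f"{second}s: '{interaction.prompt}' -> {response or 'refused'}")

# Example: a 30-second video with one multiple-choice question at second 10.
play_with_interactions(
    duration=30,
    interactions=[Interaction(10, "Which binder hardens hydraulically?")],
    ask=lambda prompt: "cement",
)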
LIVE offers a wide range of possibilities to confront the students with varying types of interactive components. The teacher can add the following interactive components to the video while creating it (Wachtler & Ebner 2014a; Wachtler et al. 2016b; Wachtler & Ebner 2017); a data-model sketch follows the list:
• Simple Questions
o general questions which are not related to the content of the video
o random and automatic
o useful for bridging a longer phase during which no content-related questions are asked
• Solve CAPTCHAs
o used for the same reasons as the simple questions above
Figure 1: A video is interrupted by a multiple-choice question
• Text-Based Questions
o the teacher can ask text-based questions to the students
o at live-broadcastings, she/he could enter the question into a textbox and send it to the students
instantly
o in videos, she/he must place such a question at a specific position in the video before releasing it
• Multiple-Choice Questions
o real multiple-choice questions or true/false questions
o before deploying the video, the teacher could add questions of this type at pre-defined timepoints
throughout the video
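To make the four teacher-defined types concrete, here is a hypothetical sketch of how such components and their placement could be modelled. It is an assumption for illustration, not LIVE's actual data model.

from dataclasses import dataclass, field
from enum import Enum

class Kind(Enum):
    SIMPLE = "simple question"            # random, automatic, not content-related
    CAPTCHA = "captcha"                   # used for the same purpose
    TEXT = "text-based question"          # placed at a specific position beforehand
    MULTIPLE_CHOICE = "multiple-choice"   # incl. true/false questions

@dataclass
class TimedInteraction:
    kind: Kind
    position: int                         # seconds into the video
    prompt: str = ""
    choices: list = field(default_factory=list)  # only for multiple-choice

# A teacher could place a true/false question at 2:19 like this:
q1 = TimedInteraction(Kind.MULTIPLE_CHOICE, position=139,
                      prompt="Gypsum is a hydraulic binder.",
                      choices=["true", "false"])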
In addition, the teacher can encourage the students' participation by offering them the possibility to trigger interactions themselves. She/He can choose from the following methods of interaction (Wachtler & Ebner 2014a; Wachtler et al. 2016b; Wachtler & Ebner 2017):
• Ask Teacher
o the attendees can ask a question to the teacher by entering it in a textbox
o to answer the question, the teacher could use a specific dialog or she/he could send it by e-mail
• Report a Technical Problem
o the students can report a technical problem to the teacher via a dialog
o this feature is mainly used during live-broadcastings to report problems with the video feed
• Set Attention
o with the help of a slider, the watchers can indicate their current level of attention
Post-Video Polls
In addition to the interactive components used during playback, LIVE also provides interactivity after watching the videos. The teacher can add polls which are displayed immediately after students stop watching the video. While creating a poll, the teacher needs to enter a question and some answers. Furthermore, she/he can enter a text that encourages the students to leave an explanation for their answer. Finally, the teacher needs to specify how much of the video the students should have watched for the poll to be displayed. This threshold is necessary for two reasons. First, if the poll asks something that requires knowledge of the whole video, it makes no sense to display it to students who stop the video before it is finished. Second, it is also not helpful to show the poll only at the very end of the video, because students often skip the last seconds of a video (e.g., due to a lack of interest in reading the credits).
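A minimal sketch of this gating rule, assuming the watched share is computed from the recorded timespans; the function name and the 90% default are invented for illustration.

def should_show_poll(watched_seconds: float, video_seconds: float,
                     required_fraction: float = 0.9) -> bool:
    """Show the poll once the student has watched at least the
    teacher-defined share of the video (e.g. 90% rather than 100%,
    because the last seconds are often skipped)."""
    return watched_seconds / video_seconds >= required_fraction

# A student who watched 11:30 of a 12:20 video (~93%) sees the poll:
print(should_show_poll(watched_seconds=690, video_seconds=740))  # True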
After the students have participated in the polls, a detailed analysis is revealed to the teacher (Fig. 2). First, the question is displayed. Below the question, each possible answer is represented by a bar whose length indicates how often the corresponding answer was selected by the students. Below that, the individual answers of the students are listed: in addition to the students' names, the exact time of answering and the selected answer are printed. The students' explanations of their answers are only shown if the teacher has asked for them while creating the poll.
Figure 2: Analysis of a poll
Attention-Profiling Algorithm
During this study, the attention of the students is evaluated using the functionalities of the Attention-Profiling Algorithm (Wachtler & Ebner 2014b) of LIVE. This algorithm consists of two parts: on the one hand, a detailed record of the watched timespans of the students, and on the other hand, the calculation of a so-called attention-level.
A detailed monitoring of the watched timespans is done for each student watching a video. To cover the whole
watching-history of a student the following values are recorded for each video (Wachtler & Ebner 2014b):
• Absolute time of joining (e.g. 2017-08-11 13:45:27)
• Relative time of joining (e.g. 0 minutes 0 seconds)
• Absolute time of leaving (e.g. 2017-08-11 13:48:49)
• Relative time of leaving (e.g. 3 minutes 22 seconds)
Based on the values recorded, it is possible to track students' activities (e.g., when they paused the video or used the 'seeking' function) during the video playback. Additionally, the watching-history of each student can be visualized (Fig. 3): red bars on the timeline of the video mark the watched periods, and hovering over a bar with the mouse pointer displays additional information. The recorded values of joining and leaving are displayed in both absolute and relative form. Furthermore, the lengths of the watched and the joined timespans are shown. These lengths sometimes differ because the watched timespan is calculated from the relative values, whereas the joined timespan is derived from the absolute values. Finally, the attention-level for each timespan is shown (see below).
Figure 3: The visualization of the recorded watched timespans
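Using the four recorded values from the list above, the two lengths could be derived as in this sketch. The field names are assumptions; LIVE's internal representation may differ.

from datetime import datetime

def span_lengths(record):
    """Derive the two lengths shown in the visualization for one record:
    the watched timespan (from the relative values) and the joined
    timespan (from the absolute values)."""
    watched = record["rel_leave"] - record["rel_join"]            # seconds of video
    joined = (record["abs_leave"] - record["abs_join"]).total_seconds()
    return watched, joined

record = {
    "abs_join": datetime(2017, 8, 11, 13, 45, 27),
    "abs_leave": datetime(2017, 8, 11, 13, 48, 49),
    "rel_join": 0,        # 0 minutes 0 seconds
    "rel_leave": 202,     # 3 minutes 22 seconds
}
# Here both lengths are 202 s; a pause inside the video would make the
# joined timespan longer than the watched one.
print(span_lengths(record))  # (202, 202.0)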
These recorded values of joining and leaving enable further possibilities of evaluation (Wachtler & Ebner 2017). First, a list of all attending students is presented to the teacher (Wachtler & Ebner 2014a). This list shows the names of the students and the amount of the video they have watched, both as a timespan and as a percentage. Furthermore, a visualization in the form of a line diagram shows the timeline of a video along the x-axis and the number of students watching the video along the y-axis. This
information allows the teacher to see how viewing is distributed over the course of the video and to identify which parts of the video are watched most often and which the least.
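The data behind such a line diagram could plausibly be derived from the recorded timespans as follows; this is a sketch assuming per-second resolution and spans given as relative join/leave seconds.

def viewers_per_second(spans, duration):
    """Count, for every second of the video, how many students watched it;
    `spans` holds (rel_join, rel_leave) pairs in seconds."""
    counts = [0] * (duration + 1)
    for rel_join, rel_leave in spans:
        for second in range(rel_join, rel_leave + 1):
            counts[second] += 1
    return counts

# Three students: two watch everything, one stops after 4 seconds.
print(viewers_per_second([(0, 9), (0, 9), (0, 4)], duration=9))
# [3, 3, 3, 3, 3, 2, 2, 2, 2, 2]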
As mentioned above, the second part of the Attention-Profiling Algorithm is the calculation of an attention-level. This is a value ranging from 0% (completely absent) to 100% (fully attentive) which indicates how attentive a single student was while watching the video. The calculation of this value is based on the reaction times to the interactive components: in general, the attention-level decreases as the reaction time increases. Since automatic and, therefore, simple questions can be solved much more quickly than more complex interactive components such as multiple-choice questions, the reaction time needed for each type of interaction affects the attention-level in a different way. The attention-level is presented to the teacher in combination with the watched timespans (Fig. 3) and can be used as a basic overview of how seriously students watched the video.
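The exact calculation is specified in Wachtler & Ebner (2014b); the sketch below merely illustrates the stated principle that longer reaction times lower the level and that each interaction type is weighted differently. All constants are invented and do not reproduce LIVE's numbers.

# Illustrative only: each interaction type tolerates a different reaction
# time before the attention-level starts to drop (constants are invented).
EXPECTED_SECONDS = {"simple": 10, "captcha": 15, "multiple_choice": 60}

def attention_level(reactions):
    """Average a per-reaction score: 100% within the expected time,
    decreasing towards 0% as the reaction takes longer."""
    scores = []
    for kind, seconds in reactions:
        expected = EXPECTED_SECONDS[kind]
        scores.append(100.0 * min(1.0, expected / max(seconds, expected)))
    return sum(scores) / len(scores)

# One student answers within seconds, another needs 90 s for one question.
print(attention_level([("multiple_choice", 7)]))   # 100.0
print(attention_level([("multiple_choice", 90)]))  # ~66.7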
Course Design
Within the course Building Materials Basics - Laboratory Practicals at Graz University of Technology, LIVE was used for the second time for teaching and evaluating the contents of the course. During this semester, a focus was placed on the newly-developed poll function.
Because of this newly-developed poll function, the concept introduced in last year's course was adapted. The course is divided into theoretical and practical parts. The contents of the theoretical part were presented in typical, individual lectures. This theoretical part was subdivided into the following subject areas:
1. Aggregate
2. Binders 1
3. Binders 2
4. Fresh Concrete
5. Hardened Concrete 1
6. Hardened Concrete 2
7. Steel
8. Synthetic Materials
The sequence of the course Building Materials Basics - Laboratory Practicals is shown in Figure 4.
Figure 4: Sequence of the course Building Materials Basics - Laboratory Practicals
In the practical part of the course, material experiments were demonstrated to the students at the laboratories of the Institute of Technology and Testing of Construction Materials. Under the supervision of the teachers, the students were then required to carry out the respective material experiments themselves. The students were divided into 16 groups (two groups per subject area). The initial goal was to create two videos (created by different groups) for each subject area and later carry out a comparative evaluation of the videos using the newly-developed poll function in LIVE. LIVE was also used to query and assess the theoretical content of the subject areas with multiple-choice questions. After the submission of the students' videos, the aforementioned multiple-choice questions were inserted into the videos by the teachers (Fig. 4).
While creating the videos, students were required to fulfil the following tasks:
• Carry out the material experiments
• Record the implementation of the material experiments
• Describe the material experiments by voice over
• Combine recorded footage and voice over
• Edit the videos
• Submit the videos
Figure 5: Section of an implemented video (Subject area Aggregate) in LIVE
The applicability of LIVE to a large number of participants was proven in the course held in the 2016 spring term, in which 350 participants were taught and evaluated using LIVE. Due to curriculum adaptions, about 150 students enrolled in the course this year.
After the videos had been uploaded to LIVE, students with assigned usernames were able to watch all videos as well as answer the multiple-choice questions and participate in the polls implemented after each video. By inserting the multiple-choice questions at regular intervals within the video sequences, not only the accuracy of the answers but also the students' attention-levels could be evaluated. The Attention-Profiling Algorithm of LIVE encourages students to view the whole video in one continuous workflow. The evaluation of the attention-level was made possible by measuring the time between the appearance of a multiple-choice question on the screen and the timepoint at which the student answered it. At the end of the videos, the polls for the respective video comparisons could be completed.
Evaluation
This section presents the results of the lectures for the 2017 spring term by evaluating the data generated by LIVE.
Evaluation of the Attention-Level
The illustration (Fig. 6) shows, as an example, the evaluation of the level of attention paid by two individually selected students. To compare the degree of attention paid by these two students, the video from the topic 'Binders 1' was selected. As already mentioned, two different groups each needed to submit one video for the same subject area. After their submission, the teachers combined the two videos into one video and inserted the four multiple-choice questions into the combined video.
The total duration of this video is 12 minutes and 20 seconds and it includes, like all other videos, four multiple-choice questions. The analysis of the attention-levels indicated that student 1 had a 77% attention-level and student 2, 100%.
The difference in attention-levels is due to the fact that, after the first multiple-choice question (Q1) appeared (i.e., at absolute time 2 minutes 19 seconds), student 1 required 1 minute and 30 seconds to answer (A1), whereas student 2 required only 7 seconds (Fig. 6). Due to these different answering times, the subsequent multiple-choice questions (Q2, Q3 and Q4) no longer appeared at the same absolute times. Summing the elapsed times between the appearance of the multiple-choice questions and the answers given, student 1 accumulated 8 minutes and 58 seconds, student 2 only 30 seconds. The absolute end time of the video is thus 21 minutes and 18 seconds for student 1 and 12 minutes and 50 seconds for student 2.
Figure 6: Comparison of attention-levels between two students
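This arithmetic can be checked directly: the time spent answering shifts the absolute timeline while the video length stays fixed.

from datetime import timedelta

video = timedelta(minutes=12, seconds=20)
answering = {"student 1": timedelta(minutes=8, seconds=58),
             "student 2": timedelta(minutes=0, seconds=30)}

for student, paused in answering.items():
    # Absolute end time = video length + total time spent answering.
    print(student, "finishes after", video + paused)
# student 1 finishes after 0:21:18
# student 2 finishes after 0:12:50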
In summary, the degree of attention is very high for most students. The calculated average of the attention-levels per video is between 94 and 97 percent, and only a small number of students had an attention-level below 70 percent. This suggests that the attention-level monitoring increases the students' motivation to watch the entire video.
Evaluation of Multiple-Choice Questions
Using LIVE, it was possible to evaluate the accuracy of each student's answers to the multiple-choice questions. The following figure (Fig. 7) shows the individual questions (four questions per video) and the students' answers. The height of the bars indicates the number of students who answered each question. This number is not always the same, because certain students stopped watching the videos before the end and, thus, some multiple-choice questions were not answered. The difficulty of the questions can be derived from the bars (green bars show the correct answers and red bars the incorrect ones). According to these results, question 1 (Q1) from the subject area 'Aggregate' was the most difficult one: it was answered correctly by 79 students and incorrectly by 86 students.
Figure 7: Evaluation of multiple-choice questions for each subject area
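A per-question difficulty measure of this kind could be computed from the stored answers as in this sketch. Only the counts for Q1 come from the text; the other counts are invented placeholders.

def difficulty_ranking(answers):
    """Rank questions by their share of incorrect answers.
    `answers` maps a question id to (correct, incorrect) counts."""
    return sorted(answers.items(),
                  key=lambda item: item[1][1] / sum(item[1]),
                  reverse=True)

# Subject area 'Aggregate': Q1 from the text, Q2-Q4 invented for illustration.
aggregate = {"Q1": (79, 86), "Q2": (140, 25), "Q3": (120, 45), "Q4": (130, 35)}
print(difficulty_ranking(aggregate)[0])  # ('Q1', (79, 86)) -> hardest question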
Evaluation of the Polls
At the end of each video, polls with comparative questions about the video content and quality were implemented. The polls included the following questions:
• In which video are the standards more completely mentioned and better explained?
• In which video is the materials testing information more structured and clearer?
• In which video is the goal of materials testing clearer?
• In which video are the results of the materials testing presented more comprehensibly?
• Which video has better quality in terms of picture and sound?
Figure 8: Comparison of poll results for each subject area
Figure 8 shows, along the y-axis, the number of attendees who answered each of the comparative questions. This number is not always the same, because students could have stopped watching the video before the poll appeared, in which case the poll at the end of the video was not completed. The eight videos, divided into the subject areas, are shown along the x-axis. As already mentioned, two different student groups each submitted a video on the same subject area. The bars show a similar tendency in the students' responses to all comparative questions.
It is striking that, for all five questions, the students always favoured the same video. For example, in the subject area 'Aggregate', video 1 was rated better for all questions, and in the subject area 'Binders 1', video 2 was always rated better.
Discussion
The results of the analysis from LIVE and the results of grading the students enrolled in the course Building Materials Basics - Laboratory Practicals indicated that the students achieved a better grade point average after the introduction of LIVE than before.
This observed increase in learning efficiency can be attributed, on the one hand, to the development of the videos and, on the other hand, to the application of attention-level monitoring and of the poll function in LIVE. Because the students were aware that the teachers were analysing their attention-levels to see whether they watched the videos in a single sequence and answered the questions within a normal timeframe, they were strongly motivated to watch the videos to the end and in a single sitting.
Furthermore, the students were informed early on in the Building Materials Basics - Laboratory Practicals course that two different groups would have to submit one video for the same subject area. The teachers also told them that they would combine the two videos into a single video and compare them using the poll function. The students were also told that groups whose videos received higher ratings would get extra points, while groups whose videos received lower ratings would lose points.
Therefore, the students' motivation was raised by asking them comparative questions about the videos produced. These comparative questions motivated the students to raise the quality of the videos their group produced - in terms of teaching content as well as picture and sound quality - above that of the competing group.
Outlook
To increase the benefits of using LIVE still further, a plan has been developed to embed the interactive videos provided by LIVE in the learning management system of the university, which is based on Moodle. To this end, an LTI (Learning Tools Interoperability) provider has been implemented for LIVE. With this feature, students will no longer be required to register and authenticate at LIVE, because this is done automatically by the LTI provider through Moodle.
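For context, an LTI 1.1 launch is an OAuth 1.0a-signed POST from Moodle to the provider; the provider recomputes the signature and, if it matches, trusts the user data in the request. The sketch below shows only that standard signature computation, not the actual code of LIVE's LTI provider.

import base64, hashlib, hmac
from urllib.parse import quote

def lti_signature(method, url, params, consumer_secret):
    """OAuth 1.0a HMAC-SHA1 signature as used by LTI 1.1 launches.
    `params` holds the POST fields except 'oauth_signature'."""
    enc = lambda s: quote(str(s), safe="~")
    pairs = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base = "&".join([method.upper(), enc(url), enc(pairs)])
    key = enc(consumer_secret) + "&"  # no token secret in an LTI 1.1 launch
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# The provider compares its own result with the transmitted signature,
# e.g. via hmac.compare_digest(expected, received), before signing the user in.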
Conclusion
Using LIVE, the attention-levels of students while watching videos, as well as the quality of the videos submitted by students as a class requirement, can be evaluated. Although LIVE was not developed as an assessment tool for students, monitoring the students' attention-levels - as measured by the time required to answer the questions that appear in pop-up windows during the video and must be clicked away - seems to increase the learning effect experienced while watching the videos. By implementing exam questions in these pop-up windows within the videos, an assessment of the students' understanding of the teaching content could also be performed. With the latest extension of LIVE (i.e., the poll function), individual polls can be appended to the end of videos and evaluated. During the course, comparative questions about the submitted videos were compiled and included in the course assessment.
Due to the application of LIVE in the Building Materials Basics - Laboratory Practicals course, the didactic concept was improved on the basis of the obtained evaluations. In addition to the theoretical input of the teachers and the presentation of the material experiments, the time required to evaluate the practical part of the exercise has been optimized by the application of LIVE since 2016. With the newly-created overall concept of Building Materials Basics - Laboratory Practicals, which is one of the most important lectures attended by future architects and civil engineers at Graz University of Technology, an intensive examination of the course contents was achieved. This strategy demonstrably increased the students' basic understanding of the teaching content by intensifying their engagement with the videos, raising their attention-levels, and increasing their motivation to create higher-quality videos through the comparative assessment of the videos.
References
Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. British Journal of Educational Technology, 31(3), 229-241.
Ebner, M., & Holzinger, A. (2003). Instructional Use of Engineering Visualization: Interaction Design in e-Learning for Civil
Engineering. Human–computer interaction, theory and practice, 1, 926-930.
Ebner, M., Wachtler, J., & Holzinger, A. (2013). Introducing an information system for successful support of selective attention
in online courses. In Universal Access in Human-Computer Interaction. Applications and Services for Quality of Life (pp. 153-
162). Springer Berlin Heidelberg.
Ebner, M., Lackner, E., & Kopp, M. (2014, October). How to MOOC? A pedagogical guideline for practitioners. In The International Scientific Conference eLearning and Software for Education (Vol. 4, p. 215). "Carol I" National Defence University.
Haintz, C., Pichler, K., & Ebner, M. (2014). Developing a Web-Based Question-Driven Audience Response System Supporting
BYOD. J. UCS, 20(1), 39-56.
Heinze, H. J., Mangun, G. R., Burchert, W., Hinrichs, H., Scholz, M., Münte, T. F., ... & Hillyard, S. A. (1994). Combined spatial
and temporal imaging of brain activity during visual selective attention in humans. Nature.
Helmerich, J., & Scherer, J. (2007). Interaktion zwischen Lehrenden und Lernenden in medienunterstützten Veranstaltungen. In Neue Trends im E-Learning (pp. 197-210). Physica-Verlag HD.
Khalil, H., & Ebner, M. (2013). Interaction Possibilities in MOOCs–How Do They Actually Happen. In International Conference
on Higher Education Development (pp. 1-24).
Moran, J., & Desimone, R. (1985). Selective attention gates visual processing in the extrastriate cortex. Frontiers in cognitive
neuroscience, 229, 342-345.
Shiffrin, R. M., & Gardner, G. T. (1972). Visual processing capacity and attentional control. Journal of experimental psychology,
93(1), 72.
Wachtler, J., & Ebner, M. (2014a, June). Support of Video-Based lectures with Interactions-Implementation of a first prototype.
In World Conference on Educational Multimedia, Hypermedia and Telecommunications (Vol. 2014, No. 1, pp. 582-591).
Wachtler, J., & Ebner, M. (2014b). Attention Profiling Algorithm for Video-Based Lectures. In Learning and Collaboration
Technologies. Designing and Developing Novel Learning Experiences (pp. 358-367). Springer International Publishing.
Wachtler, J., & Ebner, M. (2015, June). Impacts of interactions in learning-videos: A subjective and objective analysis. In
EdMedia: World Conference on Educational Media and Technology (Vol. 2015, No. 1, pp. 1611-1619).
Wachtler, J., Khalil, M., Taraghi, B., & Ebner, M. (2016a). On Using Learning Analytics to Track the Activity of Interactive MOOC Videos. In Proceedings of the Workshop on Smart Environments and Analytics in Video-Based Learning (SE@VBL), LAK2016.
Wachtler, J., Hubmann, M., Zöhrer, H., & Ebner, M. (2016b). An analysis of the use and effect of questions in interactive
learning-videos. Smart Learning Environments, 3(1), 13.
Wachtler, J., & Ebner, M. (2017, June). On Using Interactivity to Monitor the Attendance of Students at Learning-Videos. In
EdMedia: World Conference on Educational Media and Technology (pp. 356-366). Association for the Advancement of
Computing in Education (AACE).