Originally published in: Wachtler, J., Scherz, M. & Ebner, M. (2018). Increasing Learning Efficiency and Quality of Students'
Homework by Attendance Monitoring and Polls at Interactive Learning Videos. In Proceedings of EdMedia: World Conference
on Educational Media and Technology (pp. 1337-1347). Amsterdam, Netherlands: Association for the Advancement of
Computing in Education (AACE).
Increasing Learning Efficiency and Quality of Students' Homework by
Attendance Monitoring and Polls at Interactive Learning Videos
Josef Wachtler
Educational Technology
Graz University of Technology
Austria
josef.wachtler@tugraz.at
Marco Scherz
Working Group Sustainable Construction
Institute of Technology and Testing of Construction Materials
Graz University of Technology
Austria
marco.scherz@tugraz.at
Martin Ebner
Educational Technology
Graz University of Technology
Austria
martin.ebner@tugraz.at
Abstract: Due to the fact that students are confronted with a growing amount of texts, colours,
figures and shapes and due to their ability to process only a limited amount of such information
simultaneously, it seems obvious that efforts should be made to increase the students'
attention-levels. This is important because research results have indicated that selective attention is
considered as the most valuable resource in the process of human learning. The application of
interaction and communication to the process of learning is a useful strategy to direct the students’
attention. It seems to be obvious that this is also true for learning videos. Therefore, this work
contains a description of how a video platform with interactive components can be used to support
the students and teacher. The video platform is explained and evaluated by analysing its usage in a
large teaching course at an institution of higher education. The application of this strategy
improved the students' performance and optimized the teacher's workload.
Introduction
It is widely known that students’ attention is heavily influenced by both interaction and communication. Therefore,
it is vital to offer many different forms of such methods in online courses. To achieve the highest benefit, interaction
and communication should be used in many forms and in all directions. This means that, on the one hand, the
methods offered should range from simple e-mail and face-to-face communication or forums to new interactive
learning methods. On the other hand, interaction and communication should take place not only between the teacher
and students, but the students should also communicate and interact with the content itself (Carr-Chellman &
Duchastel 2000) (Ebner & Holzinger 2003).
Furthermore, it is important to note that, in general, students are confronted daily with many pieces of text, figures,
colours and shapes. The number of such elements is also growing. It is obvious that students are only able to process
a limited amount of information at once (Shiffrin & Gardner 1972) and, therefore, most of this information is
filtered out centrally (Moran & Desimone 1985). It has been pointed out that the most crucial resource in human
learning is a mechanism called selective attention (Heinze et al. 1994). This seems to clearly indicate that supporting
and analysing this attention is of high importance (Ebner et al. 2013).
The attention analysis should be used to evaluate many aspects of the course as well as of the attending students.
Initially, the teacher can gather information about the depth of the students’ understanding. Furthermore, such an
analysis should be helpful, allowing the way of presenting the content to be adapted to the target audience and
to evaluate whether the presented content is suitable for the attending students (Helmerich & Scherer 2007).
Based on these insights on interaction, communication and attention, it seems necessary to apply such
techniques to learning videos. This is also motivated by the growing number of MOOCs (Massive Open Online
Courses), which are mainly based on videos (Khalil & Ebner 2013) (Ebner et al. 2014). For this reason, this
publication aims to point out a possibility of increasing the efficiency and quality of the students’ homework using
learning videos. In other words, the investigated research question is: "How can interactive components
added to learning videos be used to support the work of students and analyse this work?"
In the first section of this paper, the interactive video platform used for the course is explained. This is followed by a
description of the course itself (see Section Course Design). Subsequently, the work as well as the results of the
evaluation of the students’ attention-levels is discussed.
Interactive Video Platform
A video platform that offers the possibility to enrich videos with different methods of interactivity is used. This
platform was first introduced by Ebner et al. (2013) and is named LIVE (Live Interaction in Virtual learning
Environments). From the students' point of view, the major benefit of this platform is the fact that it supports their
attention through the use of interactive components during the video. Additionally, the teachers are able to monitor
as well as to evaluate the students’ performance in great detail.
For this reason, LIVE is only available to registered and authenticated users. To manage the privileges of the
users, there are four different types of roles (Wachtler & Ebner 2014a)
(Wachtler & Ebner 2017):
• Ordinary students are only able to watch the videos, to participate by answering the interactive questions and
to analyse their own performance.
• Users with teacher-privileges are additionally allowed to create video events with interactions. Moreover,
they can evaluate the performance of those who attend the events they create.
• Researchers have the clearance to download all the data generated by LIVE in the form of spreadsheets.
• Administrators manage the privileges of the users and change the settings of the web-platform.
The primary functionality of LIVE is the addition of interactive features to videos or live-broadcastings. It can be
seen (Fig. 1) that such interactive components are displayed on top of a video. In this example, a multiple-choice
question is shown to the students. If such an interaction is presented, the video automatically pauses, and it is not
possible to continue watching until the student reacts to the interaction. In the case of this example, the question
must either be answered or the student refuses to answer by clicking the corresponding button. Furthermore,
students can trigger interactions by themselves. The control elements of such interactions are shown on the right-
hand side of the video. Above these control elements, the students are presented with a feedback regarding their
level of attention (see Section Attention-Profiling Algorithm) along with a button that, if clicked, allows them to
stop watching the video.
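A minimal sketch of this pause-until-reaction behaviour (the class and method names are hypothetical; LIVE's actual implementation is not published):

```python
# Sketch of the pause-until-reaction behaviour described above.
# Hypothetical class; not LIVE's actual API.

class InteractiveVideo:
    def __init__(self, interactions):
        # interactions: {position_in_seconds: question_text}
        self.interactions = dict(sorted(interactions.items()))
        self.position = 0
        self.pending = None  # interaction currently blocking playback

    def play_until(self, target):
        """Advance playback, pausing at the next interaction."""
        if self.pending is not None:
            return self.position  # blocked until the student reacts
        for pos, question in self.interactions.items():
            if self.position < pos <= target:
                self.position = pos
                self.pending = question
                return self.position  # video pauses here
        self.position = target
        return self.position

    def react(self, answer=None):
        """Answer the interaction (or refuse it); playback may resume."""
        reacted_to, self.pending = self.pending, None
        return reacted_to

video = InteractiveVideo({139: "Q1: a multiple-choice question"})
video.play_until(300)  # pauses at second 139, Q1 is displayed
video.play_until(300)  # still 139: the video cannot continue
video.react("answer b")
video.play_until(300)  # now reaches second 300
```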
LIVE offers a wide range of possibilities to confront the students with varying types of interactive components. The
teacher can add the following interactive components to the video while creating it (Wachtler & Ebner 2014a)
(Wachtler et al. 2016b) (Wachtler & Ebner 2017):
• Simple Questions
o general questions which are not related to the content of the video
o random and automatic
o useful for bridging a longer phase during which no content-related questions are asked
• Solve CAPTCHAs
o used for the same reasons as the simple questions above
Figure 1: A video is interrupted by a multiple-choice question
• Text-Based Questions
o the teacher can ask text-based questions to the students
o at live-broadcastings, she/he could enter the question into a textbox and send it to the students
instantly
o in videos, she/he must place such a question at a specific position in the video before releasing it
• Multiple-Choice Questions
o real multiple-choice questions or true/false questions
o before deploying the video, the teacher could add questions of this type at pre-defined timepoints
throughout the video
In addition, the teacher can encourage the students' participation by offering them the possibility to trigger
interactions themselves. She/He can choose from the following methods of interaction (Wachtler & Ebner 2014a) (Wachtler et
al. 2016b) (Wachtler & Ebner 2017):
• Ask Teacher
o the attendees can ask a question to the teacher by entering it in a textbox
o to answer the question, the teacher could use a specific dialog or she/he could send it by e-mail
• Report a Technical Problem
o the students can report a technical problem to the teacher via a dialog
o this feature is mainly used during live-broadcastings to report problems with the video feed
• Set Attention
o with the help of a slider, the watchers can indicate their current level of attention
Post-Video Polls
LIVE also provides some interactivity after watching the videos in addition to the interactive components used
during the playback of the videos. The teacher can add polls which are displayed immediately after students stop
watching the video. While creating a poll, the teacher needs to enter a question and some answers. Furthermore,
she/he can enter a text that encourages the students to leave an explanation for their answer. Finally, the teacher
needs to specify how much of the video the students should have watched to display the poll. This step is necessary
for two reasons. First, if something is asked in the poll which requires knowledge of the whole video, it makes no
sense to display the poll to students who stop the video before it is finished. Second, it is also not helpful to show the
poll only at the end of the video, due to the fact that students often skip the last seconds of a video (e.g., lack of interest
in reading the credits).
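The threshold described above can be sketched as follows (a minimal illustration with hypothetical names; the default of 90% is an assumption, not LIVE's actual value):

```python
# Sketch: decide whether a post-video poll is shown when a student
# stops watching. Names and the 0.9 default are assumptions.

def should_show_poll(watched_seconds, video_seconds, threshold=0.9):
    """Show the poll only to students who watched at least `threshold`
    of the video; a threshold below 1.0 keeps the poll visible for
    students who skip the last seconds (e.g., the credits)."""
    return watched_seconds / video_seconds >= threshold

video_length = 12 * 60 + 20                  # a 12 min 20 s video
print(should_show_poll(700, video_length))   # True  (700/740 is about 0.95)
print(should_show_poll(400, video_length))   # False (stopped halfway)
```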
After the students have participated in the polls, a detailed analysis is revealed to the teacher (Fig. 2). It can be seen
that at first the question is displayed. Below the question, a bar is shown for each possible answer; the lengths
of the bars indicate how often the corresponding answer was selected by the students. Below that, the individual
answers of the students are listed: in addition to a student's name, the exact time at which the question was
answered and the selected answer are printed. Furthermore, the students' explanations of their answers are
only shown if the teacher has asked for them while creating the poll.
Figure 2: Analysis of a poll
Attention-Profiling Algorithm
During this study, the attention of the students is evaluated. For that, the functionalities of the Attention-Profiling
Algorithm (Wachtler & Ebner 2014b) of LIVE are used. This algorithm consists of two parts: On the one hand there
is a detailed record of the watched timespans of students and on the other hand there is the calculation of a so-called
attention-level.
A detailed monitoring of the watched timespans is done for each student watching a video. To cover the whole
watching-history of a student the following values are recorded for each video (Wachtler & Ebner 2014b):
• Absolute time of joining (e.g. 2017-08-11 13:45:27)
• Relative time of joining (e.g. 0 minutes 0 seconds)
• Absolute time of leaving (e.g. 2017-08-11 13:48:49)
• Relative time of leaving (e.g. 3 minutes 22 seconds)
Based on the values recorded, it is possible to track students' activities (e.g., when they paused the video or used
the 'seeking' function) during the video playback. Additionally, it is possible to visualize the watching-history of
students (Fig. 3): there is a timeline of the video on which red bars mark the watched periods. By hovering over
the bars with the mouse pointer, additional information can be displayed. The recorded values of joining and
leaving are displayed in both absolute and relative form. Furthermore, the lengths of the watched and the joined
timespans are shown. These lengths sometimes differ because the watched timespan is calculated from the
relative values whereas the joined timespan is derived from the absolute values. Finally, the attention-level for
each timespan is shown (see below).
Figure 3: The visualization of the recorded watched timespans
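The difference between the watched and the joined timespan can be illustrated with a small sketch using the example values given above (the record layout is hypothetical; the real data model is not published):

```python
from datetime import datetime

# One recorded (join, leave) pair, using the example values above.
# Absolute times are wall-clock timestamps; relative times are
# positions in the video, in seconds. Hypothetical record layout.
record = {
    "abs_join":  datetime(2017, 8, 11, 13, 45, 27),
    "abs_leave": datetime(2017, 8, 11, 13, 48, 49),
    "rel_join":  0,    # 0 min 0 s into the video
    "rel_leave": 202,  # 3 min 22 s into the video
}

# Watched timespan: derived from the relative values (video positions).
watched = record["rel_leave"] - record["rel_join"]

# Joined timespan: derived from the absolute values (wall clock). It
# exceeds the watched timespan whenever the video was paused, e.g.
# while a student answered an interactive question.
joined = (record["abs_leave"] - record["abs_join"]).total_seconds()

print(watched, joined)  # 202 202.0 - equal here, as nothing was paused
```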
These recorded values of joining and leaving enable further evaluations (Wachtler &
Ebner 2017). First, a list of all attending students is presented to the teacher (Wachtler & Ebner 2014a). This
list shows the names of the students and how much of the video each of them has watched, both as a timespan
and as a percentage. Furthermore, a visualization in the form of a line diagram shows the
timeline of a video along the x-axis and the number of students watching the video along the y-axis. This
information allows the teacher to see how viewership is distributed over the duration of the video and, thus, to
identify which parts of the video are watched most and least often.
As mentioned above, the second part of the Attention-Profiling Algorithm is the calculation of an attention-level.
This is a value ranging from 0% (completely absent) to 100% (fully attentive) which provides an indication of how
attentive a single student was while watching the video. The calculation of this value is based on the reaction time to
the interactive components. This means, in general, that the attention-level decreases if the reaction time increases. It
is clear that automatic and, therefore, simple questions can be solved much more quickly than more complex
interactive components such as multiple-choice questions. For this reason, the reaction time needed during each type
of interaction affects the attention-level in a different way. The attention-level is presented to the teacher in
combination with the watched timespans (Fig. 3) and can be used as a basic overview of how seriously students
watched the video.
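The paper does not give the exact formula, but the behaviour described (the level falls as the reaction time grows, weighted differently per interaction type) could be sketched like this; the per-type expected times and the scoring function are assumptions:

```python
# Sketch of a reaction-time-based attention-level. The expected reaction
# times per interaction type and the scoring function are assumptions;
# they only mirror the behaviour described in the text.

EXPECTED_SECONDS = {          # tolerated reaction time per type
    "simple_question": 10,    # automatic questions are solved quickly
    "captcha": 15,
    "multiple_choice": 60,    # content questions need more thought
}

def attention_level(reactions):
    """reactions: list of (interaction_type, reaction_time_in_seconds).
    Returns a value between 0.0 (completely absent) and 1.0 (fully
    attentive); the score of each reaction falls as its time grows."""
    if not reactions:
        return 1.0
    scores = [
        min(1.0, EXPECTED_SECONDS[kind] / seconds)
        for kind, seconds in reactions
    ]
    return sum(scores) / len(scores)

print(attention_level([("multiple_choice", 7)]))   # 1.0 (within 60 s)
print(attention_level([("multiple_choice", 90)]))  # about 0.67 (slow)
```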
Course Design
Within the course Building Materials Basics - Laboratory Practicals at Graz University of Technology, LIVE was
used for the second time to teach and evaluate the contents of the course. During this semester, a focus was
placed on the newly-developed poll function.
Due to the newly-developed poll function, the concept introduced in the last year's course, Building Materials
Basics - Laboratory Practicals, was adapted. The course is divided into theoretical and practical parts. The contents
of the theoretical part were presented in typical, individual lectures. This theoretical part was subdivided into the
following subject areas:
1. Aggregate
2. Binders 1
3. Binders 2
4. Fresh Concrete
5. Hardened Concrete 1
6. Hardened Concrete 2
7. Steel
8. Synthetic Materials
The sequence of the course Building Materials Basics - Laboratory Practicals is shown in Figure 4.
Figure 4: Sequence of the course Building Materials Basics - Laboratory Practicals
In the practical part of the course, material experiments were shown to students at the Institute of Technology and
Testing of Construction Materials laboratories. Under the supervision of the teachers, the students were required to
carry out the respective material experiments by themselves. The students were divided into 16 groups (two groups
per subject area). The initial goal was to create two videos (created by different groups) for each subject area and
later carry out a comparative evaluation of the videos using the newly-developed poll function in LIVE. LIVE was
also used to query and assess the theoretical content of the subject areas with multiple-choice questions. After the
submission of the students' videos, the aforementioned multiple-choice questions were inserted into the videos by
the teachers (Fig. 4).
While creating the videos, students were required to fulfil the following tasks:
• Carry out the material experiments
• Record the implementation of the material experiments
• Describe the material experiments by voice-over
• Combine recorded footage and voice-over
• Edit the videos
• Submit the videos
Figure 5: Section of an implemented video (subject area 'Aggregate') in LIVE
The applicability of LIVE to a large number of participants had already been proven in the course held in the
2016 spring term, which included 350 participants who were taught and evaluated using LIVE. Due to curriculum
adaptations this year, about 150 students enrolled in the course.
After the videos had been uploaded to LIVE, students with assigned usernames were able to watch all videos as well
as answer the multiple-choice questions and participate in the implemented polls after each video. By inserting the
multiple-choice questions at regular intervals within the video sequences, not only the accuracy of the answers but
also the students’ attention-levels could be evaluated. The Attention-Profiling Algorithm of LIVE ensures that
students view the whole video as part of a continuous workflow. The evaluation of the attention level was made
possible by measuring the time between the appearance of a multiple-choice question on the screen and the
timepoint at which the student answers it. At the end of the videos, the polls for the respective video comparisons
could be completed.
Evaluation
This section presents the results of the lectures for the 2017 spring term by evaluating the data generated by LIVE.
Evaluation of the Attention-Level
The illustration (Fig. 6) shows, as an example, the evaluation of the attention-levels of two individually-selected
students. To compare the degree of attention paid by these two students, the video on the topic 'Binders 1' was
selected. As already mentioned, two different groups needed to submit one video for the same subject area. After
their submission, the teachers combined the two videos into one video and inserted the four multiple-choice
questions into the combined video.
The total duration of this video is 12 minutes and 20 seconds and includes - like all other videos - four multiple-
choice questions. The analysis of the attention-levels indicated that student 1 had an attention-level of 77% and
student 2 of 100%.
The difference in attention-levels is due to the fact that, after the first multiple-choice question (Q1) appeared (i.e., at
absolute time 2 minutes 19 seconds), student 1 required a time interval of 1 minute and 30 seconds to answer (A1)
and student 2, only 7 seconds (Fig. 6). Due to the differences in these time intervals, the subsequent multiple-choice
questions (Q2, Q3, and Q4) no longer appeared at the same absolute time. By adding the elapsed time between the
appearance of the multiple-choice questions and the answers given, student 1 has an elapsed time of 8 minutes and
58 seconds and student 2, 0 minutes and 30 seconds. The absolute end time of the video for student 1, thus, is 21
minutes and 18 seconds and 12 minutes and 50 seconds for student 2.
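These absolute end times follow from a simple relation: the absolute watching time equals the video duration plus the total time the video was paused for answering. A small check of the figures above:

```python
# Check of the example: absolute end time = video duration + total
# time spent answering the interactive questions (the video pauses).

def absolute_end(video_s, total_answer_s):
    return video_s + total_answer_s

def fmt(s):
    return f"{s // 60} min {s % 60} s"

video = 12 * 60 + 20               # video duration: 12 min 20 s
student1_answers = 8 * 60 + 58     # student 1: 8 min 58 s answering
student2_answers = 30              # student 2: 0 min 30 s answering

print(fmt(absolute_end(video, student1_answers)))  # 21 min 18 s
print(fmt(absolute_end(video, student2_answers)))  # 12 min 50 s
```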
Figure 6: Comparison of attention-levels between two students
In summary, the degree of attention is very high for most students. The calculated average of the attention-levels per
video is between 94 and 97 percent. Only a small number of students had an attention level below 70 percent. This
shows that the attention level monitoring increases the students’ motivation to watch the entire video.
Evaluation of Multiple-Choice Questions
Using LIVE, it was possible to evaluate the accuracy of the answers to the multiple-choice questions for each
student. The following figure (Fig. 7) shows the individual questions (4 questions per video) and the students’
answers. The height of the bars indicates the number of students who answered the questions. The number is not
always the same, because certain students stopped watching the videos before the end and, thus, some multiple-
choice questions were not answered. The difficulty of the questions can be derived from the presentation of the bars
(green bars show the correct answers and the red ones the incorrect answers). According to these results, question 1
(Q1) from the subject area 'Aggregate' was the most difficult one. This question was answered correctly by 79
students and incorrectly by 86 students.
Figure 7: Evaluation of multiple-choice questions for each subject area
Evaluation of the polls
At the end of the video, polls with comparative questions about the video content and quality were implemented.
The poll included the following questions:
• In which video are the standards more completely mentioned and better explained?
• In which video is the materials testing information more structured and clearer?
• In which video is the goal of materials testing clearer?
• In which video are the results of the materials testing presented more comprehensibly?
• Which video has better quality in terms of picture and sound?
Figure 8: Comparison of poll results for each subject area
Figure 8 (along the y-axis) shows the number of attendees who answered each of the comparative questions. The
number is not always the same, because the students could have also stopped watching the video before the poll
appeared and, thus, never reached the polls at the end of the videos. The videos, grouped into the eight subject
areas, are shown along the x-axis. As already mentioned, two different student groups each submitted a video
on the same subject area. The bars show a similar tendency in the student responses for all comparative questions.
It is striking that the same video is always favoured by the students for all five questions. For example, in the subject
area Aggregate, video 1 was rated better for all questions, and in subject area Binders 1, video 2 is always rated
better.
Discussion
The results of the analysis from LIVE and the grading results of the students enrolled in the course Building
Materials Basics - Laboratory Practicals indicated that the students achieved a better grade point average after
using LIVE than before.
This observed increase in learning efficiency can be attributed, on the one hand, to the development of the videos,
but is also positively associated with the application of attention-level monitoring and of the poll function in
LIVE. Because the students were aware that the teachers were analysing their level of attention to see if
they watched the videos in a single sequence and answered the questions within a normal timeframe, they were
strongly motivated to watch the videos to their ends and in a single sitting.
Furthermore, the students were informed early on in the Building Materials Basics - Laboratory Practicals course
that two different groups would have to submit one video for the same subject area. The teachers also told them that
they would combine the two videos into a single video and compare them using the poll function. The students were
also told that student groups whose videos received higher ratings would get extra points and student groups whose
videos received lower ratings would lose points.
Therefore, the students' motivation was raised by asking them comparative questions about the videos produced.
These comparative questions motivated the students to make the videos their group produced - in terms of teaching
content, picture and sound quality - better than the video of the competing group.
Outlook
To increase the benefits of using LIVE still further, a plan has been developed to embed the interactive videos
provided by LIVE in the learning management system of the university, which is based on Moodle. To do so, an LTI
(Learning Tools Interoperability) provider has been implemented for LIVE. With this feature, students will no
longer be required to register and authenticate at LIVE because this will be done automatically by the LTI provider
through Moodle.
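A minimal sketch of how such an LTI launch could map a Moodle user to a LIVE account (the local User class and store are hypothetical; user_id, roles and lis_person_contact_email_primary are standard LTI 1.x launch parameters; verifying the OAuth signature of the launch is assumed to have happened already):

```python
# Sketch: map a verified LTI 1.x launch request to a local user.
# The User class and USERS store are hypothetical stand-ins for LIVE.

class User:
    def __init__(self, lti_user_id, email, is_teacher):
        self.lti_user_id = lti_user_id
        self.email = email
        self.is_teacher = is_teacher

USERS = {}  # lti_user_id -> User

def handle_lti_launch(params):
    """Look up or auto-create the local user for an LTI launch, so
    students no longer register and authenticate at LIVE manually."""
    uid = params["user_id"]  # opaque, consumer-scoped id from Moodle
    if uid not in USERS:
        USERS[uid] = User(
            lti_user_id=uid,
            email=params.get("lis_person_contact_email_primary", ""),
            is_teacher="Instructor" in params.get("roles", ""),
        )
    return USERS[uid]

user = handle_lti_launch({"user_id": "abc123", "roles": "Learner"})
print(user.is_teacher)  # False: launched with the Learner role
```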
Conclusion
Using LIVE, the attention-levels of students while watching videos, as well as the quality of the videos submitted by
students as a class requirement, can be evaluated. Although LIVE was not developed as an assessment tool for
students, the results of the analysis of the students’ attention-levels - as measured by the time required to answer
questions that appear in pop-up windows during the video and which must be clicked away - seem to increase the
learning effect experienced while watching the videos. By implementing exam questions in these pop-up windows
within the videos, an assessment of students' understanding of the teaching content could also be performed. With
the last extension in LIVE (i.e., the poll function), individual polls could be appended and evaluated at the end of
videos. During the course, comparative questions about the videos submitted were compiled and included in the
course assessment.
Due to the application of LIVE in the Building Materials Basics - Laboratory Practicals course, the didactic concept
was improved by the obtained evaluations. In addition to the theoretical input of the teachers, as well as the
presentation of the material experiments, the time required to evaluate the practical part of the exercise has been
optimized by the application of LIVE since 2016. With the application of the newly-created overall concept of the
Building Materials Basics - Laboratory Practicals, which is one of the most important lectures attended by future
architects and civil engineers at Graz University of Technology, an intensive examination of the course contents was
conducted. This strategy could be used to demonstrably increase the students’ basic understanding of the teaching
content by intensifying their engagement with the videos, raising their attention-levels, and increasing their
motivation to create higher-quality videos by comparatively assessing the videos.
References
Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. British Journal of Educational Technology, 31(3), 229-241.
Ebner, M., & Holzinger, A. (2003). Instructional Use of Engineering Visualization: Interaction Design in e-Learning for Civil
Engineering. Human-computer interaction: theory and practice, 1, 926-930.
Ebner, M., Wachtler, J., & Holzinger, A. (2013). Introducing an information system for successful support of selective attention
in online courses. In Universal Access in Human-Computer Interaction. Applications and Services for Quality of Life (pp. 153-
162). Springer Berlin Heidelberg.
Ebner, M., Lackner, E., & Kopp, M. (2014, October). How to MOOC? - A pedagogical guideline for practitioners. In The
International Scientific Conference eLearning and Software for Education (Vol. 4, p. 215). "Carol I" National Defence
University.
Haintz, C., Pichler, K., & Ebner, M. (2014). Developing a Web-Based Question-Driven Audience Response System Supporting
BYOD. J. UCS, 20(1), 39-56.
Heinze, H. J., Mangun, G. R., Burchert, W., Hinrichs, H., Scholz, M., Münte, T. F., ... & Hillyard, S. A. (1994). Combined spatial
and temporal imaging of brain activity during visual selective attention in humans. Nature.
Helmerich, J., & Scherer, J. (2007). Interaktion zwischen lehrenden und lernenden in medien unterstützten veranstaltungen. In
Neue Trends im E-Learning (pp. 197-210). Physica-Verlag HD.
Khalil, H., & Ebner, M. (2013). Interaction Possibilities in MOOCs: How Do They Actually Happen? In International Conference
on Higher Education Development (pp. 1-24).
Moran, J., & Desimone, R. (1985). Selective attention gates visual processing in the extrastriate cortex. Frontiers in cognitive
neuroscience, 229, 342-345.
Shiffrin, R. M., & Gardner, G. T. (1972). Visual processing capacity and attentional control. Journal of experimental psychology,
93(1), 72.
Wachtler, J., & Ebner, M. (2014a, June). Support of Video-Based Lectures with Interactions: Implementation of a first prototype.
In World Conference on Educational Multimedia, Hypermedia and Telecommunications (Vol. 2014, No. 1, pp. 582-591).
Wachtler, J., & Ebner, M. (2014b). Attention Profiling Algorithm for Video-Based Lectures. In Learning and Collaboration
Technologies. Designing and Developing Novel Learning Experiences (pp. 358-367). Springer International Publishing.
Wachtler, J., & Ebner, M. (2015, June). Impacts of interactions in learning-videos: A subjective and objective analysis. In
EdMedia: World Conference on Educational Media and Technology (Vol. 2015, No. 1, pp. 1611-1619).
Wachtler, J., Khalil, M., Taraghi, B., & Ebner, M. (2016a). On Using Learning Analytics to Track the Activity of Interactive
MOOC Videos. In Proceedings of the workshop on Smart Environments and Analytics in Video-Based Learning (SE@VBL),
LAK2016.
Wachtler, J., Hubmann, M., Zöhrer, H., & Ebner, M. (2016b). An analysis of the use and effect of questions in interactive
learning-videos. Smart Learning Environments, 3(1), 13.
Wachtler, J., & Ebner, M. (2017, June). On Using Interactivity to Monitor the Attendance of Students at Learning-Videos. In
EdMedia: World Conference on Educational Media and Technology (pp. 356-366). Association for the Advancement of
Computing in Education (AACE).