International Journal of Emerging Technologies in Learning (iJET) | eISSN: 1863-0383 | Vol. 18 No. 24 (2023)
Dohr, D., Wachtler, J., Ebner, M. (2023). Analysis of Students’ Behavior Watching iMooX Courses with Interactive Elements. International Journal of
Emerging Technologies in Learning (iJET), 18(24), pp. 4–18. https://doi.org/10.3991/ijet.v18i24.46455
Article submitted 2023-09-01. Revision uploaded 2023-10-27. Final acceptance 2023-10-28.
© 2023 by the authors of this article. Published under CC-BY.
Online-Journals.org
PAPER
Analysis of Students’ Behavior Watching iMooX Courses
with Interactive Elements
ABSTRACT
Digital learning technologies are becoming increasingly important for our modern educational system. In addition to teaching methods that incorporate interactivity, these approaches benefit students' overall learning experience and success by enhancing their attention and fostering a positive attitude towards the learning content being presented. Interactivity comes in various forms, and while a combination of distinct activities is beneficial, some are more effective at engaging students. Using digital technologies in an educational environment opens up new possibilities for students, teachers, and researchers. It provides new insights into learning behavior and enables the collection of interaction information. This data could, for example, show how often a video was paused or at what point students lost interest and left, but gaining such knowledge requires further processing. The use of visualizations that depict behavior, such as the change of attention over time, can be an effective way to present extracted information. Therefore, our research focuses on developing an application that enables us to generate various visualizations from the collected data. A single command-line input will be sufficient to create them. Furthermore, a video course was created from which we collected behavioral data. Our results aim to showcase the benefits of interactivity, and that the created figures can be used for data evaluation verifies the versatility of the generated visualizations.
KEYWORDS
e-education, visualization, learning behavior, massive open online courses (MOOCs), teaching
Daniel Dohr, Josef Wachtler, Martin Ebner
Educational Technology, Graz University of Technology, Graz, Austria
mebner@gmx.at

INTRODUCTION

The use and importance of digital learning technologies in our modern society's education are steadily increasing because they enable lecturers to combine effective teaching methods with interactive videos. The use of digital environments not only changes the possibilities for teachers to convey their knowledge, but it also improves the means for researchers to gather information and data on students' learning behavior. The data collected through technology-enhanced learning provides valuable insights into the learning process, enabling us to enhance learning efficiency, success, and overall experience [1]. Interactivity plays a major role in modern teaching methods, especially when conveying complex material to an audience. As stated in [2], students benefit tremendously from the use of interactive teaching methods, and interactive videos are a powerful tool to facilitate this.
In order to examine the impact of such teaching methods on learning behavior and determine their effectiveness, it is necessary to process the collected data and present it in a form that is easily analyzable. Visual representations are very well suited for this task. They allow us to spot differences immediately and make it possible to compare different videos. Therefore, the development of a tool that generates these diagrams is necessary. Because not all interactions support and benefit students' learning efforts equally, it is necessary to assess their effectiveness with the help of the visualizations. These requirements led us to conduct research on how students' behavior in interactive learning videos can be visualized and analyzed. Therefore, we formulate the following research question: "How can students' behavior in interactive learning videos be visualized and analyzed?"

For this analysis, it is important to first understand the definition of learning behavior and the factors that can influence students' attention in their learning environment.
BACKGROUND
The acronym STEM represents the fields of science, technology, engineering, and mathematics, all disciplines that our modern society deems crucial for progress [3]. STEM education can be viewed as an interdisciplinary approach to impart knowledge in the fields of science, technology, engineering, and mathematics, with the goal of achieving economic and technological progress [4]. The need for high levels of technical literacy among the general population of any state establishes a strong link between industry and education [5]. As described in [6], the demand for workers who are proficient in one of the STEM fields has only increased over the last few decades. This highlights the importance of STEM fields for the global economy.
Massive open online courses (MOOCs) enable teachers from around the globe to offer the learning content they have created to a wide online audience without being limited by location or time constraints. Furthermore, by reducing dependence on the instructor, online learning platforms offer great scalability. Additionally, these platforms provide free access to learning materials, which contributes to their popularity [7].

As described in [8], MOOCs make it possible to enhance "student-to-student" interactions, which improves the retention rate. The concept of MOOCs has a significant impact on education, providing knowledge to users at various levels, ranging from beginner to advanced. Also, different learning types are supported by the use of videos, quizzes, and forums. Furthermore, non-technical knowledge, including rhetoric and writing, is promoted, which indirectly benefits STEM [9]. The versatility of MOOCs makes them a good fit for STEM education. For our research, we created a video series that imparts knowledge to students about a significant computer science concept, object-orientation.
To analyze students' learning behavior, researchers must understand how they learn and what keeps them engaged. Furthermore, it is crucial for the analysis to be aware of the students' needs, learning behaviors, and indicators. The term "learning behavior" is described by [10] as the behavior that a person needs to possess in order to effectively learn the presented content in a group setting or classroom. While the
teacher's role is to cultivate such behavior in their students, we can assume that our audience already possesses a certain degree of such behavior, and we can focus on the rating of the indicators. There are various learning behaviors shown by [11], including communication, independent work, and participation, to name a few. Teachers can support such behaviors to enhance the learning efforts and ultimately improve the learning experience. Every behavior also has observable indicators that help to assess it. However, measuring some of them can be quite challenging. An indicator of motivation, for example, is the willingness to independently research information about a task or topic [10]. Depending on the learning environment, the significance of certain behaviors may vary.
One of the most important attributes of learning is the attention span. It is a prerequisite for learning and can be seen as a state of mental alertness or focus that enables one to concentrate on the presented content. Human attention is selective, and any information considered unnecessary is filtered out [12]. If students can concentrate on the material, it will be easier for them to remember important details and facts later. Naturally, they are unable to maintain their focus over extended periods, and their minds begin to wander [13]. As described in [14], the loss of focus can occur in any learning environment, which can result in a decrease in test scores. Therefore, teaching methods that capture attention and minimize unnecessary distractions are crucial in any classroom.
This is where the concept of interactive teaching comes into play. The teaching process involves the interaction between lecturers and their students to transfer knowledge. As outlined by [15], there are three forms of teaching: the passive method, the active method, and the interactive method. The passive style of teaching focuses on the lecturer, with students primarily learning through listening. In contrast, the active learning method emphasizes student participation in the learning process [15].

As described in [16], auditory learners benefit from MOOCs, demonstrating high learning effectiveness and attention scores. Videos, in general, are a powerful tool for conveying learning material [17, 18]. With the rise of technology-enhanced learning, the possibilities of imparting knowledge have increased dramatically. The authors of [19] describe videos as one of the most significant forms of digital media. The reason for this might be the increased engagement of students when using active teaching methods. A study conducted by [20] showed that students who used interactive videos learned faster than their counterparts who watched traditional videos. Technology-enhanced learning not only benefits students but also researchers. As described in [21], several techniques help researchers assess students' learning behavior.
LIVE
Interaction improves learning for students by directing their attention to essential content. The LIVE platform is a web application that enables lecturers to integrate various types of interactions into their videos. The interactive elements are integrated into the videos, enabling the platform to gather information about students' learning processes. Only registered and authenticated users can access the application. The platform also provides information about student attendance. An algorithm was developed to track student attendance by using recorded timespans of video-watching to calculate an attendance level. The collection of learning data, combined with calculated attendance, allows researchers to use it for assessments and visualizations [22]. The current version of the LIVE platform offers its users four different types of interactions. These types define how, and by whom, the associated interactions can be triggered.
Fig. 1. An iMooX on-demand video with the LIVE user interface
The interactions are categorized by [22] into four types:
1. Automatic interactions: The first type of interaction occurs automatically at random times throughout the videos. The two methods associated with this type are unrelated questions, where students are asked simple questions unrelated to the learning material, and CAPTCHAs.
2. Student-triggered interactions: As the name suggests, students have the ability to initiate this type of interaction. It is accessible through the LIVE user interface depicted in Figure 1. There, students can ask the teacher questions, set their perceived attention level, and report technical issues. The attention level they set is not considered in the platform's calculation of attention.
3. Teacher-triggered interactions: During an event, teachers can use this type of
interaction to ask their students questions in real-time. Currently, there is one
available method where teachers can ask students directly. Their responses are
taken into consideration when calculating attention.
4. Planned interactions: The last type of interaction occurs at predetermined points in the videos. The three methods are multiple-choice questions, open-text questions, and programming tasks. In contrast to automatic interactions, these interactions require information from the presented learning content.
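When processing the exported activity records, the four categories above can be modeled as a small enumeration; a sketch with illustrative names (the platform's actual schema is not documented here):

```python
from enum import Enum

class InteractionType(Enum):
    """The four LIVE interaction categories described above."""
    AUTOMATIC = "automatic"        # unrelated questions, CAPTCHAs
    STUDENT_TRIGGERED = "student"  # questions to the teacher, self-set attention, issue reports
    TEACHER_TRIGGERED = "teacher"  # live questions asked during an event
    PLANNED = "planned"            # multiple-choice, open-text, programming tasks

# Hypothetical mapping from method names in an exported record to their category.
METHOD_CATEGORY = {
    "captcha": InteractionType.AUTOMATIC,
    "unrelated_question": InteractionType.AUTOMATIC,
    "multiple_choice": InteractionType.PLANNED,
    "programming_task": InteractionType.PLANNED,
}
```

Such a lookup keeps the category logic in one place when tallying interactions later.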
To effectively visualize the participants' learning behavior, we require a diverse range of information regarding their interactions with the provided learning content. The LIVE platform records student activity and saves it in separate files. Each file contains a unique set of data that is relevant to our research. Because LIVE separates the data into distinct files, we can load only the information we need, saving time during the diagram creation process. Notably, all the available files adhere to a naming convention. Therefore, the file download and subsequent processing are simple. All we need to do is include the course number in the file name.
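Since the files follow a naming convention keyed on the course number, the download names can be assembled programmatically; a sketch in Python, where the `<kind>_<course>.csv` pattern and the file kinds are assumptions rather than the platform's documented scheme:

```python
def live_filenames(course_no: int,
                   kinds=("interactions", "attention", "views")) -> list[str]:
    """Build the per-course .csv names. The '<kind>_<course>.csv' pattern
    is an illustrative assumption, not LIVE's documented naming scheme."""
    return [f"{kind}_{course_no}.csv" for kind in kinds]
```

A script could loop this over a course list to fetch every file in one pass.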
LIVEDATA EVALUATOR
Visualizations allow researchers to effectively analyze students' learning behavior while also enabling them to communicate their findings to an audience. The LIVEData Evaluator was created to generate such diagrams for researchers to utilize. For this purpose, we utilize the data sets from the LIVE platform. This allows us to generate course-specific diagrams that provide valuable insight into the learning behavior of individual courses. Furthermore, it is possible to create composite diagrams. These combinations allow for comparing multiple videos and recorded behavior. These features are designed to provide information about the students' interactions and engagement, aiding in the understanding of the various factors that influence the learning process. The LIVEData Evaluator's user interface is a command-line interface (CLI), which enables researchers to interact with the application through a text-based command prompt, where the command and the desired arguments are entered. This method is simple and can be further improved by using scripts to fill in the blanks. The command structure consists of the following arguments:
1. Mode: The retrieval of data sets for visualization is a crucial aspect of the application that determines the effectiveness of the tool. The user can choose between two modes, the first being the online mode. If this mode is selected, a POST request will be initiated to the LIVE platform in order to obtain the required files. To access the sets, the user must first log in using their LIVE credentials. There is no need to download the required data if it is already available locally. The second mode is the offline mode, which is useful when there is no internet connection available or when researchers want to generate a diagram with dummy data, for example. In this mode, the user must provide all .csv files with the correct names and store them in a designated folder.
2. Diagram type: If the necessary data is available, the diagram type allows the user to generate the desired graphs. Users can also choose to generate all diagrams at once for added convenience. This is particularly beneficial when used in combination with the online mode, as the required files are promptly downloaded from the platform.
3. Course list: The data sets used for further processing are specified in the course list. It is made up of course numbers, separated by commas. The course numbers are determined by the LIVE platform and can be found in the list of records, which can be accessed through the LIVE platform. The Evaluator is programmed to adjust the size of the diagrams according to the number of lectures. This adaptability enables researchers to compare an unlimited number of courses, which is especially useful when comparing data sets from different lectures or all videos of a single lecture.
4. Source folder: This argument specifies the location from which the Evaluator retrieves the downloaded .csv files. When using the online mode, the downloaded .csv files are saved in a predefined location for later processing. If the user chooses the offline mode, the parameter specifies the location to search for the necessary files.
5. Target folder: The target folder argument specifies the location where the generated visualizations will be saved. To prevent the creation of folders in unexpected or incorrect locations, the specified path must lead to an existing directory. If this is not the case, the diagram generation is aborted with an error message.
6. Unique student name: To ensure privacy and avoid bias, the unique student name is an optional parameter that represents the encrypted name of a student. This parameter is used in the "Watched Parts Single-User Diagram," which displays the parts of a video that were watched multiple times by a specific student. Because they focus on the entire list of participants, all other diagrams currently ignore this parameter.
7. LIVE username: Users must enter their credentials for the LIVE platform when using the online mode to successfully request the required .csv files. To gain access to the researcher tab, where the files are downloaded, researchers must first register on the LIVE platform. The user is prompted to enter the login name and password after entering a valid online command into the command-line interface. After authentication, the data download begins.
8. LIVE password: When using the online mode, the user must also enter their password. Users are prompted to enter their login information before proceeding with the file download. To protect the entered password, the input is hidden rather than displayed in clear text.
Figure 2 shows an example of how a command for creating diagrams may look. To enhance the usability of the application, help messages have been added to the commands, providing users with all available options. For this example, the files would be downloaded into the /Source folder. Then, all available diagrams would be generated for courses 10, 22, and 81 before being saved into /Target/Plots.
Fig. 2. The command structure with example
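The eight arguments described above map naturally onto a standard argument parser; a minimal sketch, where all flag names are illustrative rather than the Evaluator's actual CLI:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Sketch of the Evaluator's argument structure; flag names are assumed."""
    p = argparse.ArgumentParser(description="Generate LIVE diagrams (sketch).")
    p.add_argument("--mode", choices=["online", "offline"], required=True,
                   help="download data from LIVE or use local .csv files")
    p.add_argument("--diagram", default="all", help="diagram type, or 'all'")
    p.add_argument("--courses", type=lambda s: [int(c) for c in s.split(",")],
                   help="comma-separated course numbers, e.g. 10,22,81")
    p.add_argument("--source", default="/Source", help="folder with the .csv files")
    p.add_argument("--target", default="/Target/Plots", help="output folder")
    p.add_argument("--student", default=None,
                   help="encrypted name for the single-user diagram")
    p.add_argument("--user", help="LIVE username (online mode)")
    # The password is prompted for interactively with hidden input, not passed as a flag.
    return p
```

Wrapping such a parser in a shell script is what makes the single-command workflow described above scriptable.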
Depending on the command and, more specifically, on the type of diagram, different diagrams are available, including the following:
1. Interactions Diagram: The purpose of this diagram is to provide an overview of the student interactions that occur while they engage with the learning content. This includes video stops, resumes, and play actions.
2. Attention Change Diagram: This graph provides an overview of the average attention of students at specific points in the videos. By marking the planned interactions, researchers can observe their effectiveness.
3. Answer Delay Diagrams: These plots visualize the time it took for students to answer the planned tasks or questionnaires. These answer delays are represented in the form of box and violin plots.
4. Dropout Ratio Diagram: This plot aims to visualize the dropout ratio of different videos. The number of unique viewers and the total number of views for each second are depicted. This can be used to identify the parts of the lectures where most students drop out.
5. Watched Parts Diagram: These diagrams show which students have watched specific sections of the lectures. They also display which passages were viewed multiple times in order to detect comprehension issues.
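As an example of the underlying bookkeeping, the repeatedly watched passages in the Watched Parts Diagram can be recovered by counting per-second coverage of the recorded watch spans; a sketch assuming spans arrive as (start, end) second pairs:

```python
from collections import Counter

def rewatched_seconds(spans, min_views=2):
    """Return the video seconds covered by at least `min_views` watch spans.
    `spans` is a list of (start, end) pairs in seconds, end exclusive."""
    coverage = Counter()
    for start, end in spans:
        for second in range(start, end):
            coverage[second] += 1
    return sorted(s for s, n in coverage.items() if n >= min_views)
```

Consecutive runs in the returned seconds mark the passages a diagram would highlight as possible comprehension issues.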
EVALUATION
The MOOC we provide consists of four lectures, and for each lecture, students can complete a self-assessment survey. Furthermore, to complete a section, students are required to take a final quiz, in which they could reach up to 10 points. Table 1 shows the results of these quizzes. Notably, participants were only able to fill out the assessment once, while the quiz allowed for multiple attempts. Over the duration of the MOOC, students, on average, demonstrated an improvement of 1.54 points between the assessments and the final quizzes. The table also shows that the number of participants decreased from 52 for the first lecture to 30 for the last. The results may indicate whether and to what extent the students have improved, but they do not provide any information about how much of the video the students watched or when they chose to stop. For this, we require additional data sets from the LIVE platform.
Table 1. iMooX assessment and quiz results
As noted previously, we used the data collected from LIVE to create the visualizations. Before we look at those diagrams, we need to address a minor discrepancy between the collected MOOC and LIVE data. As observed in Table 2, the number of students who watched the videos does not match the number of completed quizzes. The reason for this is that students are not required to watch the video in order to take the quiz. The data presented in Table 2 provides information on the number of interactions students had while watching the videos, as well as the percentage of the video watched by the average student. The remaining information provided by the LIVE platform needs to be processed further in order to be useful.
Table 2. LIVE statistics of the OOP course
Therefore, to gain a better understanding of student learning behavior, the col-
lected data is processed and utilized to generate the visualizations described in the
following sections.
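The average improvement quoted above reduces to simple paired arithmetic over Table 1's per-lecture scores; a sketch (the score values below are illustrative, since the table's raw numbers are not reproduced in the text):

```python
def mean_improvement(assessment_scores, quiz_scores):
    """Average per-lecture gain from self-assessment to final quiz,
    given two equal-length lists of paired scores."""
    gains = [quiz - assessment
             for assessment, quiz in zip(assessment_scores, quiz_scores)]
    return sum(gains) / len(gains)
```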
DROPOUT RATIO DIAGRAM
Figure 3 depicts the video dropout ratio for each second of the programming course. The y-axis shows the number of total views and the watcher count, which represents the number of individual users who watched the respective section; hence, each lecture is represented by two lines. For the first lecture, we initially observed strong engagement from 43 unique viewers and, within the first few percent of the video, a total of 72 views. However, the diagram shows a significant decline in views in the 1-15% range, and the number of viewers declined to 32. From the 15% mark onwards, both the user and view counts gradually decrease until the end of the video.
As for Lecture 2, the diagram shows a major drop in viewership compared to Lecture 1, with only 16 students beginning to watch the video. This drop illustrates the disparity between the data gathered from iMooX, as shown in Table 1, and the viewer statistics from LIVE, as shown in Table 2. While only 16 users watched the video, 38 students took the quiz. A closer examination of the remaining portion of the curve reveals that students probably skipped certain sections of the video, as evidenced by the irregular user curve.
For Lecture 3, the user count only decreased slightly, to 14 participants. The watch behavior is similar to that of the previous video. Given that the user count did not decrease further, we assume that the remaining students were highly engaged in the learning content. This assumption is supported by the attention diagram depicted in Figure 7.

The downward trend continues for Lecture 4. The number of viewers peaked at 7 but remained stable throughout the video. The numbers depicted in this diagram indicate an urgent need for improvement and optimization.
Fig. 3. Dropout ratio diagram of the OOP course
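The two curves per lecture described above (total views and unique viewers per second) can be derived from the same recorded watch spans; a sketch assuming per-student (start, end) spans in seconds:

```python
def per_second_counts(student_spans):
    """student_spans: {student_id: [(start, end), ...]}, end exclusive.
    Returns (total_views, unique_viewers) lists indexed by video second."""
    length = max((end for spans in student_spans.values()
                  for _, end in spans), default=0)
    total = [0] * length
    unique = [0] * length
    for spans in student_spans.values():
        seen = set()
        for start, end in spans:
            for second in range(start, end):
                total[second] += 1   # every pass over a second counts
                seen.add(second)
        for second in seen:
            unique[second] += 1      # each student counts once per second
    return total, unique
```

Where the total-views curve rises above the unique-viewers curve, students rewatched; where both fall, students dropped out or skipped ahead.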
ANSWER DELAY DIAGRAMS
Figure 4 depicts the amount of time participants needed to answer the LIVE pop-up questions. Each column represents a lesson from the current course. The number of delays for the multiple-choice questions is indicated by the green components. Blue and red parts represent programming tasks and text questions, which are not used in these videos. We can observe that the fastest 25% of students had an answer delay of at most 16 seconds for Lecture 1, whereas the average delay time was around 23 seconds. Furthermore, 75% of the students completed the quiz in less than 29 seconds. The plot's relatively long whiskers indicate an even distribution of values above the upper quartile, as observed in a different version of the answer delay diagram (see Figure 5). There is also an outlier who took up to 58 seconds to respond to the question.

Lecture 2 includes two multiple-choice questions. We can see that the questions in this lecture were answered more quickly than the questions in the previous video. The fastest 25% completed the interaction in 9 seconds, while the average student took around 11 seconds. Furthermore, 75% of the students answered the questions in less than 15 seconds. Our plot shows outliers taking up to 56 seconds to complete the interaction. The box plot and its whiskers are particularly short, indicating a dense distribution.
Fig. 4. Box plot of the answer delays
The actual distribution is illustrated in Figure 5 using a violin plot. The violin used for this lecture has a thick base.

The answer delays are once again relatively dense in Lecture 3. The question was answered within 8 seconds by the fastest 25% of participants. The average student took approximately 11 seconds to complete the interaction. The upper limit was calculated at only 14 seconds. The box plot's short whiskers indicate a high concentration of values around the 10-second mark, which is corroborated by the corresponding violin plot. A notable outlier has a time of 51 seconds.

A small box can also be found for Lecture 4. 25% of the students answered the questions in 9 seconds or less, while the average student answered in 11 seconds. In comparison to previous lectures, the upper limit is 17 seconds. This component, as indicated by the violin plot, extends beyond the previous one, suggesting that the students took more time to answer the questions related to inheritance. The corresponding violin plot confirms a high concentration of values around the 10-second mark. A notable outlier has a time of 51 seconds.

The answer delay distributions and overall fast answer times for all lectures indicate that students understood the presented content and were engaged, leading to these positive results. Lecture 1 appears to have been the most difficult, followed by the final lecture about inheritance.
Fig. 5. Violin plot of the answer delays
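The quartile figures quoted throughout this section (fastest 25%, average, 75% mark, upper limit) follow standard box-plot arithmetic; a sketch using Python's statistics module and the common 1.5 x IQR whisker rule (the plotting library's exact whisker convention is an assumption):

```python
import statistics

def box_summary(delays):
    """Return (q1, median, q3, upper_whisker_limit) for a list of
    answer delays in seconds, using the 1.5 x IQR whisker rule."""
    q1, median, q3 = statistics.quantiles(delays, n=4)
    return q1, median, q3, q3 + 1.5 * (q3 - q1)
```

Delays beyond the returned whisker limit are the points a box plot would draw as outliers, such as the 51- and 58-second responses noted above.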
INTERACTIONS DIAGRAM
The pop-up interactions are not the only source of information about the students' learning behavior; the basic video interactions also offer valuable insights. Figure 6 depicts the students' interactions with the videos, including stops, plays, and resumes, indicating the frequency at which the videos were fully viewed. As depicted in Figure 6, there is a noticeable decline in the overall interactions with each video, which aligns with the previously mentioned observations of declining viewership. For the first video, the plot depicts 220 pauses and 161 resumes. It shows 81 starts, but only 21 people completed the entire video. The number of completions matches the one shown in the dropout ratio diagram. In Lecture 2, we observed a significant decrease in viewer interactions, with 98 stops, 25 plays starting from second zero, and 78 resumes. The participants watched the video ten times in its entirety. Given the numbers in Table 2, this result was to be expected. In Lecture 3, the decrease in interactivity is significant compared to the decrease in viewers: according to the LIVE data table and the dropout ratio diagram, only two viewers left, while interactivity declined considerably. The trend of decreasing interactions continues in the final Lecture 4, with only 39 interactions in total. These figures show a significant decrease in student engagement in the last two videos, highlighting the critical need for improvement.
Fig. 6. The number of interactions per video
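The stop/play/resume tallies discussed above reduce to a frequency count over the exported event log; a sketch assuming each record carries a lecture number and an action name:

```python
from collections import Counter

def interaction_counts(events):
    """events: iterable of (lecture, action) pairs, e.g. (1, 'stop').
    Returns {lecture: Counter({action: n, ...})}."""
    per_lecture = {}
    for lecture, action in events:
        per_lecture.setdefault(lecture, Counter())[action] += 1
    return per_lecture
```

Plotting one grouped bar per lecture from these counters reproduces the shape of Figure 6.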
ATTENTION CHANGE DIAGRAM
This type of diagram is used to show researchers how attention levels behaved during the lectures. The LIVE platform calculates the attention levels of individual students based on their interactions with pop-ups and the rest of the user interface. As a result, activities such as pauses and resumes have an impact on the values.

According to Figure 7, the average attention level of the students starts at 99% for the first lecture and gradually decreases to 95% within the first three minutes. The diagram illustrates the first automatic pop-up question of the LIVE platform as a vertical red line. The attention value recovers after the interaction, remaining between 96.8% and 95.6% until the next activity at the end of the video, as shown in the figure. The previous diagrams depicted data obtained exclusively from the object-oriented programming (OOP) Python course. For the attention change, we also want to compare the video of Lecture 1 (OOP) with Lecture 2 of the Basics course.
Fig. 7. The change of attention in lecture 1 (OOP)
Figure 8 for the second lecture of the Python Basics course shows a similar pattern.
However, the students’ attention rapidly declined in the rst third of the video, drop-
ping from 95% to 90%. After the interaction, the attention value initially recovered
to 92%, but shortly after, it fell to 87% until the next activity of the video. Notably, this
activity involved a programming task, which is signicantly more interactive than
a multiple-choice question. After completing this task, the attention value steadily
increased and even surpassed the initial attention level in the video, reaching 95.5%.
While both diagrams depict an increase in attention after the interactions, it is
clear that the curve of Lecture 1 is considerably more jagged than the one shown in Figure 8.
The second curve is more continuous because it aggregates more participants,
resulting in a more representative curve.
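The smoothing effect of additional participants is simply that of averaging: a per-second mean over many individual curves damps out single-student spikes. A minimal sketch:

```python
def mean_attention(per_student_curves):
    """Average per-second attention across students; more participants
    smooth out individual spikes, yielding a more continuous curve."""
    n = len(per_student_curves)
    # zip(*...) groups the values of all students at each second
    return [sum(vals) / n for vals in zip(*per_student_curves)]

mean = mean_attention([[99, 97, 95], [95, 96, 97], [91, 95, 99]])
# → [95.0, 96.0, 97.0]
```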
Fig. 8. The change of attention in lecture 2 (Basic)
CONCLUSION
Interactivity has a positive impact on students’ learning experiences and over-
all success. Technology-enhanced learning allows for additional data collection on
how students interact with the presented content. The effectiveness of interactivity
is influenced by various factors, such as the timing of pop-ups and the nature of
the task, whether it is a programming task or a multiple-choice question. To
examine these influences, we implemented and utilized the LIVEData evaluator to
generate diagrams that illustrate the processed data in a manner that researchers can
comprehend.
The objective of our research was to address the question of how to visualize
and analyze students’ learning behavior in interactive learning videos. The tool
currently provides researchers with access to seven different diagrams. These
visualizations can be created with a single command-line interface (CLI)
command. They process the data collected by the LIVE platform. We also provided
instructions on how to interpret the diagrams before demonstrating their use in
various courses.
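As an illustration only, the flags, diagram names, and program name below are assumptions rather than the LIVEData evaluator's documented interface, such a single-command workflow could be sketched with `argparse`:

```python
import argparse

# Hypothetical diagram names; the real tool exposes seven diagram types.
DIAGRAMS = {"interactions", "attention", "dropout"}

def build_parser():
    parser = argparse.ArgumentParser(prog="livedata-eval")
    parser.add_argument("diagram", choices=sorted(DIAGRAMS),
                        help="which visualization to generate")
    parser.add_argument("--input", required=True,
                        help="exported LIVE interaction data (CSV/JSON)")
    parser.add_argument("--out", default="diagram.png",
                        help="output image path")
    return parser

# One CLI invocation selects the diagram type and the data to process:
args = build_parser().parse_args(["attention", "--input", "lecture1.json"])
print(args.diagram, args.out)  # attention diagram.png
```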
We began by analyzing the students’ behavior using data provided by the iMooX
platform, where the Python courses are hosted. We analyzed the participants’
learning efficiency using data from surveys, assessments, and quizzes. A glance at the
point distributions revealed that the number of students was decreasing with each
lecture. Using only the MOOC data, however, did not provide a clear understanding
of the cause. At this point, the visualizations offered valuable insight into the
students’ attention and course retention rates, indicating the root cause of the
declining retention. Notably, despite the high attention levels, we lost students
with each lecture.
The majority of dropouts were recorded during the first lectures. Given the
positive attention values and quick response times for the remaining videos, we
hypothesize that the dropouts were either overwhelmed given their prior knowledge
or disinterested in the content. The diagrams also revealed another pattern for a
different course, which had an excellent retention rate.
In conclusion, the visualization analysis has proven to be a valuable addition to
examining students’ learning behavior. The use of these illustrations simplifies the
comparison of outcomes between different videos and courses. We have answered
the research question through our comprehensive explanation of the
implementation and detailed analysis of the evaluation results.
AUTHORS
Daniel Dohr is studying software engineering and management at the Graz
University of Technology (E-mail: dohr.daniel@gmx.at).
Josef Wachtler is currently working at the Department of Educational
Technology at Graz University of Technology as an EdTech developer. He holds a
PhD in computer science and assists in supervising Master’s and Bachelor’s theses.
His research interests are in the field of video-based learning used in different
settings such as schools, universities, and MOOCs (E-mail: josef.wachtler@tugraz.at).
Martin Ebner is currently Head of the Department of Educational Technology at
Graz University of Technology and therefore responsible for all university-wide
e-learning activities, as well as a senior researcher in educational technology
(E-mail: mebner@gmx.at).