Draft – finally published in: Leitner P., Maier K., Ebner M. (2020) Web Analytics as Extension for a Learning Analytics
Dashboard of a Massive Open Online Platform. In: Ifenthaler D., Gibson D. (eds) Adoption of Data Analytics in Higher
Education Learning and Teaching. Advances in Analytics for Learning and Teaching. Springer, Cham.
https://doi.org/10.1007/978-3-030-47392-1_19
Chapter # - will be assigned by editors
WEB ANALYTICS AS EXTENSION FOR A LEARNING ANALYTICS
DASHBOARD OF A MASSIVE OPEN ONLINE PLATFORM
Philipp Leitner, Graz University of Technology, Graz, Austria, e-mail: philipp.leitner@tugraz.at
Karin Maier, Graz University of Technology, Graz, Austria, e-mail: karin.maier@student.tugraz.at
Martin Ebner, Graz University of Technology, Graz, Austria, email: martin.ebner@tugraz.at
Abstract: Massive Open Online Courses (MOOCs) provide anyone with internet access the chance to study at university
level for free. In such learning environments and due to their ubiquitous nature, learners produce vast amounts of
data representing their learning process. Learning Analytics (LA) can help to identify, quantify, and understand these data traces.
With the implemented web-based tool, called LA Cockpit, we defined basic metrics to capture learners' activity on the Austrian MOOC platform iMooX. Data is aggregated through behavioral and web analysis and paired with state-of-the-art visualization techniques to build an LA dashboard. It should act as a suitable tool to bridge the distant nature of learning in MOOCs. Together with its extensible design, the LA Cockpit shall serve as a future-proof framework to be reused and improved over time.
Aimed at administrators and educators, the dashboard contains interactive widgets that let users explore their datasets themselves rather than being presented with fixed categories. This supports data literacy and improves the understanding of the underlying key figures, thereby helping users generate actionable insights from the data. The web analytics feature of the LA Cockpit captures mouse activity in individual course-wide heat maps to identify regions of learner interest and to help separate structure from content. Activity over time is aggregated in a calendar view, making recurring temporal patterns visible that would otherwise not be deducible.
Through the additional feedback from the LA Cockpit on the learners' behavior within the courses, it becomes easier to improve the teaching and learning process by tailoring the provided content to the needs of the online learning community.
Key words: MOOC, Learning Analytics, Learning Dashboard, Online Learning, Visualization
1. INTRODUCTION
The internet, as a provider of information and educational material, plays a central role in the ubiquitous learning environments we all live in nowadays, drastically changing how learning takes place now and in the future. Identified as the future of education (Billsberry, 2013), Massive Open Online Courses (MOOCs) have attracted a lot of interest in the last decade (McAuley et al., 2010). MOOCs provide anyone with internet access the opportunity to participate in online courses at university level for free. Because of high demand, MOOC platforms have to deal with a large audience of a wide variety of people from all over the world (Romanowski and Konak, 2016). Regardless of the potential of this new learning format, there are also some challenges. While teachers can observe their students in a traditional learning environment and respond appropriately and immediately when needed, they are not able to do so in an online environment, especially with a large number of participants. Therefore, the legitimate step is to observe and analyze learners' data in this new environment to grasp and understand this new way of learning and to improve the underlying process. At this intersection of various academic fields such as education, psychology, and computer science, the term Learning Analytics (LA) was coined (Dawson et al., 2014). The goal of LA is to understand learning itself and the environment in which learning occurs; additionally, it can be seen as an approach to optimize these factors (Verbert et al., 2013).
Although LA is a relatively new research field, one important outcome from previous research
may be that there is no “one size fits all” LA solution (Blikstein, 2013). Therefore, a requirements analysis involving the stakeholders, the university, and the platform helps guarantee successful deployment and valuable results. In their literature review, Leitner, Khalil & Ebner (2017)
categorized the involved stakeholders into learners, teachers and researchers/administrators.
Although learners are the main target group when talking about learning, our dashboard was specifically designed to support teachers and administrators in understanding how learning happens.
To achieve this, however, it is necessary to take a closer look at the activities of learners. The
data records for LA come directly from the Learning Management System (LMS) used, where
information such as the number of downloads or accesses to the system can be generated. Stored
as log files or numbers in a database, this data and its sparse presentation may not be sufficient to
answer the research question of how learners use MOOCs. A chain of processing steps is necessary to obtain a human-interpretable representation; this starts with identifying the traces
left behind by learners, through data aggregation techniques within this learning environment, to
data modeling, and the definition of key figures and metrics (Duval, 2011).
The Web Analytics (WA) plugin presented in this research work performs these steps by
capturing the learner's interactions with the provided resources in selected courses on the
Austrian MOOC platform iMooX, founded in 2013 (Kopp & Ebner, 2015). In addition, suitable
indicators are defined which are to be presented to the interest groups. These are encapsulated as
widgets and integrated into a LA dashboard named LA Cockpit.
This approach provides the opportunity for a sophisticated view of how learners interact with the learning material offered in MOOCs. Through behavioral analysis and associated metrics, combined with the educators' experiences from face-to-face teaching, the dashboard supports teachers in deciding where to act and how to improve the learning process in general.
Taking into account guidelines and best practices from our previous research (Maier, Leitner & Ebner, 2019), we have extended our framework to include web analytics. Our overall goal for the LA Cockpit is to close the information gap that teachers in MOOCs have compared to real classroom learning situations, and to examine what can be derived from recorded activity traces in online learning environments. For integration into the existing framework, a subset of possible metrics was designed, for which the plugin explores appropriate visualization tools for the engagement and behavior of the captured learners. Further, the plugin aims to provide means of evaluation to improve the quality of the offered MOOCs by adapting the content and presenting it to the MOOC community.
Therefore, our main research question asks how such an LA dashboard has to look in order to assist teachers and other educators in understanding the learning process of their learners and thereby improve teaching and learning within a MOOC platform.
2. RELATED WORK
The reuse of established tools from various research fields for educational data traces has increased in recent years. These visualization strategies use charts, graphs, or maps as presentation technologies for digital dashboards (Elias, 2011), and they have been successfully adapted to educational data (Jivet, 2016; Charleer et al., 2017). These learning dashboards have proven to be effective tools in aiding teachers and learners in the context of the learning process.
Based on the findings and earlier studies on the design of learning dashboards (Leitner & Ebner,
2017; Khalil, Taraghi & Ebner, 2016), three recurring ideas can be worked out:
1. Relevant metrics: A crucial step is finding suitable metrics for the target group. If this is not done properly, the tool becomes overloaded or the user is discouraged, rendering the LA tool useless.
2. Visuals: To make complex coherences understandable and visible to users, it is essential to use appropriate colors for the different visualization types. It is therefore necessary to apply basic principles of interface design and to ensure that aggregated data is not falsified.
3. Interactivity: Different views or filter options increase the usefulness of the tool for its users and appeal to their curiosity. Interactivity is preferred over plain static numbers.
Furthermore, a suitable system has to be found which also supports the requirements of online use. Learning Management Systems (LMS) are a good choice because they offer administration and serve as providers of online resources. The market offers various paid as well as cost-free alternatives, with the option of self-hosting on the university's own infrastructure or as a Software as a Service (SaaS) cloud instance.
As part of the cost-free alternatives, Open Source implementations are particularly attractive.
Three very popular products compete with each other: Moodle, Open edX and Canvas. All three provide basic logging capabilities and visualization options. However, teachers and administrators have to work with log files for more specific metrics and key figures if they want
to go deeper into the data. For the product providers, it takes considerable additional effort to provide functions that go beyond basic statistical figures and reports. Therefore, the ideas and concepts of LA are only slowly finding their way into the software. For example, Moodle offers various dashboard plugins with LA capabilities. Unfortunately, some integrate connections to external servers or promote additional paid content.
Other plugins and projects, such as Analytics Graphs for instructors, are not applicable to a broader range of possible users. They lack analytical features and customization options, or cover only specific use cases in their implementation.
All these dashboards work with data produced by learners, so additional constraints for collecting, storing and transferring this personal data apply. The increasing volume of data, often a by-product of online interactions, has brought new perspectives on privacy and ownership. The ownership of data has become a hot topic in recent years: individuals started to claim the “right to be forgotten” (Elias, 2011), and people started to question the “almighty” algorithms over bias and validity. This rising awareness made it necessary to think about potential risks and benefits, which has recently been legally manifested in the General Data Protection Regulation (GDPR).
The results are publications about guidelines, best practices and good working examples. It is necessary to think about inconvenient questions regarding privacy and ethical usage before applying algorithms and tools to the data. This can be done by agreeing on an ethical framework or checklist, such as the one by Greller and Drachsler (2016), when dealing with learners' data. Further, Khalil and Ebner (2016a) dealt with the challenges LA is facing and also pointed out the possibility of de-identification of learners' data (Khalil & Ebner, 2016b).
If a researcher wants to use LA, the rights of the data subjects must be considered. Openness about intentions, a clear distinction of which data shall be collected for which purpose, and storage and access rights backed by state-of-the-art software and security standards are some points to think about. They need to be discussed with all stakeholders. Further training of academic staff is needed to ensure that all standards are met. Moreover, despite the promises and benefits of LA, it is necessary to discuss the criticisms of LA metrics, such as the loss of control over data traces. To mitigate the risks and find a compromise between benefits and drawbacks, the DELICATE checklist can be used (Drachsler & Greller, 2016). As a consequence, dashboards should not only comply with the minimal requirements given by law or by agreements with the institution; it is also necessary to think about the consequences of displaying metrics, classifications and visualizations from the early stages of the design phase.
WA is used to obtain key information about the behavior of users on websites. Rohloff et al. (2019) discussed the possibility of using WA without compromising the learners' data privacy. In their test setting, they integrated Google Analytics into a MOOC as a proof of concept. The study showed that WA can provide useful insights and retrieve a large share of the metrics relevant to LA for the stakeholders. Key performance indicators (KPIs) in particular are easier to obtain from WA tools than, for example, learner-specific metrics, because WA is not designed to retrieve user-level information or to provide LA data to individual students (Rohloff et al., 2019).
3. CONCEPT OF THE LA COCKPIT
The first version of the LA Cockpit was completed at the end of 2017 and entered the evaluation
and test phase in an academic test environment the following year (Maier, Leitner & Ebner,
2017). Designed as a plugin for the LMS Moodle, the initial concept included requirements such as simple maintenance and a modular, configurable system design. The target group consisted only of administrators of the LMS. The focus was on demonstrating that Learning Analytics plugins can be built with open-source resources and serve as a basic source repository for quick and easy extension.
The LMS Moodle already collects basic statistics about the system as well as data on interactions between participants and the learning objects, and stores this information in the database. The first version of the LA Cockpit used these database tables to group and aggregate on a daily basis. These basic metrics were encapsulated and visualized through widgets, which the MOOC administrator could add or delete. Besides presenting different visualization methods, this should also serve as a starting point for other key figures. The metrics showed system-wide key figures from the LMS. We interviewed the stakeholders and together created a list of feature requests, which was decisive for the revision of the dashboard.
The new and extended version of the LA Cockpit is based on these existing daily aggregation mechanisms. Furthermore, it is improved by additional data traces from outside the learning management system. Capturing interaction within the learner's browser environment enriches the data available in the LMS and provides opportunities for new metrics and widgets. The next section describes the basics of behavioral analysis and activity measurement. Further design changes, improvements to the LA Cockpit, and a new feature, later called the Web Analytics (WA) plugin, are discussed in the following sections.
3.1 Activity Measurement
When designing learning analytics tools, the focus lies on how the content is going to be displayed and which key figures should be provided. In an environment of many online learners spread across different courses, the contact and interaction between teachers and learners is fundamentally different from the traditional face-to-face environment. Teachers often receive feedback from the learners only through their final grade or explicitly requested responses.
For additional, implicit feedback on the interaction process, the teacher must be able to rely on features of the LMS, but simple system statistics do not sufficiently reflect the actions of learners. The LA Cockpit should re-enact this cognitive connection between teachers and learners and allow teachers to use their pedagogical knowledge to work with the displayed information.
In order to measure different activities, a closer look must be taken at the LMS itself. As data sources, one can access the LMS resources in the form of database tables, logged events and records of technical processes such as download counts. This is often not enough to capture the manifold ways of learning, so a number of research studies (Blikstein, 2013; Spikol et al., 2017) try to make use of additional information (or multimodal data) such as speech, writing or non-verbal interaction (e.g. movements, gestures, facial expressions, gaze, biometrics) during real learning activities to enlarge the data traces from which metrics can be created. The internal state of the learning process is quantified by capturing its external representation.
The focus of the LA Cockpit is on aggregated metrics rather than individual, single and absolute values. The main goal is to create context for teachers without classifications or complex predictive modelling. With the help of the Web Analytics plugin, the LA Cockpit aims to visualize, not only quantify, activities within a course in an aggregated, mirrored view from the
user’s perspective. Insights into what the learners are doing within a MOOC and what resources
they are accessing can provide a starting point for further research.
The next section discusses the details of the behavioral analysis approach with the Web Analytics plugin. The basic technical background of the LA Cockpit and the building blocks of the technology within its environment are discussed in Section 4.
3.2 Web Analytics
The LA Cockpit provides means of measuring, identifying and visualizing the behavior of MOOC participants with an additional plugin built within this research study. Applying Learning Analytics only with the resources from the Moodle system is not enough, especially in the context of MOOCs. In the face-to-face classroom situation, teachers can observe, infer and act upon the learners' behavior. The following questions arise:
• Do students struggle to find a certain resource?
• Do they need more time than expected?
• Do they answer their quizzes by going back and forth between video lectures and quiz
questions?
In the online environment of MOOCs, the providing platform has no means of timely analysis of client-side interaction. Therefore, the Web Analytics plugin tries to capture the interaction within the browser window, aggregating it and offering an additional data source from the learner's perspective.
Figure 1 provides an overview of the involved resources. The aggregation of data does not
happen user-wise, but resource-wise beginning with the clients’ browsers.
[Add Fig. 1 here]
Figure 1: Resource Interaction of the LA Cockpit
Within Moodle, multiple pages, distinguishable via their URLs, represent one specific online learning course. Each URL is considered a single resource. This means that for each accessed course page, the activities of the learners are logged and aggregated daily into the LA Cockpit database.
When the web browser is used to access the learning resources, interaction can happen via different types of input devices: mouse, keyboard, touch or speech input, whereas mouse and keyboard are considered the standard input devices. On mobile devices, the mouse is replaced by touch input and a physical keyboard is simulated by a virtual one. The way these input devices are used can relate to our cognitive processes and also depends on the presentation of the content.
From a technical point of view, behavior can be categorized into different events happening within the system. These need to be interpreted by the browser to react accordingly, e.g. a click on a button opens a pop-up. The triggered event gets forwarded and processed by the browser, where the WA plugin aggregates the different types of events. The following events are aggregated with their timestamp attached:
• Mouse Movement: Aggregation of the changing x and y coordinates of the mouse pointer.
• Click: A click event (pressing of a button followed by a release) as well as the target resource upon which the event happened (e.g. button, link).
• Key Event: The timestamp of the key event and whether any special keys were pressed (Shift, Alt, Control).
• Scroll Depth: The scroll depth is saved at a regular interval. It refers to the calculated percentage of the page the user has scrolled to, where the top of the webpage is considered 0% and the bottom 100%.
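As a rough illustration of this capture step, the aggregation behind these event types can be sketched in plain JavaScript. The names used here (recordEvent, buffer, scrollDepthPercent) are illustrative and not taken from the actual WA plugin source; the commented lines show how the handlers would be wired up in a browser.

```javascript
// Sketch of client-side event aggregation. Each handler normalizes a
// browser event into a flat log entry with a timestamp attached,
// mirroring the four event types listed above.

const buffer = [];

function recordEvent(type, payload) {
  // Unix timestamp in milliseconds plus the event-specific payload.
  buffer.push({ type, timestamp: Date.now(), ...payload });
}

// Scroll depth as a percentage: 0% at the top, 100% at the bottom.
function scrollDepthPercent(scrollY, viewportHeight, documentHeight) {
  const scrollable = documentHeight - viewportHeight;
  if (scrollable <= 0) return 100; // page fits into a single viewport
  return Math.min(100, Math.round((scrollY / scrollable) * 100));
}

// In a browser, the handlers would be attached roughly like this:
// document.addEventListener('mousemove', e =>
//   recordEvent('move', { x: e.pageX, y: e.pageY }));
// document.addEventListener('click', e =>
//   recordEvent('click', { x: e.pageX, y: e.pageY, target: e.target.tagName }));
// document.addEventListener('keydown', e =>
//   recordEvent('key', { shift: e.shiftKey, alt: e.altKey, ctrl: e.ctrlKey }));
// setInterval(() => recordEvent('scroll', {
//   depth: scrollDepthPercent(window.scrollY, window.innerHeight,
//                             document.documentElement.scrollHeight) }), 1000);
```

The buffered entries would then be sent to the server in batches for the daily, resource-wise aggregation described above.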
The main data provider is mouse movements. A mouse movement can be defined as a continuous event sampled at consecutive points in time with corresponding x and y coordinates, creating a discrete data trace over time. All visually guided movements (e.g. selecting, pointing, clicking) are formed through gestures with the mouse device.
In the WA plugin's database, a mouse movement is described by consecutive logged entries. The database field id refers to a consecutive log number, and timestamp is the Unix timestamp of the triggered event, whereas event time is the JavaScript-generated timestamp. The latter is counted from zero, defined as the creation of the web document, and is reset when the web document is reloaded. The WA plugin saves both for redundancy reasons. The position of the mouse event is given by its values x-pos and y-pos, in a coordinate system whose origin (0, 0) is the top left corner of the web document. For the metrics, the number of database entries grouped by these coordinates is used to generate the heatmap value.
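The grouping step can be sketched as follows. The row shape { xPos, yPos } and the function name are illustrative stand-ins, not the plugin's actual schema: entries sharing the same coordinates are counted to produce the heatmap value for that position.

```javascript
// Sketch of the server-side grouping: count logged mouse entries per
// (x, y) coordinate pair to obtain the heatmap values.

function heatmapValues(rows) {
  const counts = new Map();
  for (const { xPos, yPos } of rows) {
    const key = `${xPos},${yPos}`;
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  // Return as [{ x, y, value }] points ready for visualization.
  return [...counts.entries()].map(([key, value]) => {
    const [x, y] = key.split(',').map(Number);
    return { x, y, value };
  });
}
```

In practice this aggregation would run per course page, so each URL gets its own set of heatmap points.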
There are many different ways to use and consume web content; mouse movement analysis can provide the necessary data for the WA plugin's goal of identifying overlapping regions of interest. Especially since this data results from real-life situations and not from a controlled lab setting, previous research results about correlations cannot be transferred directly.
3.3 Metrics and Visualization
The aggregated data from the WA plugin still needs some further refinements before becoming
usable within the widgets of the LA Cockpit. It might not be necessary or helpful to display all
raw data in every detail. The target group should get widgets which are easy to interpret and to
understand. Therefore, meaningful subsets of the information have been agreed upon and the
activity data will be represented with three new additional widgets: Device Statistics, Activity
Calendar and Heatmap.
These should provide the teacher with a starting point for discussions on how students interact
with learning resources. The web analysis function is intended to enable researchers to create
additional analytical functions of the LA Cockpit that are related to this behavioral analysis data
set. Each metric provides a different perspective on the aggregated interactions; the foundations of the metrics and their visualizations are explained below.
4. IMPLEMENTATION
4.1 Device Statistics
In the evolution of the web, the internet began with text-based systems in which users navigated by entering commands. Nowadays, browsers perform this task for the user. When accessing resources, the browser acts as an agent and turns the user's actions into commands. When loading resources, the browser on the client side identifies itself to the server with the User-Agent string. In HTTP, the User-Agent string is used for content negotiation. Its format is a list of product tokens/keywords, with the most important listed first. The HTTP header of the request also specifies which languages the client can understand and which language is preferred, reflecting the language set in the browser user interface.
For the WA plugin, this information is used within the Device Statistics widget (figure 2). The
character string of the browser user agent is stored the first time the user accesses the course.
Afterwards, the server analyzes all available data sets and returns sorted subsets to the dashboard
for visualization.
[Add Fig. 2 here]
Figure 2: Device Statistics Widget
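The counting and sorting behind the Device Statistics widget can be sketched as below. parseUserAgent is a deliberately simplified, hypothetical stand-in; a production dashboard would use a full user-agent parsing library rather than these few patterns.

```javascript
// Sketch of the Device Statistics aggregation: classify stored
// User-Agent strings and return subsets sorted by frequency, as
// delivered to the dashboard for visualization.

function parseUserAgent(ua) {
  // Order matters: Chrome's UA string also contains "Safari/".
  if (/Firefox\//.test(ua)) return 'Firefox';
  if (/Edg\//.test(ua)) return 'Edge';
  if (/Chrome\//.test(ua)) return 'Chrome';
  if (/Safari\//.test(ua)) return 'Safari';
  return 'Other';
}

function browserDistribution(userAgents) {
  const counts = {};
  for (const ua of userAgents) {
    const browser = parseUserAgent(ua);
    counts[browser] = (counts[browser] || 0) + 1;
  }
  // Most frequent browser first.
  return Object.entries(counts)
    .map(([browser, count]) => ({ browser, count }))
    .sort((a, b) => b.count - a.count);
}
```

The same pattern applies to operating systems and language settings, which the widget reports alongside the browser distribution.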
4.2 Activity Calendar
For the Activity Calendar widget (figure 3), a basic aggregation of mouse events is performed. The count of activities for each day is calculated directly from the data traces in the database. All available events are stored on a daily basis, as the calendar provides a daily overview. The year view, calculated from the current date, gives an overview of past activities within the last twelve months. The metric is aggregated on the server side according to the request sent after selecting a course to display the data.
[Add Fig. 3 here]
Figure 3: Activity Calendar Widget
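The daily aggregation feeding this calendar can be sketched as follows, assuming logged events carry a Unix timestamp in milliseconds; the function name is illustrative, not the plugin's own.

```javascript
// Sketch of the Activity Calendar aggregation: truncate each event
// timestamp to its UTC calendar day and count events per day.

function dailyActivityCounts(events) {
  const counts = {};
  for (const { timestamp } of events) {
    // ISO string prefix gives the day as YYYY-MM-DD.
    const day = new Date(timestamp).toISOString().slice(0, 10);
    counts[day] = (counts[day] || 0) + 1;
  }
  return counts;
}
```

The resulting day-to-count map can be rendered directly as the cells of the calendar view.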
4.3 Heatmap
The Heatmap widget, shown in figure 4, consists of two main parts. The activity data itself, visualized as traces ranging between red and green, is aggregated by the WA plugin from the user's mouse activity. Since these widgets have to visualize complex relationships, the LA Cockpit uses state-of-the-art technology such as the D3.js framework. Without the displayed course URL, these mouse traces would be difficult to interpret. Therefore, it is necessary to provide the information layer presented to the user. The background image of the widget puts the captured data into the correct position; an elaborate process creates the screenshots for this purpose. The widget itself then presents to the target user only the list of URLs for which background images are available.
[Add Fig. 4 here]
Figure 4: Heatmap Widget
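The red-to-green coloring of the traces can be illustrated with a minimal count-to-color mapping. In the LA Cockpit this role is played by a D3.js color scale; the endpoints and the normalization below are illustrative only.

```javascript
// Minimal sketch of mapping an aggregated activity count to a color:
// low activity renders green, high activity red.

function heatColor(value, maxValue) {
  // Normalize to [0, 1] and clamp out-of-range values.
  const t = Math.max(0, Math.min(1, value / maxValue));
  const red = Math.round(255 * t);
  const green = Math.round(255 * (1 - t));
  return `rgb(${red},${green},0)`;
}
```

Each heatmap point from the aggregation step would be drawn over the background screenshot using the color returned for its count.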
5. DISCUSSION
The revised and extended LA Cockpit for the Austrian MOOC platform iMooX was deployed for the first time during the MOOC “LawBusters – Drei Themen Recht.humorvoll”. The course started at the end of December 2018 and featured three weeks of video lectures, which dealt with law-related basics in an entertaining way through analogies from science fiction and fantasy. Nearly 90 users took part in the course; our goal was to provide a proof of concept for the LA Cockpit as well as to get feedback from the participants.
5.1 First Evaluation results
Feedback on the LA Cockpit was collected via online surveys and was generally satisfying. In particular, the basic concept of multiple dashboards proved beneficial when grouping different widgets to the individual liking of the target user, for example on a per-course basis. Yet, when managing a larger number of courses, the name alone turned out to be insufficient as a distinguishing feature. Additional details such as the date of creation could help users find the desired dashboard faster and so improve satisfaction with the tool.
We received similarly positive feedback on the PDF report. It seems that despite the digital era, some teachers prefer paper over digital reports, or at least the possibility to print them out. An additional motivation might be the ability to share course-related information without the elaborate access and authorization process of the LA Cockpit. Also, the information in the note section was well received and is considered to improve the understanding of the different widgets. The possibility of adapting those texts to document one's own findings and observations proved useful. Users adapted the text even for basic information related to the displayed metrics or visualizations. In our case with the LawBusters course, the Christmas period left a distinct decline in activity on the
platform, shown in figure 5. Even such general remarks can find their place in the note area and
be used as documentation for later comparisons.
[Add Fig 5 here]
Figure 5: Login over Time Widget with notes
The user behavior captured by the WA plugin provides a large amount of information. At the moment, only a subset of this information is visualized in our metrics. The target group of researchers and administrators suggested that further details about the in-course user experience should be visualized. One particular example was keyboard usage in the context of the forum, which could give information about search bar usage to access specific learning materials. Nonetheless, the provided metrics were considered helpful by the target group and gave interesting insights into the underlying data.
For the user group of administrators, the distribution of the operating systems and internet browsers used was relevant, as it supports testing and optimizing the platform. Teachers were more interested in the specific language settings of the participants. In our case, two thirds of the participants accessed our system with German language settings, whereas the remaining third exclusively used English with American locales as their primary browser setting.
5.2 Limitations
The evaluation of the LA Cockpit also showed some limitations. Besides keyboard and mouse, the input devices used on the system were notably touch-based, which, depending on the device type, such as mobile phone or tablet, needs to be handled differently. Web applications may either process touch-based inputs through touch events or receive them as interpreted mouse events. Our WA plugin was designed to collect and process only mouse and mouse-interpreted events, and thus touch-only events were not or only partially recorded. In addition, it is a technically complex task to completely cover the dependencies on the various combinations of operating systems and internet browsers, including different versions.
A second limitation concerns the sampling of mouse events. User interaction takes place continuously in real time, with each logged event represented by a discrete timestamp. The data stream generated by the tool has a sampling rate that is influenced by various factors, first of all the input device itself. A computer mouse has a polling rate, measured in Hertz (Hz), and a corresponding polling interval, which define how often the position is reported to the computer. For example, a common polling rate of 125 Hz means that the mouse position is sent to the computer every 8 milliseconds. In addition, the discretely sampled representation of the mouse movement that triggered the event within the web content could add another layer of inaccuracy. The mouse event is intercepted through JavaScript in the WA plugin, where microsecond-resolution event times would theoretically be technically feasible. However, in order to mitigate current security threats such as Spectre (Kocher et al., 2019), browsers round the results of queried timestamps to varying degrees. Thus, exact profiling of users is not feasible. However, the assumption that the entire data trace of a mouse movement is sampled at a lower rate has no negative effect on data aggregation and the visualization of behavioral analysis in the LA Cockpit.
A third limitation relates to the fact that there are different ways, depending on different cognitive processes and personal characteristics, to achieve the same goal, such as downloading a learning resource or accessing a video lecture. For example, while browsing the webpage, the mouse
pointer of one user could rest like an anchor at the top of a paragraph, while another user might highlight the text passage while reading in order to copy it later. Using keyboard shortcuts to scroll or navigate through the web page would likewise leave the mouse at a position that does not correlate with the center of the user's visual attention.
Several studies in the areas of mouse tracking, mouse movements and behavioral analysis have
shown that the mouse pointer can act as a weak proxy of the gaze (Arapakis, Lalmas and
Valkanas, 2014) and offers a cost-effective alternative to eye tracking (Huang, White and
Buscher, 2012). The strength of the correlation depends on the design of the website (Clark and
Stephane, 2018). It is therefore important to note that mouse motion analysis is not a full substitute for eye-tracking studies. Eye tracking, however, requires much more expensive equipment and a predefined laboratory setting; these environmental requirements are not transferable to the target application of the LA Cockpit. Nevertheless, mouse activity provides a
suitable data source for checking the design of web pages and evaluating user activity in specific
areas. For the WA plugin, these data traces are visualized within metrics to provide insight into
remote processes that would otherwise not be observable.
6. CONCLUSION
The LA Cockpit, a custom LA dashboard, was revised, extended with WA, and deployed on the Austrian MOOC platform iMooX. It combines the collection, transformation and visualization of data produced in the learning environment. In addition, the design and implementation of the LA Cockpit focused on a modular framework and thereby on extendibility and maintainability. Because of the complexity of LA approaches, the LA Cockpit comprises a number of tools and offers a highly modular dashboard that can be adapted to the different needs of the target groups.
The LA Cockpit contains basic key figures related to activities in the LMS itself. Through the extension of the WA plugin, it is possible to analyze the behavior of the participants and thereby let the target group draw inferences about the way learners interact with the course and its materials. To this end, the WA plugin uses a set of metrics to capture user activity. This behavioral analysis is done by aggregating different traces of the user in the browser. Three widgets aggregating a series of events and actions are designed to visualize metrics in the dashboard. The Device Statistics widget provides statistical information about the devices used, browser versions, and language settings. The Activity Calendar widget adds a temporal dimension to the data: displayed as a daily calendar for the last 12 months, differently colored fields indicate the activity level. With this view it is possible to uncover recurring patterns of user contributions in online learning courses that may have gone unnoticed until now. The Heatmap widget uses the traditional concept of mouse activity heat maps, which display areas of mouse movement in different colors, from low-activity areas in cold colors such as blue to high-activity areas in red. Such a widget is often used as a tool to check the usability of websites in terms of their design, and it provides another dimension of information: the most commonly used resources and regions of high interest can be inspected visually, giving teachers and administrators a quick overview.
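The aggregation behind such a heatmap can be approximated by binning mouse coordinates into a coarse grid and counting hits per cell, where higher counts map to hotter colors. The cell size and function names below are assumptions for illustration, not the WA plugin's actual implementation:

```javascript
// Illustrative sketch of heatmap aggregation: bin mouse coordinates
// into a grid of fixed-size cells and count events per cell.
const CELL_SIZE = 40; // pixels per grid cell (assumed value)

function aggregateHeatmap(points, width, height) {
  const cols = Math.ceil(width / CELL_SIZE);
  const rows = Math.ceil(height / CELL_SIZE);
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of points) {
    // Clamp to the last cell so points on the border are not lost.
    const col = Math.min(cols - 1, Math.floor(x / CELL_SIZE));
    const row = Math.min(rows - 1, Math.floor(y / CELL_SIZE));
    grid[row][col] += 1; // higher counts render as "hotter" colors
  }
  return grid;
}
```

A renderer would then map each cell count onto a blue-to-red color scale before overlaying the grid on the page.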
All visualizations within the tool follow the core guidelines of dynamic and interactive presentation. This concept lets users explore the data themselves through the visualizations instead of being presented with indicators that are difficult to understand and interpret. This exploration phase is crucial to enable users to recognize relationships that would otherwise remain hidden in large datasets or blurred by aggregate averages. All metrics aggregate data course-wide, focusing on the learning process rather than on an individual's learning track. The resulting additional privacy does not reduce the quality of the information provided by the tool.
In order to better understand how the LA Cockpit is used and to include the opinions of the stakeholders, an evaluation of the usability and usefulness of the LA Cockpit was carried out. In particular, the target group of researchers and administrators provided valuable suggestions and further ideas for improving the LA Cockpit. These feature requests focused on the metrics and the data rather than on the widgets themselves. Nevertheless, there were common interests, such as adding touch compatibility, including video analysis and metrics, exploring the aggregated data with knowledge discovery methods, and conducting a thorough evaluation.
Furthermore, there is a great need for a comprehensive evaluation of the LA Cockpit with a sufficiently large group of test users. The next step in the development cycle would be to gather feedback from the target group not only on what information they would like to receive, but also on how to present it in a useful way. The tool should be evaluated in terms of interface design, usability, user-friendliness, and the content displayed. This may be supported by technical improvements, where adding new widgets will be a quick next step toward future use of the LA Cockpit.
With the LA Cockpit and the WA Plugin a suitable framework for Learning Analytics was
created. Such tools are essential to close the information gap between learners and teachers in
pure online courses. Further research on this topic will prove to be advantageous as the results
can be transferred to general e-learning environments that are gaining importance.
Developers are encouraged to add more metrics, expand the widget repertoire, or even transfer the LA Cockpit to other target groups, e.g. by shifting the metrics to the learners. A student-oriented dashboard version of the LA Cockpit could be created by reusing the core components for aggregation and visualization. Finally, the tool provides quantifiable insight into learners' behavior and learning processes within MOOCs on the iMooX platform.
Further research and development can improve the widgets and add more features to the LA Cockpit. Useful additions can be derived directly from the evaluations of the courses. Video lectures and quizzes motivate key figures such as "number of video lessons seen" or "minutes consumed by video lectures". Since video lectures are a core component in the transmission of learning content, a promising next step is the analysis of video consumption. This could extend the behavioral analysis of the WA plugin and provide the opportunity to gain further insight into the consumption of video content within MOOCs. After an in-depth evaluation of the interface design, customizing the visualization of the existing metrics and providing alternative visualization types offers the opportunity to increase usability and satisfaction.
The more features and options an LA tool offers, the more important it is to provide a clear explanation of the displayed data, key figures and visualizations. A Frequently Asked Questions (FAQ) section, which includes background information about LA, metric calculations, and design decisions for the widget visualizations, could be useful for the target user groups.
Future research and improvements of the LA Cockpit should not only help learners in their learning process and close the feedback loop, but also close the gap between learning, teaching and research. As an actively used tool on platforms such as iMooX, it offers the opportunity to gain qualitative insights into the application of LA among its target groups. The research community benefits from the LA Cockpit because most tools never leave the prototype stage. With these next possible steps, we aim to improve the feature set, but even more to re-establish the information channel between learners and teachers in online learning environments. Thereby, teachers get the opportunity to support and improve the learning process with current technologies and to educate a larger group of learners than the traditional classroom environment would allow. This approach reflects the fundamental objective of LA: to improve all parts of the LA lifecycle (Khalil and Ebner, 2017) and let all stakeholders benefit from its application.
REFERENCES
Andres, J. M. L., Baker, R. S., Gašević, D., Siemens, G., Crossley, S. A., & Joksimović, S.
(2018). Studying MOOC completion at scale using the MOOC replication framework. In:
Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 71-
78).
Arapakis, I., Lalmas, M., & Valkanas, G. (2014). Understanding Within-Content Engagement
Through Pattern Analysis of Mouse Gestures. In: Proceedings of the 23rd ACM International
Conference on Conference on Information and Knowledge Management. CIKM ’14. (pp. 1439-
1448).
Blikstein, P. (2013). Multimodal learning analytics. In Proceedings of the third international
conference on learning analytics and knowledge. LAK´13 (pp. 102-106).
Billsberry, J. (2013). MOOCs: fad or revolution? In: Journal of Management Education 37 (pp. 739-746).
Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2017). Learning analytics dashboards to support adviser-student dialogue. IEEE Transactions on Learning Technologies (pp. 389-399).
Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014). Current state and future trends:
A citation network analysis of the learning analytics field. In Proceedings of the fourth
international conference on learning analytics and knowledge (pp. 231-240).
Drachsler, H., & Greller, W. (2016). Privacy and analytics: it's a DELICATE issue a checklist for
trusted learning analytics. In: Proceedings of the sixth international conference on learning
analytics & knowledge (pp. 89-98).
Duval, E. (2011). “Attention Please!: Learning Analytics for Visualization and
Recommendation.” In: Proceedings of the 1st International Conference on Learning Analytics
and Knowledge. (pp. 9-17).
Elias, T. (2011). Learning Analytics: The Definitions, the Processes, and the Potential.
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for
learning analytics. In: Educational Technology & Society 15 (pp. 42-57).
Huang, J., White, R., & Buscher, G. (2012). User See, User Point: Gaze and Cursor Alignment in
Web Search. In: Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems. CHI’12. (pp. 1341-1350).
Clark, J. W., & Stephane, A. L. (2018). Affordable Eye Tracking for Informed Web Design. In: Design, User Experience, and Usability: Theory and Practice (pp. 346-355). Cham: Springer International Publishing.
Jivet, I. (2016). The Learning Tracker A Learner Dashboard that Encourages Self-regulation in
MOOC Learners.
Khalil, M., & Ebner, M. (2015). Learning Analytics: Principles and Constraints. In: Proceedings of ED-Media 2015 conference.
Khalil, M., & Ebner, M. (2016a). When learning analytics meets MOOCs-a review on iMooX
case studies. In: International Conference on Innovations for Community Services (pp. 3-19).
Khalil, M. & Ebner, M. (2016b) De-Identification in Learning Analytics. Journal of Learning
Analytics. 3(1). pp. 129 - 138
Khalil, M., Taraghi, B., & Ebner, M. (2016). Engaging Learning Analytics in MOOCS: the good,
the bad, and the ugly.
Kocher, P., Genkin, D., Gruss, D., Haas, W., Hamburg, M., Lipp, M., Mangard, S., Prescher, T., Schwarz, M., & Yarom, Y. (2019). Spectre attacks: Exploiting speculative execution. In: 40th IEEE Symposium on Security and Privacy S&P'19.
Kopp, M., & Ebner, M. (2015). iMooX - Publikationen rund um das Pionierprojekt. Verlag Mayer, Weinitzen.
Leitner, P., & Ebner, M. (2017). Development of a dashboard for Learning Analytics in Higher
Education. In: International Conference on Learning and Collaboration Technologies (pp. 293-
301).
Leitner, P., Khalil, M., & Ebner, M. (2017). “Learning Analytics in Higher Education - A
Literature Review.” In: Learning Analytics: Fundaments, Applications, and Trends.
McAuley, A, Stewart, B, Siemens, G. & Cormier, D. (2010) Massive Open Online Courses
Digital ways of knowing and learning, The MOOC model For Digital Practice. Retrieved from:
http://davecormier.com/edblog/wp-content/uploads/MOOC_Final.pdf (last access October 2019)
Maier, K., Leitner, P., & Ebner, M. (2019). “Learning Analytics Cockpit for MOOC Platforms.”
In: Emerging Trends in Learning Analytics.
Rohloff, T., Oldag, S., Renz, J., & Meinel, C. (2019). Utilizing web analytics in the context of
learning analytics for large-scale online learning. In: 2019 IEEE Global Engineering Education
Conference EDUCON (pp. 296-305).
Romanowski, B., & Konak, A. (2016). Using Google Analytics to Improve the Course Website of a Database Course. https://www.hofstra.edu/pdf/academics/colleges/seas/asee-fall-2016/asee-midatlantic-f2016-konak.pdf (last accessed October 4, 2019).
Spikol, D., Prieto, L. P., Rodríguez-Triana, M. J., Worsley, M., Ochoa, X., Cukurova, M. (2017).
Current and future multimodal learning analytics data challenges. In Proceedings of the Seventh
International Learning Analytics & Knowledge Conference (pp. 518-519).
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics
dashboard applications. American Behavioral Scientist, 57(10), (pp. 1500-1509).