Draft. Finally published in: Leitner P., Maier K., Ebner M. (2020) Web Analytics as Extension for a Learning Analytics
Dashboard of a Massive Open Online Platform. In: Ifenthaler D., Gibson D. (eds) Adoption of Data Analytics in Higher
Education Learning and Teaching. Advances in Analytics for Learning and Teaching. Springer, Cham.
https://doi.org/10.1007/978-3-030-47392-1_19
Chapter
# - will be assigned by editors
WEB ANALYTICS AS EXTENSION FOR A LEARNING ANALYTICS
DASHBOARD OF A MASSIVE OPEN ONLINE PLATFORM
Philipp Leitner, Graz University of Technology, Graz, Austria, e-mail: philipp.leitner@tugraz.at
Karin Maier, Graz University of Technology, Graz, Austria, e-mail: karin.maier@student.tugraz.at
Martin Ebner, Graz University of Technology, Graz, Austria, email: martin.ebner@tugraz.at
Abstract: Massive Open Online Courses (MOOCs) provide anyone with internet access the chance to study at university
level for free. In such learning environments and due to their ubiquitous nature, learners produce vast amounts of
data representing their learning process. Learning Analytics (LA) can help identifying, quantifying and
understanding these data traces.
Within the implemented web-based tool, called LA Cockpit, basic metrics to capture the learners’ activity for the
Austrian MOOC platform iMooX were defined. Data is aggregated in an approach of behavioral and web analysis
as well as paired with state-of-the-art visualization techniques to build a LA dashboard. It should act as a
suitable tool to bridge the distant nature of learning in MOOCs. Together with the extendible design of the LA Cockpit, it
shall act as a future-proof framework to be reused and improved over time.
Aimed towards administrators and educators, the dashboard contains interactive widgets letting the user explore
their datasets themselves rather than presenting categories. This supports data literacy and improves the
understanding of the underlying key figures, thereby helping them generate actionable insights from the data. The
web analytical feature of the LA Cockpit captures mouse activity in individual course-wide heat maps to identify
regions of learners’ interest and help separate structure and content. Activity over time is aggregated in a
calendar view, making temporally recurring patterns visible that would otherwise not be deducible.
Through the additional feedback from the LA Cockpit on the learners’ behavior within the courses, it will become
easier to improve the teaching and learning process by tailoring the provided content to the needs of the online
learning community.
Key words: MOOC, Learning Analytics, Learning Dashboard, Online Learning, Visualization
1. INTRODUCTION
The internet, as a provider of information and educational material, plays a central role in the
ubiquitous learning environments we all live in nowadays, thereby drastically changing how
learning takes place now and in the future. Identified as the future of education (Billsberry, 2013),
Massive Open Online Courses (MOOCs) have attracted a lot of interest in the last decade (McAuley et al.,
2010). MOOCs provide anyone with internet access the opportunity to participate in online
courses on a university level for free. Because of high demand, MOOC platforms have to deal
with a large audience of a wide variety of people from all over the world (Romanowski and
Konak, 2016). Despite the potential of this new learning format, there are also some
challenges. While teachers can observe their students in a traditional learning environment and
respond appropriately and immediately when needed, they are not able to do so in an online
environment, especially with a large number of participants. Therefore, a legitimate step is to
observe and analyze learners’ data in this new environment, to grasp and understand this new way
of learning and to improve the underlying process. In this intersection of various academic fields
such as education, psychology, and computer science, the term Learning Analytics
(LA) was coined (Dawson et al., 2014). The goal of LA is to understand learning itself and the
environment in which learning occurs, but additionally it can also be seen as the approach to
optimize these factors (Verbert et al., 2013).
Although LA is a relatively new research field, one important outcome from previous research
may be that there is no “one size fits all” LA solution (Blikstein, 2013). Therefore, a requirements
analysis involving the stakeholders, the university, and the platform helps guarantee successful
deployment and valuable results. In their literature review, Leitner, Khalil & Ebner (2017)
categorized the involved stakeholders into learners, teachers and researchers/administrators.
Although the learners are the main target group when talking about learning, our dashboard was
specifically designed to support teachers and administrators to understand how learning is
happening.
To achieve this, however, it is necessary to take a closer look at the activities of learners. The
data records for LA come directly from the Learning Management System (LMS) used, where
information such as the number of downloads or accesses to the system can be generated. Stored
as log files or numbers in a database, this data and its sparse presentation may not be sufficient to
answer the research question of how learners use MOOCs. A chain of processing steps is
necessary to receive a human interpretable representation; this starts with identifying the traces
left behind by learners, through data aggregation techniques within this learning environment, to
data modeling, and the definition of key figures and metrics (Duval, 2011).
The Web Analytics (WA) plugin presented in this research work performs these steps by
capturing the learner's interactions with the provided resources in selected courses on the
Austrian MOOC platform iMooX, founded in 2013 (Kopp & Ebner, 2015). In addition, suitable
indicators are defined which are to be presented to the stakeholders. These are encapsulated as
widgets and integrated into a LA dashboard named LA Cockpit.
This approach provides the opportunity for a sophisticated view on how learners interact with the
learning material offered in MOOCs. Through behavioral analysis as well as associated metrics
combined with the educators’ experiences from face-to-face teaching, the dashboard supports
teachers in the decision-making process on where to act and how to improve the learning process
in general.
Taking into account guidelines and best practices from our previous research (Maier, Leitner &
Ebner, 2019), we have extended our framework to include also web analytics. Our overall goal
for the LA Cockpit is to close the information gap that teachers in MOOCs have compared to real
classroom learning situations and to examine what can be derived from recorded activity traces in
online learning environments. For integration into the existing framework, a subset of possible
metrics was designed, for which the plugin explores appropriate visualization tools covering the
engagement and behavior of the captured learners. Further, the plugin aims to provide means of
evaluation to improve the quality of the offered MOOCs by adapting the content and presenting it
to the MOOC community.
Therefore, our main research question follows the goal of how such a LA dashboard has to look
in order to assist teachers and other educators in understanding the learning process of their
learners, so that they can improve teaching and learning behavior within a MOOC platform.
2. RELATED WORK
The reuse of established tools from various research fields for educational data traces has been
increasing in recent years. These visualization strategies use charts, graphs or maps as
presentation technologies for digital dashboards (Elias, 2011) which have been successfully
adapted using educational data (Jivet, 2016; Charleer et al., 2017). These learning dashboards
have proven to be effective tools in aiding teachers and learners in the context of the learning
process.
Based on the findings and earlier studies on the design of learning dashboards (Leitner & Ebner,
2017; Khalil, Taraghi & Ebner, 2016), three recurring ideas can be worked out:
1. Relevant metrics: A crucial step is finding suitable metrics for the target group. If
this is not done properly, the tool becomes overloaded or the user is discouraged,
rendering the use of LA pointless.
2. Visuals: To make complex relationships understandable and visible to the users, it is
essential to use appropriate colors for the different visualization types. Therefore, it is
necessary to apply basic principles of interface design and ensure that aggregated data is
not falsified.
3. Interactivity: Different views or filter options increase the usefulness of the tool for the
users and appeal to their curiosity. Interactivity is preferred over static, isolated
numbers.
Furthermore, a suitable system has to be found which also supports the requirements of online
use. Learning Management Systems (LMS) are a good choice because they offer administration
and serve as a provider for online resources. The market provides various paid as well as cost-free
alternatives, with the option of self-hosting on the university’s own infrastructure or as a Software
as a Service (SaaS) cloud instance.
As part of the cost-free alternatives, Open Source implementations are particularly attractive.
Three very popular products compete with each other: Moodle, Open edX and Canvas. All
three provide basic logging capabilities and visualization options. However, teachers and
administrators have to work with log files for more specific metrics and key figures if they want
to go deeper into the data. For the product providers, it is a great additional effort to offer functions
that go beyond basic statistical figures and reports. Therefore, the ideas and concepts of
LA are only slowly finding their way into the software. For example, Moodle offers various dashboard
plugins with LA capabilities. Unfortunately, some integrate connections to external servers or
promote their additional paid content.
Other plugins and projects, such as Analytics Graphs for instructors, are not applicable to a
broader range of possible users. They lack analytical features and customization options, or cover
only specific use cases in their implementation.
All these dashboards are working with data produced by learners. Additional constraints for
collecting, storing and transferring this personal data apply. The increasing volume of data, often
a by-product of online interactions, has brought new perspectives on privacy and property. The
ownership of data has become a hot topic in recent years. Individuals started to claim the “right
to be forgotten” (Elias, 2011) and people started to question the “almighty” algorithms over bias
and validity. This rising awareness made it necessary to think about potential risk and benefits.
This has recently been legally manifested in the General Data Protection Regulation (GDPR).
The results are publications about guidelines, best practices and good working examples. It is
necessary to think about inconvenient questions regarding privacy and ethical usage before
applying algorithms and tools to the data. This can be done by agreeing on an ethical framework
or checklist, such as the one by Greller and Drachsler (2016), when dealing with learners’ data.
Further, Khalil and Ebner (2016a) dealt with the challenges LA is facing and also pointed out the
possibility of de-identification of learner’s data (Khalil & Ebner, 2016b).
If a researcher wants to use LA, the rights of the data subjects must be considered. Openness
about intentions, a clear distinction of which data shall be collected for which purpose, and
storage and access rights backed by state-of-the-art software and security standards are some
points to think about. They need to be discussed with all stakeholders. Further training of
academic staff is needed to ensure that all standards are met. Furthermore, despite the promises
and benefits of LA, it is necessary to discuss the criticisms of LA metrics, such as the loss of
control over data traces. To mitigate the risks and find a compromise between the benefits and
drawbacks, the DELICATE checklist can be used
(Drachsler & Greller, 2016). As a consequence, dashboards should not only comply with the
minimal requirements given by law or agreements from the institution. Rather, it is necessary
to think about the consequences of displaying metrics, classifications and visualizations from the
early stage of the design phase.
WA is used to obtain key information about the behavior of users on websites. Rohloff et al.
(2019) discussed the possibility of using WA without compromising the learners’ data privacy. In
their test setting, they integrated Google Analytics in a MOOC as a proof of concept. The study
showed that WA can provide useful insights and retrieve a large part of the metrics relevant in the
context of LA for the stakeholders. Especially key performance indicators (KPIs) are easier to
obtain from WA tools than, e.g., learner-specific metrics, because WA is not designed to retrieve
user-level information or to provide LA data to individual students (Rohloff et al., 2019).
3. CONCEPT OF THE LA COCKPIT
The first version of the LA Cockpit was completed at the end of 2017 and entered the evaluation
and test phase in an academic test environment the following year (Maier, Leitner & Ebner,
2017). Designed as a plugin for the LMS Moodle, the initial concept included requirements such
as simple maintenance and a modular, configurable system design. The target group was limited
to administrators of the LMS. The focus was on demonstrating that Learning Analytics plugins
can be used with open source resources and serve as a basic source repository for quick and easy
extension.
The LMS Moodle already collects basic statistics about the system as well as data on interactions
between participants with the learning objects and stores this information in the database. The
first version of the LA Cockpit used these database tables to group and aggregate data on a daily
basis. These basic metrics were encapsulated and visualized through widgets. The MOOC
administrator could add or delete these widgets. Besides presenting different visualization
methods, the dashboard should also serve as a starting point for other key figures. The metrics
showed system-wide key figures from the LMS. We interviewed the stakeholders and together
compiled a list of feature requests. This list was decisive for the revision of the dashboard.
The new and extended version of the LA Cockpit is based on these existing daily aggregation
mechanisms. Furthermore, it is extended by additional data traces from outside the learning
management system. Capturing interaction within the learner's browser environment enriches the
data available in the LMS and provides opportunities for new metrics and widgets. The next
section describes the basics of behavioral analysis and activity measurement. Further design
changes, improvements to the LA Cockpit and a new feature later called Web Analytics (WA)
plugin are discussed in the next sections.
3.1 Activity Measurement
When designing learning analytics tools, the focus lies on how the content is going to be
displayed and which key figures should be provided. In the environment of many online learners
spread across different courses, the contact and interaction between teachers and learners is
fundamentally different from the traditional face-to-face environment. Teachers receive feedback
from the learners often only through their final grade or explicitly requested responses.
For additional, implicit feedback of the interaction process the teacher must be able to rely on
features of the LMS, but simple statistics of the system do not sufficiently reflect the actions of
learners. The LA Cockpit should re-enact this cognitive connection between teachers and learners
and allow teachers to use their pedagogical knowledge to work with the displayed information.
In order to measure different activities, a closer look must be taken at the LMS itself. As data
sources, one can access the LMS resources in the form of database tables, logged events and
records of technical processes such as download counts. This is often not enough to capture the
manifold ways of learning, so a number of research studies (Blikstein, 2013; Spikol et al., 2017) try to
make use of additional information (or multi-modal data) such as speech, writing or non-verbal
interaction (e.g. movements, gestures, facial expressions, gaze, biometrics, etc.) during real
learning activities, enlarging the data traces from which metrics are created. The internal state of the
learning process is quantified by capturing its external representation.
The focus of the LA Cockpit is on aggregated metrics rather than individual, single and absolute
values. The main goal is to create a context for teachers without classifications or complex
predictive modelling. With the help of the Web Analytics Plugin the LA Cockpit hopes to
visualize - not only quantify - activities within a course in an aggregated mirrored view from the
user’s perspective. Insights into what the learners are doing within a MOOC and what resources
they are accessing can provide a starting point for further research.
The next section discusses the details of the behavioral analysis approach with the Web Analytics
plugin. The basic technical background of the LA Cockpit and the building blocks of the
technology within its environment are covered in section 4.
3.2 Web Analytics
The LA Cockpit provides means of measuring, identifying and visualizing the behavior of
MOOC participants with an additional plugin built within this research study. Applying Learning
Analytics only with the resources from the Moodle system is not enough, especially in the
context of MOOCs. In a face-to-face classroom situation, teachers can observe, infer and act
upon the learners’ behavior. The following questions arise:
– Do students struggle to find a certain resource?
– Do they need more time than expected?
– Do they answer their quizzes by going back and forth between video lectures and quiz
questions?
In the online environment of MOOCs, the providing platform has no timely analysis capabilities
of the interaction on the client side. Therefore, the Web Analytics plugin tries to capture the
interaction within the browser window, aggregating the interaction and offering an additional
data source from the learner’s perspective.
Figure 1 provides an overview of the involved resources. The aggregation of data does not
happen user-wise, but resource-wise beginning with the clients’ browsers.
[Add Fig. 1 here]
Figure 1: Resource Interaction of the LA Cockpit
Within Moodle, multiple pages, distinguishable via their URL, represent one specific online
learning course. Each URL is considered a single resource. This means that, for each accessed
course page, the activities of the learners are logged and aggregated daily into the LA Cockpit
database.
Using the web browser to access the learning resources, interaction can happen via different types
of input devices: mouse, keyboard, touch or speech input, whereas mouse and keyboard
interaction are considered the standard input devices. With mobile devices, the mouse is replaced
by touch input, a physical keyboard is simulated with a virtual one. The way those input devices
are used can relate to our cognitive processes and also depends on the presentation of the content.
From a technical point of view, behavior can be categorized into different events happening
within the system. These need to be interpreted by the browser to react accordingly, e.g. a click
on a button opens a pop-up. The triggered event gets forwarded and processed by the browser,
where the WA plugin aggregates different types of events. The following events are aggregated
with their timestamp attached:
– Mouse Movement: aggregation of changing x and y coordinates of the mouse
pointer.
– Click: a click event (pressing of a button followed by a release) as well as the target
element upon which the event happened (e.g. button, link).
– Key Event: the timestamp of the key event and whether any special keys were pressed
(Shift, Alt, Control).
– Scroll Depth: the scroll depth is saved at a regular interval. This refers to the
calculated percentage of the page the users have scrolled to, where the top of the
webpage is considered 0% and the bottom 100%.
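As an illustration, the client-side capture of these event types might be sketched as follows. The function names, the event buffer and the payload shape are assumptions made for this sketch, not the WA plugin’s actual code; the scroll-depth helper follows the 0%–100% definition given above.

```javascript
// Sketch of client-side capture for the event types listed above.
// logEvent() and its payload shape are illustrative assumptions.
const buffer = [];

function logEvent(type, data) {
  buffer.push({
    type,
    timestamp: Date.now(),        // Unix timestamp of the event (ms)
    eventTime: typeof performance !== 'undefined'
      ? performance.now() : 0,    // ms since document creation
    ...data,
  });
}

// Scroll depth as a pure helper: top of page = 0%, bottom = 100%.
function scrollDepth(scrollY, viewportHeight, pageHeight) {
  const max = pageHeight - viewportHeight;
  return max > 0 ? Math.round((scrollY / max) * 100) : 100;
}

// Listener registration only runs in a browser environment.
if (typeof document !== 'undefined') {
  document.addEventListener('mousemove', (e) =>
    logEvent('move', { x: e.pageX, y: e.pageY }));
  document.addEventListener('click', (e) =>
    logEvent('click', { target: e.target.tagName }));
  document.addEventListener('keydown', (e) =>
    logEvent('key', { shift: e.shiftKey, alt: e.altKey, ctrl: e.ctrlKey }));
  setInterval(() => logEvent('scroll', {
    depth: scrollDepth(window.scrollY, window.innerHeight,
                       document.documentElement.scrollHeight),
  }), 1000);
}
```

In a real deployment, the buffer would periodically be flushed to the server for the resource-wise daily aggregation described above.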
The main data source is mouse movement. A mouse movement can be defined as a continuous
event sampled at consecutive points in time with corresponding x and y coordinates, creating a
discrete data trace over time. All visually guided movements (e.g. selecting, pointing, clicking)
are formed through gestures with the mouse device.
In the WA plugin’s database, a mouse movement is described by consecutive logged entries. The
database field id refers to a consecutive log number, and timestamp is the Unix timestamp of the
triggered event, whereas event time is the JavaScript-generated timestamp. The latter is counted
from zero, defined as the creation of the web document, and is reset with a reload of the web
document. The WA plugin saves both for redundancy reasons. The position of the mouse event is
given by its values x-pos and y-pos, calculated from the coordinate system whose origin (0, 0)
lies at the top left corner of the web document. For the metrics, the number of database entries
grouped by these coordinates is used to generate the heatmap value.
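The per-coordinate count that feeds the heatmap can be sketched as a simple grouping step. The field names xPos/yPos mirror the description above, while the function itself is illustrative rather than the plugin’s actual implementation:

```javascript
// Group logged mouse entries by (x, y) to obtain a per-coordinate count,
// as described for the heatmap metric. Field names follow the text
// (x-pos/y-pos); the real database schema may differ.
function heatmapValues(entries) {
  const counts = new Map();
  for (const { xPos, yPos } of entries) {
    const key = `${xPos},${yPos}`;
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  return counts;
}
```

In practice, coordinates would typically first be binned into larger cells so that near-identical positions contribute to the same heatmap value.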
There are many different ways to use and consume web content; mouse movement analysis can
provide the necessary data for the goal of the WA plugin to identify overlapping regions of
interest. Especially as this collected data results from real-life situations and not from a controlled
lab setting, previous research results about correlations cannot be transferred directly.
3.3 Metrics and Visualization
The aggregated data from the WA plugin still needs some further refinements before becoming
usable within the widgets of the LA Cockpit. It might not be necessary or helpful to display all
raw data in every detail. The target group should get widgets which are easy to interpret and to
understand. Therefore, meaningful subsets of the information have been agreed upon and the
activity data will be represented with three new additional widgets: Device Statistics, Activity
Calendar and Heatmap.
These should provide the teacher with a starting point for discussions on how students interact
with learning resources. The web analysis function is intended to enable researchers to create
additional analytical functions of the LA Cockpit related to this behavioral analysis data
set. Each metric provides a different perspective on the aggregated interactions; the foundations
of the metrics and their visualizations are explained below.
4. IMPLEMENTATION
4.1 Device Statistics
In the evolution of the Web, the Internet began with a text-based system in which users navigated
by entering commands. Nowadays, browsers perform this task for the user. When accessing
resources, the browser acts as an agent and turns the user’s actions into commands. When loading
resources, the browser on the client side identifies itself to the server with the User-Agent string.
In HTTP, the User-Agent string is used for content negotiation. The format is a list of product
tokens/keywords, with the most important listed first. The Accept-Language header of the request
specifies which languages the client can understand and which language is preferred, reflecting
the language set in the browser user interface.
For the WA plugin, this information is used within the Device Statistics widget (figure 2). The
character string of the browser user agent is stored the first time the user accesses the course.
Afterwards, the server analyzes all available data sets and returns sorted subsets to the dashboard
for visualization.
[Add Fig. 2 here]
Figure 2: Device Statistics Widget
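A rough sketch of how User-Agent strings could be classified for such a widget is shown below. Real User-Agent strings are notoriously inconsistent, so a production implementation would rely on a maintained parser library; the token checks here are simplified assumptions:

```javascript
// Very rough User-Agent classification for a device-statistics metric.
// Token order matters: Android UAs also contain "Linux", iPhone UAs
// contain "like Mac OS X", and Chrome UAs contain "Safari", so the
// more specific tokens are checked first. Illustrative only.
function classifyUserAgent(ua) {
  const os =
    /Windows/.test(ua) ? 'Windows' :
    /Android/.test(ua) ? 'Android' :
    /iPhone|iPad/.test(ua) ? 'iOS' :
    /Mac OS X/.test(ua) ? 'macOS' :
    /Linux/.test(ua) ? 'Linux' : 'Other';
  const browser =
    /Edg\//.test(ua) ? 'Edge' :
    /Firefox\//.test(ua) ? 'Firefox' :
    /Chrome\//.test(ua) ? 'Chrome' :
    /Safari\//.test(ua) ? 'Safari' : 'Other';
  return { os, browser };
}
```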
4.2 Activity Calendar
For the Activity Calendar widget (figure 3), a basic aggregation of mouse events is performed.
The count value of the activities for each day is calculated directly from the data traces in the
database. All available events are stored on a daily basis as the calendar provides a daily
overview. The year view is calculated from the current date and gives an overview of past
activities within the last twelve months. The metric is aggregated on the server side according to
the request sent after selecting a course to display the data.
[Add Fig. 3 here]
Figure 3: Activity Calendar Widget
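The daily aggregation behind the calendar can be sketched as follows. The function and the use of ISO date strings as day keys are assumptions of this sketch, not the plugin’s actual schema:

```javascript
// Aggregate event timestamps (ms since epoch) into per-day activity
// counts, as the Activity Calendar widget does. The ISO date string
// (YYYY-MM-DD, in UTC) serves as the day key in this sketch.
function dailyActivity(timestamps) {
  const perDay = {};
  for (const t of timestamps) {
    const day = new Date(t).toISOString().slice(0, 10);
    perDay[day] = (perDay[day] || 0) + 1;
  }
  return perDay;
}
```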
4.3 Heatmap
The Heatmap widget, shown in figure 4, consists of two main parts. The activity data itself,
visualized as traces ranging between red and green, is aggregated by the WA plugin by analyzing
the users’ mouse activity. Since these widgets have to visualize complex relationships, the LA
Cockpit uses state-of-the-art technology such as the D3.js framework. Without the displayed
course URL, these mouse traces would be difficult to interpret. Therefore, it is necessary to
provide the information layer presented to the user. The background image of the widget puts the
captured data into the correct position; an elaborate process creates the screenshots for this.
The widget itself then presents only the list of URLs for which background images are available
to the target user.
[Add Fig. 4 here]
Figure 4: Heatmap Widget
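The red-to-green coloring of the traces can be illustrated with a small helper that maps a normalized activity intensity to a color. The actual widget uses D3.js scales; this standalone sketch only shows the underlying idea:

```javascript
// Map a normalized activity intensity (0..1) to a green-to-red color,
// similar to the scale used for the heatmap traces. The real widget
// relies on D3.js; this is an illustrative stand-in.
function intensityToColor(t) {
  const clamped = Math.min(1, Math.max(0, t));
  const r = Math.round(255 * clamped);        // more red with activity
  const g = Math.round(255 * (1 - clamped));  // more green when quiet
  return `rgb(${r},${g},0)`;
}
```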
5. DISCUSSION
The revised and extended LA Cockpit for the Austrian MOOC platform iMooX was deployed for
the first time during the MOOC “LawBusters – Drei Themen Recht.humorvoll”. It started at the
end of December 2018 and featured three weeks of video lectures, which dealt with law related
basics in an entertaining way through analogies from science fiction and fantasy. Nearly 90
users took part in the course; our goal was to provide a proof of concept for the LA
Cockpit as well as to collect feedback from the participants.
5.1 First Evaluation results
The feedback on the LA Cockpit was collected via online surveys and was generally satisfying.
Especially the basic concept of multiple dashboards proved beneficial when grouping different
widgets to the individual liking of the target user, such as on a course basis. Yet, when managing a
larger number of courses, the name alone turned out to be an insufficient distinction. Additional
details such as the date of creation could help users find the desired dashboard faster and so
improve their satisfaction with the tool.
We received similarly positive feedback on the PDF report. It seems that despite the digital era,
some teachers prefer paper over digital reports, or at least the possibility to print them out. An
additional motivation might be the ability to share course-related information without an elaborate
access and authorization process for the LA Cockpit. The information in the note section also
received broad reception and is considered to improve the understanding of the different widgets.
The possibility of adapting those texts to document one’s own findings and observations proved
useful. Users adapted the text even for basic information related to the displayed metrics or
visualizations. In our case with the LawBusters course, the Christmas period left a distinct decline in activity on the
platform, shown in figure 5. Even such general remarks can find their place in the note area and
be used as documentation for later comparisons.
[Add Fig 5 here]
Figure 5: Login over Time Widget with notes
The user behavior captured by the WA plugin provides a large amount of information.
At the moment, only a subset of this information is visualized in our metrics. The target group of
researchers and administrators suggested that further details about the in-course user experience
should be visualized. One particular example was keyboard usage in the context of the forum,
which could give information about search bar usage to access specific learning materials.
Nonetheless, the provided metrics were considered helpful by the target group and gave
interesting insights to the underlying data.
For the user group of administrators, the distribution of the operating systems and internet
browsers used was relevant, as it supports the testing and optimization of the platform.
Teachers were more interested in the specific language settings of the participants. In our case,
two thirds of the participants accessed our system with German language settings, whereas the
remaining third exclusively used English with American locales as their primary browser settings.
5.2 Limitations
The evaluation of the LA Cockpit also revealed some limitations. Besides keyboard and mouse, the input devices used on the system were frequently touch-based and, depending on the device type, such as mobile phone or tablet, need to be handled differently. Web applications may either process touch-based input through dedicated touch events or receive it as interpreted mouse events. Our WA plugin was designed to collect and process only mouse and mouse-interpreted events, and thus touch-only events were not or only partially recorded. In addition, it is technically complex to completely cover the dependencies on the various combinations of operating systems and Internet browsers, including their different versions.
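A sketch of the kind of normalization that would close this gap: map both mouse-style and touch-style events onto one common pointer sample, so touch-only devices are recorded as well. The event shapes below are simplified stand-ins, not the real DOM interfaces or the plugin's actual code.

```typescript
// A normalized pointer observation, regardless of input device.
interface PointerSample { x: number; y: number; t: number; source: "mouse" | "touch"; }

// Simplified event shape covering the fields both event families carry.
interface RawEvent {
  type: string;
  clientX?: number;
  clientY?: number;
  touches?: { clientX: number; clientY: number }[];
  timeStamp: number;
}

function normalizeEvent(e: RawEvent): PointerSample | null {
  if (e.type.startsWith("mouse") && e.clientX !== undefined && e.clientY !== undefined) {
    return { x: e.clientX, y: e.clientY, t: e.timeStamp, source: "mouse" };
  }
  if (e.type.startsWith("touch") && e.touches && e.touches.length > 0) {
    // Use the first active touch point as the pointer position.
    const first = e.touches[0];
    return { x: first.clientX, y: first.clientY, t: e.timeStamp, source: "touch" };
  }
  return null; // unsupported event type: ignored rather than misrecorded
}
```

Both event families then feed the same aggregation pipeline, so heatmaps and activity metrics would no longer silently omit touch-only sessions.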
A second limitation concerns the sampling of mouse events. User interaction takes place continuously in real time, while each logged event is represented by a discrete timestamp. The sampling rate of the data stream generated by the tool is influenced by several factors. The first is the input device itself. A computer mouse has a polling rate, measured in Hertz (Hz), with a corresponding polling interval; together they define how often the position is reported to the computer. For example, a polling rate of 125 Hz means that the mouse position is sent to the computer every 8 milliseconds. In addition, the discretely sampled representation of the mouse movement that triggered the event within the web content can add another layer of inaccuracy. The mouse event is intercepted through JavaScript in the WA plugin, where microsecond-resolution event times would theoretically be feasible. However, to mitigate timing-based security threats such as Spectre (Kocher et al., 2019), browsers round queried timestamps to varying degrees, so exact profiling of users is not feasible. The assumption that the entire data trace of a mouse movement is sampled at a lower rate, however, has no negative effect on the data aggregation and visualization of the behavioral analysis in the LA Cockpit.
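The effect of a limited sampling rate can be illustrated with a minimal downsampling sketch (illustrative names, not the plugin's actual implementation): events arriving closer together than a minimum interval are simply dropped, which thins the trace but preserves the aggregate picture the metrics are built from.

```typescript
// A single logged pointer event with a (possibly rounded) timestamp in ms.
interface TimedEvent { t: number; x: number; y: number; }

// Keep an event only if at least minIntervalMs has passed since the last
// kept event. Coarser timestamps only shift which events survive, not the
// overall shape of the aggregated trace.
function downsample(events: TimedEvent[], minIntervalMs: number): TimedEvent[] {
  const kept: TimedEvent[] = [];
  let lastT = -Infinity;
  for (const e of events) {
    if (e.t - lastT >= minIntervalMs) {
      kept.push(e);
      lastT = e.t;
    }
  }
  return kept;
}
```

With a 125 Hz device, events arrive roughly every 8 ms, so downsampling to a 8 ms minimum interval leaves the trace essentially unchanged, while coarser intervals trade detail for data volume.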
A third limitation relates to the fact that there are different ways, depending on cognitive processes and personal characteristics, to achieve the same goal, such as downloading a learning resource or accessing a video lecture. For example, while browsing the webpage, one user's mouse pointer may rest like an anchor at the top of a paragraph. Another user may mark passages along the text while reading in order to copy them later. Using keyboard shortcuts to scroll or browse through the web page would likewise leave the mouse at a position that does not correlate with the center of the user's visual attention.
Several studies in the areas of mouse tracking, mouse movements and behavioral analysis have
shown that the mouse pointer can act as a weak proxy of the gaze (Arapakis, Lalmas and
Valkanas, 2014) and offers a cost-effective alternative to eye tracking (Huang, White and
Buscher, 2012). The strength of the correlation depends on the design of the website (Clark and
Stephane, 2018). It is therefore important to note that mouse motion analysis is not a full substitute for eye tracking studies. Eye tracking equipment, however, is much more expensive and requires a predefined laboratory setting, and these environmental requirements are not transferable to the target application of the LA Cockpit. Nevertheless, mouse activity provides a
suitable data source for checking the design of web pages and evaluating user activity in specific
areas. For the WA plugin, these data traces are visualized within metrics to provide insight into
remote processes that would otherwise not be observable.
6. CONCLUSION
The LA Cockpit, a custom LA dashboard, was revised, extended by WA, and deployed on the Austrian MOOC platform iMooX. It combines the collection, transformation and visualization of
data produced in the learning environment. Additionally, the focus during the design and
implementation of the LA Cockpit was on a modular framework and thereby its extendibility and
maintainability. Because of the complexity of LA approaches, the LA Cockpit has a number of
tools and offers a highly modular dashboard that can be adapted to the different needs of the
target groups.
The LA Cockpit contains basic key figures related to activities in the LMS itself. Through the extension with the WA plugin, it is possible to analyze the behavior of the participants and thereby let the target group infer how learners interact with the course and its materials. To this end, the WA plugin uses a set of metrics to capture user activity. This behavioral analysis is done by aggregating different traces of the user in the browser. Three widgets aggregating a series of events and actions are designed to visualize the metrics in the dashboard. The Device Statistics widget provides statistical information about the devices used, browser versions, and language settings. The Activity Calendar widget adds a temporal visualization element to the data: displayed as daily calendars for the last 12 months, differently colored fields match the activity level. With this view it is possible to uncover recurring patterns of user contributions in online learning courses that may have gone unnoticed until now. The Heatmap widget uses the traditional concept of mouse activity heatmaps, which display areas of movement in different colors, from low-activity areas in cold colors like blue to high-activity areas in red. This type of widget is often used as a tool to check the usability of websites in terms of their design, and it provides another dimension of information. The most commonly used resources and regions of high interest can be visually inspected, giving teachers and administrators a quick overview.
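The core of such a heatmap aggregation can be sketched as bucketing pointer positions into a coarse grid of counts, which a widget would then map onto a cold-to-hot color scale. This is a minimal sketch with illustrative parameter names, not the widget's actual code.

```typescript
// Bucket pointer positions into a rows x cols grid of visit counts.
// width/height are the tracked page dimensions, cell the bucket edge
// length, all in pixels.
function toGrid(points: { x: number; y: number }[],
                width: number, height: number, cell: number): number[][] {
  const cols = Math.ceil(width / cell);
  const rows = Math.ceil(height / cell);
  const grid: number[][] = Array.from({ length: rows }, () => Array(cols).fill(0));
  for (const p of points) {
    // Clamp to the last cell so border coordinates stay in range.
    const col = Math.min(cols - 1, Math.floor(p.x / cell));
    const row = Math.min(rows - 1, Math.floor(p.y / cell));
    grid[row][col] += 1; // each recorded position "heats" its cell
  }
  return grid;
}
```

Rendering then reduces to normalizing the counts and assigning each cell a color, e.g. blue for low and red for high values.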
All visualizations within the tool follow the core guidelines of dynamic and interactive presentation. With this concept it is possible to let users explore the data themselves through the visualizations instead of presenting indicators that are difficult to understand and interpret. This exploration phase is crucial to enable users to understand relationships that would otherwise remain hidden in large datasets or blurred by aggregated averages. All metrics rely on course-wide aggregation of data, focusing on the learning process rather than on the individual's learning track. The resulting additional privacy does not affect the quality of the information provided by the tool.
In order to better understand the handling of the LA Cockpit and to include the opinions of the stakeholders, an evaluation of its usability and usefulness was carried out. In particular, the target group of researchers and administrators provided valuable suggestions and further ideas for improving the LA Cockpit. These feature requests focused on the metrics and the data rather than on the widgets themselves. Nevertheless, there were common interests, such as adding touch compatibility, including video analysis and metrics, exploring the aggregated data with knowledge discovery methods, and providing a thorough evaluation.
Furthermore, there is a great need for a comprehensive evaluation of the LA Cockpit with a significant test user group. The next step in the development cycle would be to gather feedback from the target group not only on what information they would like to receive, but also on how to present it in a useful way. The tool should be evaluated in terms of interface design, usability, user-friendliness, and the content displayed. This may be supported by technical improvements, where adding new widgets will be a quick next step in supporting future use of the LA Cockpit.
With the LA Cockpit and the WA Plugin a suitable framework for Learning Analytics was
created. Such tools are essential to close the information gap between learners and teachers in
pure online courses. Further research on this topic will prove to be advantageous as the results
can be transferred to general e-learning environments that are gaining importance.
Developers are encouraged to add more metrics, expand the widget repertoire, or even transfer the LA Cockpit to other target groups, e.g. by shifting the metrics to the learners. A student-oriented dashboard version of the LA Cockpit could be created by reusing the core components for aggregation and visualization. Finally, the tool provides quantifiable insight into learners' behavior and learning process within MOOCs on the iMooX platform.
Further research and development can improve the widgets and add more features to the LA Cockpit. Useful additions can be derived directly from the evaluations of the courses. Video lectures and quizzes motivate key figures such as the number of video lessons watched or the minutes of video lectures consumed. Since video lectures are a core component in the transmission of learning content, the analysis of video consumption is the next promising step. This could extend the behavioral analysis of the WA plugin and provide the opportunity to gain further insight into the consumption of video content within MOOCs. After an in-depth evaluation of the interface design, customizing the visualization of the existing metrics and providing alternative visualization types offers the opportunity to increase usability and satisfaction.
The more features and options an LA tool offers, the more important a clear explanation of the displayed data, key figures, and visualizations becomes. A Frequently Asked Questions (FAQ) section, which includes background information about LA, metric calculations, and design decisions for the widget visualizations, could be useful for the target user group.
Future research and improvements of the LA Cockpit should not only help learners in their learning process and close the feedback loop, but also close the gap between learning, teaching and research. As an actively used tool on platforms such as iMooX, it has the opportunity to provide qualitative insights into the application of LA among its target groups. The research community benefits from the LA Cockpit because most tools never leave their prototype stage. With these next possible steps, we aim to improve the feature set, but even more to re-establish the information channel between learners and teachers in online learning environments. Thereby, teachers get the opportunity to support and improve the learning process with current technologies and to educate a larger group of learners than the traditional classroom environment would allow. This approach reflects the fundamental objective of LA to improve all parts of the LA lifecycle (Khalil and Ebner, 2017) and let all stakeholders benefit from its application.
REFERENCES
Andres, J. M. L., Baker, R. S., Gašević, D., Siemens, G., Crossley, S. A., & Joksimović, S.
(2018). Studying MOOC completion at scale using the MOOC replication framework. In:
Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 71-
78).
Arapakis, I., Lalmas, M., & Valkanas, G. (2014). Understanding Within-Content Engagement
Through Pattern Analysis of Mouse Gestures. In: Proceedings of the 23rd ACM International
Conference on Conference on Information and Knowledge Management. CIKM ’14. (pp. 1439-
1448).
Blikstein, P. (2013). Multimodal learning analytics. In Proceedings of the third international
conference on learning analytics and knowledge. LAK´13 (pp. 102-106).
Billsberry, J. (2013). MOOCs: fad or revolution? In: Journal of Management Education 37 (pp. 739-746).
Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2017). Learning analytics dashboards to support adviser-student dialogue. IEEE Transactions on Learning Technologies (pp. 389-399).
Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014). Current state and future trends:
A citation network analysis of the learning analytics field. In Proceedings of the fourth
international conference on learning analytics and knowledge (pp. 231-240).
Drachsler, H., & Greller, W. (2016). Privacy and analytics: it's a DELICATE issue a checklist for
trusted learning analytics. In: Proceedings of the sixth international conference on learning
analytics & knowledge (pp. 89-98).
Duval, E. (2011). “Attention Please!: Learning Analytics for Visualization and
Recommendation.” In: Proceedings of the 1st International Conference on Learning Analytics
and Knowledge. (pp. 9-17).
Elias, T. (2011). Learning Analytics: The Definitions, the Processes, and the Potential.
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for
learning analytics. In: Educational Technology & Society 15 (pp. 42-57).
Huang, J., White, R., & Buscher, G. (2012). User See, User Point: Gaze and Cursor Alignment in
Web Search. In: Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems. CHI’12. (pp. 1341-1350).
Clark, J. W., & Stephane, A. L. (2018). Affordable Eye Tracking for Informed Web Design. In: Design, User Experience, and Usability: Theory and Practice (pp. 346-355). Cham: Springer International Publishing.
Jivet, I. (2016). The Learning Tracker A Learner Dashboard that Encourages Self-regulation in
MOOC Learners.
Khalil, M., & Ebner, M. 2015. Learning Analytics: Principles and Constraints. In: Proceedings of
ED-Media 2015 conference.
Khalil, M., & Ebner, M. (2016a). When learning analytics meets MOOCs-a review on iMooX
case studies. In: International Conference on Innovations for Community Services (pp. 3-19).
Khalil, M. & Ebner, M. (2016b) De-Identification in Learning Analytics. Journal of Learning
Analytics. 3(1). pp. 129 - 138
Khalil, M., Taraghi, B., & Ebner, M. (2016). Engaging Learning Analytics in MOOCS: the good,
the bad, and the ugly.
Kocher, P., Genkin, D., Gruss, D., Haas, W., Hamburg, M., Lipp, M., Mangard, S., Prescher, T., Schwarz, M., & Yarom, Y. (2019). Spectre attacks: Exploiting speculative execution. In: 40th IEEE Symposium on Security and Privacy (S&P'19).
Kopp, M., Ebner, M. (2015) iMooX - Publikationen rund um das Pionierprojekt. Verlag Mayer.
Weinitzen
Leitner, P., & Ebner, M. (2017). Development of a dashboard for Learning Analytics in Higher
Education. In: International Conference on Learning and Collaboration Technologies (pp. 293-
301).
Leitner, P., Khalil, M., & Ebner, M. (2017). “Learning Analytics in Higher Education - A
Literature Review.” In: Learning Analytics: Fundaments, Applications, and Trends.
McAuley, A, Stewart, B, Siemens, G. & Cormier, D. (2010) Massive Open Online Courses
Digital ways of knowing and learning, The MOOC model For Digital Practice. Retrieved from:
http://davecormier.com/edblog/wp-content/uploads/MOOC_Final.pdf (last access October 2019)
Maier, K., Leitner, P., & Ebner, M. (2019). “Learning Analytics Cockpit for MOOC Platforms.”
In: Emerging Trends in Learning Analytics.
Rohloff, T., Oldag, S., Renz, J., & Meinel, C. (2019). Utilizing web analytics in the context of
learning analytics for large-scale online learning. In: 2019 IEEE Global Engineering Education
Conference EDUCON (pp. 296-305).
Romanowski, B., Konak, A. (2016). Using Google Analytics to Improve the Course Website of a
Database Course. https://www.hofstra.edu/pdf/academics/colleges/seas/asee-fall-2016/asee-
midatlantic-f2016-konak.pdf - last accessed October 4th, 2019.
Spikol, D., Prieto, L. P., Rodríguez-Triana, M. J., Worsley, M., Ochoa, X., Cukurova, M. (2017).
Current and future multimodal learning analytics data challenges. In Proceedings of the Seventh
International Learning Analytics & Knowledge Conference (pp. 518-519).
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics
dashboard applications. American Behavioral Scientist, 57(10), (pp. 1500-1509).
... While there are some differences between LA, AA and EDM, they all share some common challenges. Numerous studies have reported implementation details of LA products; however, a recent study by Leitner et al. (2020) pointed out that they rarely provide comprehensive descriptions of challenges faced in productionizing these systems. This study shortlisted seven general challenges for deploying LA initiatives: ...
... We draw on recent literature to expand on two particular challenges above (2 and 7), and we tailor them to the difficulties which specifically relate to LAD projects. In addition, with supporting literature we posit an additional challenge, namely Agility, to the original seven identified by Leitner et al. (2020). ...
Article
Full-text available
This study investigates current approaches to learning analytics (LA) dashboarding while highlighting challenges faced by education providers in their operationalization. We analyze recent dashboards for their ability to provide actionable insights which promote informed responses by learners in making adjustments to their learning habits. Our study finds that most LA dashboards merely employ surface-level descriptive analytics, while only few go beyond and use predictive analytics. In response to the identified gaps in recently published dashboards, we propose a state-of-the-art dashboard that not only leverages descriptive analytics components, but also integrates machine learning in a way that enables both predictive and prescriptive analytics. We demonstrate how emerging analytics tools can be used in order to enable learners to adequately interpret the predictive model behavior, and more specifically to understand how a predictive model arrives at a given prediction. We highlight how these capabilities build trust and satisfy emerging regulatory requirements surrounding predictive analytics. Additionally, we show how data-driven prescriptive analytics can be deployed within dashboards in order to provide concrete advice to the learners, and thereby increase the likelihood of triggering behavioral changes. Our proposed dashboard is the first of its kind in terms of breadth of analytics that it integrates, and is currently deployed for trials at a higher education institution.
... At Graz University of Technology (TU Graz) the organizational unit Educational Technology has intensive experience in learning analytics and visualizations, including for our Austria-wide MOOC platform iMooX.at (Maier, Leitner & Ebner, 2019;Leitner, Maier & Ebner 2020), for the university-wide learning management system TeachCenter and through numerous international research cooperation (De Laet et al. 2018a, De Laet et al., 2018b. When students expressed the wish to get a better and easier overview of their study progress, we were happy to comply. ...
... The development of the dashboard was delayed by a few weeks over time due to the closure of the university in March 2020, as all resources were needed at short notice to provide the necessary technical support for emergency teaching (Ebner et al., 2020). Overall, however, in retrospect, the implementation took place quickly and smoothly, probably also due to the existing experience with similar projects. ...
Chapter
Full-text available
At Graz University of Technology (TU Graz, Austria), the learning management system based on Moodle (https://moodle.org/ – last accessed February 10, 2021) is called TeachCenter. Together with a campus management system – called TUGRAZonline – it is the main infrastructure for digital teaching and general study issues. As central instances for both teachers and students, various services and support for them are offered. The latest developments include the design and implementation of a study progress dashboard for students. This dashboard is intended to provide students a helpful overview of their activities: It shows their academic performance in ECTS compared to the average of their peers, their own study progress, and the official study recommendation as well as the progress in the various compulsory and optional courses. The first dashboard prototype was introduced to all computer science students in May 2020, and a university-wide rollout started in December 2020. The chapter describes design considerations and design development work, implementation, as well as the user feedback on the implementation. Finally, the authors present recommendations as guidelines for similar projects based on their experience and students’ feedback and give an outlook for future development and research.
... Ebenso wurde ein hochgradig konfigurierbares und interaktives Dashboard, das LA Cockpit entwickelt, um Manager/innen, Forscher/innen und vor allem Kursleiter/innen bei der Auswertung des Engagements und der Beteiligung von Kursteilnehmer/innen innerhalb eines MOOCs zu unterstützen (Leitner, Maier & Ebner, 2020). Darüber hinaus soll der Vergleich der individuellen Aktivität der Lernenden mit den kursweiten Durchschnittswerten die Selbsteinschätzung der Kursteilnehmer/innen verbessern, sie zur Teilnahme motivieren und die Abschlussraten erhöhen. ...
... [Place Fig. 1 Although we already have been conducting learning analytics at iMooX.at for several years and have published on the topic, e.g. [3][4][5][6], we have not regularly filed data on details of video watching. Only one of our papers analyzed video data and compared how learners in a MOOC deal with H5P-based interactive videos versus videos without such interactions [6]. ...
Chapter
Full-text available
Many MOOCs use units with videos and quizzes, where a successful attempt after several tries is the basis for a MOOC certificate. A first in-depth analysis of quiz behavior within a MOOC at the Austrian MOOC platform iMooX.at had shown several quiz attempts patterns (Mair et al. 2022). As a next step, the researchers now collected details on video watching within a new edition of the same MOOC and therefore could combine data on quiz and video behavior. This analysis shows similar distribution of the quiz attempt patterns as in our first analysis. Additionally, the analysis indicates that learners who completed more quiz attempts than needed for full point results or passing have a higher average video watching time than learners who only made attempts until reaching a full score or passing.KeywordsMOOC; quiz behaviorVideo behaviorLearningLearning analytics
... The role of big data in predicting future trends is set to rise in importance for many industries such as retailing (Bruni & Piccarozzi, 2022), health (Nguyen et al., 2021), digital marketing (Ponzoa & Erdmann, 2021), banking (Shirazi & Mohammadi, 2019), education (Leitner et al., 2020) and smart cities (Zeng et al., 2020). According to Malhotra (2022), "big data is humungous in volume, value, and variated data gathered from different sources, requiring further dissection and polishing using data science and data analytics for important inferences to be derived from it" (p. ...
Chapter
This chapter fundamentally aims at the development of generalized framework encapsulating a wide range of dynamic utility functional and resultant latent choice models. The objectives are served by the application of well cherished exponential family of distributions capable of entertaining numerous probabilistic articulations through a single comprehensive and elegant expression. Moreover, the utility of the proposed scheme is further substantiated by delineating the working pedagogy in accordance with the rapidly embraced Bayesian paradigm. The legitimacy of the devised mechanism in the pursuit of optimal decision-making is advocated with respect to diverse experimental states. We entertained varying extent of worth parameters describing the preference ordering, different sample sizes and distinguished stochastic formations to inject the prior information or historic data in the demonstration of choice behaviors.KeywordsChoice behaviorsLatent choice modelsPrior informationUtility functional
... The role of big data in predicting future trends is set to rise in importance for many industries such as retailing (Bruni & Piccarozzi, 2022), health (Nguyen et al., 2021), digital marketing (Ponzoa & Erdmann, 2021), banking (Shirazi & Mohammadi, 2019), education (Leitner et al., 2020) and smart cities (Zeng et al., 2020). According to Malhotra (2022), "big data is humungous in volume, value, and variated data gathered from different sources, requiring further dissection and polishing using data science and data analytics for important inferences to be derived from it" (p. ...
Chapter
Full-text available
The use of big data in decision-making has been burgeoning due to its significant potential to predict the possible shifts, tendencies and trends in social and economic behaviours, politics, health issues and consumer preferences. In recent years, particularly web analytics which employs an enormous amount of data in different web domains increased its popularity. Drawing on Google and social media data, this study investigates the trends and changes in Turkish women’s current and emerging consumer trends in fashion brands. Additionally, the differences in consumption patterns between two women consumer segments which are conservatives and liberals are identified. Search engine results pages and netnographic analyses findings reveal that a new lifestyle of conservative women emerged from the combination of religious codes and Western types of consumption, which may provide marketers with significant clues about future consumption trends, preferences and changes. The limitations and further research suggestions are highlighted at the end of the paper.KeywordsBig dataIdentity formationConsumer trendsWeb analyticsNetnography
... With the help of a dashboard teachers can also get information about their MOOCsfor example, when and how often the online courses are accessed or how many people posted how often in a discussion forum [23]. ...
Chapter
Full-text available
This paper discusses the general thesis that massive open online courses (in short MOOC), open educational resources (in short OER) and learning analytics are an impactful trio for future education, especially if combined. The contribution bases upon our practical experience as service providers and researchers in the department “Educational Technology” at Graz University of Technology (TU Graz) in Austria. The team members provide support to lecturers, teachers and researchers in these addressed fields for several years now, for example as host of the MOOC platform iMooX.at, providing only OER since 2015. Within this contribution, we will show, against some doubtful or conflicting opinions and positions, that (a) MOOCs are opening-up education; (b) learning analytics give insights and support learning, not only online learning, if implemented in MOOCs; and (c) that OER has the potential for sustainable resources, innovations and even more impact, especially if implemented in MOOCs.
Chapter
Full-text available
Schön, Sandra; Leitner, Philipp; Lindner, Jakob & Ebner, Martin (2023). Learning Analytics in Hochschulen und Künstliche Intelligenz. Eine Übersicht über Einsatzmöglichkeiten, erste Erfahrungen und Entwicklungen von KI-Anwendungen zur Unterstützung des Lernens und Lehrens. In: Tobias Schmohl, Alice Watanabe, Kathrin Schelling (Hg.), Künstliche Intelligenz in der Hochschulbildung, Bielefeld: transkript, S. 27-49. Online zugänglich unter: https://www.transcript-verlag.de/media/pdf/c9/16/59/oa9783839457696.pdf Erschienen unter der Lizenz CC BY SA 4.0 International (https://creativecommons.org/lice nses/by-sa/4.0/deed.de)
Chapter
During the last decades, the internet has become an increasingly important channel for businesses to sell products and communicate with customers. Web analytics helps companies to understand customer behavior and optimizes processes to satisfy the customer needs but there is still room for improvement in real-time visualization in the context of business content. In this paper, we describe a graph-based visualization showing the entirety of the website activities at a glance. To increase the tangibility of customer behavior, the graph adapts to the website interactions in real time using smooth transitions from one state to another. Furthermore, we incorporate machine learning in our data integration process to deal with the dynamics of change of website content over time. Finally, we conduct an evaluation in the form of expert interviews revealing that our approach is suitable to optimize digitalized business processes, initiate marketing campaigns, increase the tangibility to the customer, and put a stronger focus on customer needs.
Chapter
Full-text available
Within the sector of education, Learning Analytics (LA) has become an interdisciplinary field aiming to support learners and teachers in their learning process. Most standard tools available for Learning Analytics in Massive Open Online Courses (MOOCs) do not cater to the individual's conception of where Learning Analytics should provide them with insights and important key figures. We propose a prototype of a highly configurable and customizable Learning Analytics Cockpit for MOOC-platforms. The ultimate goal of the cockpit is to support administrators, researchers, and especially teachers in evaluating the engagement of course participants within a MOOC. Furthermore, comparing learner's individual activity to course wide average scores should enhance the self-assessment of students, motivate their participation, and boost completion rates. Therefore, several metrics were defined which represent and aggregate learner's activity. From this predefined list, stakeholders can customize the cockpit by choosing from multiple visualization widgets. Although, the current prototype focuses only on a minimal group of stakeholders, namely administrators and researchers. Therefore, it is designed in a modular, highly configurable and customizable way to ensure future extensibility. It can be strongly carried out that customization is integral to deepen the understanding of Learning Analytic tools and represented metrics, to enhance the student's learning progress.
Article
Full-text available
Modern processors use branch prediction and speculative execution to maximize performance. For example, if the destination of a branch depends on a memory value that is in the process of being read, CPUs will try to guess the destination and attempt to execute ahead. When the memory value finally arrives, the CPU either discards or commits the speculative computation. Speculative logic is unfaithful in how it executes, can access the victim's memory and registers, and can perform operations with measurable side effects. Spectre attacks involve inducing a victim to speculatively perform operations that would not occur during correct program execution and which leak the victim's confidential information via a side channel to the adversary. This paper describes practical attacks that combine methodology from side-channel attacks, fault attacks, and return-oriented programming that can read arbitrary memory from the victim's process. More broadly, the paper shows that speculative execution implementations violate the security assumptions underpinning numerous software security mechanisms, such as operating system process separation, containerization, just-in-time (JIT) compilation, and countermeasures to cache timing and side-channel attacks. These attacks represent a serious threat to actual systems because vulnerable speculative execution capabilities are found in microprocessors from Intel, AMD, and ARM that are used in billions of devices. Although makeshift processor-specific countermeasures are possible in some cases, sound solutions will require fixes to processor designs as well as updates to instruction set architectures (ISAs) to give hardware architects and software developers a common understanding as to what computation state CPU implementations are (and are not) permitted to leak.
Conference Paper
Full-text available
In this paper, we discuss the design, development, and implementation of a Learning Analytics (LA) dashboard in the area of Higher Education (HE). The dashboard meets the demands of the different stakeholders, maximizes the mainstreaming potential and transferability to other contexts, and is developed in the path of Open Source. The research concentrates on developing an appropriate concept to fulfil its objectives and finding a suitable technology stack. Therefore, we determine the capabilities and functionalities of the dashboard for the different stakeholders. This is of significant importance as it identifies which data can be collected, which feedback can be given, and which functionalities are provided. A key approach in the development of the dashboard is the modularity. This leads us to a design with three modules: the data collection, the search and information processing, and the data presentation. Based on these modules, we present the steps of finding a fitting Open Source technology stack for our concept and discuss pros and cons trough out the process.
Chapter
Full-text available
This chapter examines research studies of the last five years and presents the state of the art of Learning Analytics (LA) in the Higher Education (HE) arena. To this end, we used a mixed-method analysis and searched through three popular libraries: the Learning Analytics and Knowledge (LAK) conference, the SpringerLink, and the Web of Science (WOS) databases. We examined a total of 101 papers in depth during our study and are thereby able to present an overview of the different techniques used by the studies and their associated projects. To gain insights into the trend direction of the different projects, we clustered the publications by their stakeholders. Finally, we discuss the limitations of those studies as well as the most promising future lines and challenges. We believe the results of this review may assist universities in launching their own LA projects or improving existing ones.
Conference Paper
Today, Web Analytics (WA) is commonly used to obtain key information about users and their behavior on websites. In addition, with the rise of online learning, Learning Analytics (LA) emerged as a separate research field for collecting and analyzing learners’ interactions on online learning platforms. Although the foundation of both methods is similar, WA has not been used extensively for LA purposes. However, especially large-scale online learning environments may benefit from WA, as it is more sophisticated and well-established in comparison to LA. Therefore, this paper aims to examine to what extent WA can be utilized in this context without compromising the learners’ data privacy. For this purpose, Google Analytics was integrated into the Massive Open Online Course platform of the Hasso Plattner Institute as a proof of concept. It was tested with two deployments of the platform, openHPI and openSAP, where thousands of learners gain academic and industry knowledge about engineering education. In addition to capturing behavioral data, the platforms’ existing LA dashboards were extended by WA metrics. The evaluation of the integration showed that WA covers a large part of the relevant metrics and is particularly suitable for obtaining an overview of the platform’s global activity, but reaches its limitations when it comes to learner-specific metrics.
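The distinction drawn above between platform-level WA metrics and learner-specific LA metrics can be sketched in a few lines of Python. The event records, field names, and metrics below are illustrative assumptions for this sketch, not the actual openHPI/openSAP schema or the Google Analytics API:

```python
from collections import Counter
from datetime import datetime

# Hypothetical raw events: (timestamp, user_id, course_id, action).
events = [
    ("2020-03-02T10:15:00", "u1", "course-a", "page_view"),
    ("2020-03-02T10:17:30", "u2", "course-a", "video_play"),
    ("2020-03-03T09:05:00", "u1", "course-b", "page_view"),
    ("2020-03-03T11:40:00", "u3", "course-a", "page_view"),
]

def activity_per_course(events):
    """Event count per course: a platform-level (WA-style) overview metric."""
    return Counter(course for _, _, course, _ in events)

def active_users_per_day(events):
    """Distinct learners per calendar day: a learner-specific (LA-style) metric."""
    per_day = {}
    for ts, user, _, _ in events:
        day = datetime.fromisoformat(ts).date().isoformat()
        per_day.setdefault(day, set()).add(user)
    return {day: len(users) for day, users in per_day.items()}

print(activity_per_course(events))
print(active_users_per_day(events))
```

The first aggregation needs no user identity at all, which is why off-the-shelf WA handles it well; the second requires tracking individuals over time, which is where the paper locates WA's limitations.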
Article
Interest in how educational data can be used to improve teaching and learning has recently seen unprecedented growth, accompanied by the emergence of the field of learning analytics. In other fields, analytics tools already enable the statistical evaluation of rich data sources and the identification of patterns within the data. These patterns are then used to better predict future events and make informed decisions aimed at improving outcomes (Educause, 2010). This paper reviews the literature related to this emerging field and seeks to define learning analytics, its processes, and its potential to advance teaching and learning in online education.
Conference Paper
Eye tracking hardware and software can be used to analyze and improve websites. If conducting an eye tracking study is too costly, examining mouse movement data can provide insights into user behavior similar to those from eye gaze data. Prior research has shown that eye gaze and mouse cursor position can strongly correlate. The strength of the correlation, however, depends on the design of the website, so it is important to determine whether mouse tracking remains a reliable substitute for eye tracking as new design patterns emerge. Today, low-cost eye tracking solutions are available, enabling a wider audience to conduct their own eye-mouse correlation studies. In this paper, we use The Eye Tribe Eye Tracker and the analysis software EyeProof to find the relationship between eye gaze and mouse position on the Florida Institute of Technology Human-Centered Design Institute website. The results indicate that mouse tracking data may be a suitable substitute for eye tracking data on the studied website and that it may be feasible to use consumer-grade eye tracking products to conduct similar assessments.
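The eye-mouse relationship described above is typically quantified with a correlation coefficient over time-aligned coordinate samples. A minimal Python sketch, using made-up pixel coordinates rather than data from the study:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical horizontal gaze and cursor positions (pixels),
# sampled at the same timestamps.
gaze_x  = [120, 180, 260, 310, 400, 455]
mouse_x = [100, 170, 240, 330, 390, 470]

r = pearson(gaze_x, mouse_x)
print(round(r, 3))  # close to 1.0 would suggest the cursor tracks the gaze
```

A value of r near 1.0 on real recordings is what would justify substituting mouse tracking for eye tracking; in practice the correlation would be computed per page or per design pattern, since the abstract notes it varies with website design.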
Article
This paper presents LISSA ("Learning dashboard for Insights and Support during Study Advice"), a learning analytics dashboard designed, developed, and evaluated in collaboration with study advisers. The overall objective is to facilitate communication between study advisers and students by visualising grade data that is commonly available in any institution. More specifically, the dashboard attempts to support the dialogue between adviser and student through an overview of study progress, peer comparison, and by triggering insights based on facts as a starting point for discussion and argumentation. We report on the iterative design process and evaluation results of a deployment in 97 advising sessions. We have found that the dashboard supports the current adviser-student dialogue, helps them motivate students, triggers conversation and provides tools to add personalisation, depth, and nuance to the advising session. It provides insights at a factual, interpretative, and reflective level and allows both adviser and student to take an active role during the session.
Conference Paper
Multimodal Learning Analytics (MMLA) captures, integrates and analyzes learning traces from different sources in order to obtain a more holistic understanding of the learning process, wherever it happens. MMLA leverages the increasingly widespread availability of diverse sensors, high-frequency data collection technologies and sophisticated machine learning and artificial intelligence techniques. The aim of this workshop is twofold: first, to expose participants to, and develop, different multimodal datasets that reflect how MMLA can bring new insights and opportunities to investigate complex learning processes and environments; second, to collaboratively identify a set of grand challenges for further MMLA research, built upon the foundations of previous workshops on the topic.