A Conceptual Framework linking Learning Design with
Learning Analytics
Aneesha Bakharia, Information Systems School, Queensland University of Technology, Brisbane, Australia, aneesha.bakharia@qut.edu.au
Linda Corrin, Melbourne Centre for the Study of Higher Education, The University of Melbourne, Melbourne, Australia, lcorrin@unimelb.edu.au
Paula de Barba, Melbourne Centre for the Study of Higher Education, The University of Melbourne, Melbourne, Australia, paula.de@unimelb.edu.au
Gregor Kennedy, Melbourne Centre for the Study of Higher Education, The University of Melbourne, Melbourne, Australia, gek@unimelb.edu.au
Dragan Gašević, Moray House School of Education and School of Informatics, University of Edinburgh, Edinburgh, Scotland, dgasevic@acm.org
Raoul Mulder, School of BioSciences, The University of Melbourne, Melbourne, Australia, r.mulder@unimelb.edu.au
David Williams, Department of Physiology, The University of Melbourne, Melbourne, Australia, d.williams@unimelb.edu.au
Shane Dawson, Teaching Innovation Unit, University of South Australia, Adelaide, Australia, shane.dawson@unisa.edu.au
Lori Lockyer, School of Education, Macquarie University, Sydney, Australia, lori.lockyer@mq.edu.au
ABSTRACT
In this paper we present a learning analytics conceptual
framework that supports enquiry-based evaluation of learn-
ing designs. The dimensions of the proposed framework
emerged from a review of existing analytics tools, the analy-
sis of interviews with teachers, and user scenarios to under-
stand what types of analytics would be useful in evaluating
a learning activity in relation to pedagogical intent. The
proposed framework incorporates various types of analyt-
ics, with the teacher playing a key role in bringing context
to the analysis and making decisions on the feedback pro-
vided to students as well as the scaffolding and adaptation
of the learning design. The framework consists of five di-
mensions: temporal analytics, tool-specific analytics, cohort
dynamics, comparative analytics and contingency. Specific
metrics and visualisations are defined for each dimension of
the conceptual framework. Finally the development of a
tool that partially implements the conceptual framework is
discussed.
Categories and Subject Descriptors
K3.2 [Computers & Education]: Computer and Informa-
tion Science Education - computer science education, infor-
mation systems education
Permission to make digital or hard copies of all or part of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full cita-
tion on the first page. Copyrights for components of this work owned by others than
ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or re-
publish, to post on servers or to redistribute to lists, requires prior specific permission
and/or a fee. Request permissions from permissions@acm.org.
LAK ’16, April 25-29, 2016, Edinburgh, United Kingdom
© 2016 ACM. ISBN 978-1-4503-4190-5/16/04...$15.00
DOI: http://dx.doi.org/10.1145/2883851.2883944
Keywords
Learning analytics, Intervention design, Learning design
1. INTRODUCTION
Over recent years, learning analytics has emerged as a
powerful tool for addressing a range of educational chal-
lenges and issues, including concerns over institutional reten-
tion (particularly for underrepresented groups), and continu-
ous improvement of the student learning experience through
personalised learning. However, the vast potential of learn-
ing analytics to influence and mitigate many of these con-
cerns remains essentially untapped in terms of day-to-day
teaching practice. Most analytics studies have drawn on
historical data to identify patterns in students’ learning be-
haviour which are then related to academic performance
and/or retention. Much of this work, however, is lacking in
an understanding of the pedagogical context that influences
student activities, and how identifying patterns in students’
learning behaviours can be used to influence and contribute
to more positive teaching and learning experiences [16, 5].
Essentially there is a knowledge gap for teachers in attempt-
ing to bridge the divide between the information provided
by learning analytics and the types of pedagogical actions
designed by teachers to support student learning. The field
of learning design offers a way to address this gap by help-
ing teachers to articulate the design and intent of learning
activities which can be used as a guide for the interpretation
of learning analytics data.
In this paper we outline a framework that brings together
both learning design and analytics to create more mean-
ingful representations of data for teachers. In so doing we
argue that an understanding of course context is essential to
providing more adaptive and appropriate analytics visualisa-
tions to aid interpretation and translation into direct actions
to support student learning. The framework development is
situated within the context of a cross-institutional learning
analytics study in Australia which investigated the peda-
gogical concerns and needs faced by teachers in their local
contexts, and how learning analytics may usefully provide
actionable evidence that allows them to respond to those
concerns or needs. The interview data collected from teach-
ers that informed the design of the framework will be pre-
sented, as well as a description of the online analytics tool that was
developed to operationalise parts of the framework.
2. BACKGROUND
As the field of learning analytics evolves, the need to
align both analytic approaches and outputs with a concep-
tual frame of reference and educational context has been ac-
knowledged [6, 22]. The field of learning design offers the po-
tential to do this. Lockyer, Agostinho and Bennett (in press)
[14] define learning design as both a process of “creating and
adapting pedagogical ideas” as well as the product of “a for-
malised description of a sequence of learning tasks, resources
and support that a teacher constructs for students for an
entire, or part of, an academic semester”. The field of learn-
ing design allows educators and educational researchers to
articulate how educational contexts, learning tasks, assess-
ment tasks and educational resources are designed to pro-
mote effective interactions between teachers and students,
and students and students, to support learning (see [7, 15]).
Given this, well-articulated learning designs provide clear
insight into teachers’ pedagogical intent behind the learning
activities and assessment tasks they provide to students [16].
These articulated learning designs can, therefore, provide a
critical frame of reference for the interpretation of patterns
of student interactions that are generated by learning an-
alytics techniques. In other words, learning design can be
said to provide “a semantic structure for analytics” [18, pp.
312].
While it makes conceptual sense to argue that articulated
learning designs can provide a "frame of reference" for the in-
terpretation of learning analytics outputs, questions remain
about how this can be achieved in practice. This is compli-
cated by the fact there are several ways such learning designs
can be represented, including the Learning Design Visual Se-
quence [3] and the IMS Learning Design specification [10].
Thus a core challenge for the learning analytics community is
to determine conceptual and practical frameworks that can
link teachers’ enacted practice (i.e., their learning designs)
with data and evidence that emerge from learning analytics
through the use of accessible learning analytics tools. This
needs to be achieved in ways that ultimately are useful in
informing ongoing educational practice.
3. METHODOLOGY
The development of the framework was informed by the
outcomes of a study conducted across three Australian uni-
versities in 2014/2015. The study aimed to develop an online
tool to provide meaningful analytics to teachers to support
teaching and learning. Three main sources of information
were used to help conceptualise and categorise the analytic
needs and wants of teachers. These included a review of
the literature on existing learning analytics tools, interviews
with teaching staff across the three institutions, and spe-
cific user scenarios for each of the courses that were to be
used to pilot the tool. The review of the literature iden-
tified several learning analytics tools that had been devel-
oped to provide learning analytics data to teachers. Each
of these tools was then examined to determine if any theo-
retical/learning design foundation was present, the metrics
used, and the methods of data visualisation employed.
The semi-structured interviews were conducted with teach-
ers across the three participating Australian universities.
The purpose of the interviews was to determine the ways
in which learning analytics could be used to assist teach-
ers to address the fundamental education problems or situ-
ations they face in the delivery of online and blended learn-
ing. The interviews explored the curriculum structures and
learning designs teachers employed within their classes, the
pedagogical problems they faced in their teaching, the ways
in which they used technology-based tools in their teach-
ing, and the role learning analytics could play in addressing
some of their known concerns. The sample consisted of 12
participants, four from each institution, who were involved
in the delivery of courses that used a range of tools within
the Learning Management System (LMS). To ensure cross-
disciplinary representation, the participants were course co-
ordinators from across the arts, professions and science disci-
plines. Table 1 provides an overview of the discipline, num-
ber of enrolled students and tools used in the delivery for
each participant’s course. Once all interviews had been con-
ducted a thematic analysis was performed on the interview
transcripts to identify and group the needs and wants of
teachers interviewed.
In addition to the review of existing analytics tools and
the interviews, user scenarios were developed for each of the
courses that were to be used to pilot the analytics tool. The
case profiles contained more detailed information about the
four pilot courses including: learning outcomes, structure of
lectures/tutorials, assessment details, LMS tools used, the
purpose of these tools within the curriculum, current cur-
riculum evaluation methods, and suggestions for analytics
that could be useful for the course. Additional suggestions
identified through the user scenarios were then added to the
outputs of the literature review and interview analysis. An
important function of the user scenarios was to inform the
prioritisation of development of the web-based analytics tool
components.
The existing analytics tool review, interview analysis, and
user scenarios combined to form the basis for the develop-
ment of the conceptual framework which identifies different
types of learning analytics that could respond directly to the
kinds of issues and concerns that teachers expressed with
the learning contexts, learning designs and learning activi-
ties they used in their online and blended learning environ-
ments. Each dimension of the conceptual framework was
then used to identify the functional requirements of a learn-
ing analytics tool that would support enquiry-based learning
design. The user scenarios were then used to prioritise the
development of the web-based learning analytics tool.
4. RESULTS AND DISCUSSION
In this section, a conceptual framework that links learn-
ing analytics to learning design is presented. The conceptual
framework consists of five dimensions: temporal analytics,
comparative analytics, cohort dynamics, tool specific analyt-
ics, and contingency and intervention support tools.

Table 1: Course details for recruited participants.

In the proposed conceptual framework the teacher plays a crucial
role in bringing context to the analysis and making deci-
sions on the feedback provided to students as well as the
scaffolding and adaptation of the learning design.
4.1 Temporal Analytics
All the interview participants indicated that they valued
the ability to see course, content and tool access statistics
over the duration of the course within the LMS. The abil-
ity to see students’ access to elements within a course (i.e.,
curriculum content and tools) and the duration of students
sessions was identified as important for teachers as they tried
to reconcile how they have structured the overall course and
scheduled key activities within it, with how the students
have chosen to access the content, tasks and assessment
tools within the LMS. In particular, they wanted to iden-
tify course material that was valuable to students and was
being reviewed multiple times. They also wanted to check
whether students viewed or posted in a discussion forum.
Ten of the twelve interviewees indicated a need to review
access statistics at an individual student level to be able to
provide individualised support and/or to deal with student
appeals.
Whilst teachers sequence and schedule content and ac-
tivity according to their learning design, within the LMS
there is no explicit way to represent this pedagogical intent
when implementing content and activities. The LMS allows
content and activities to be added in a hierarchical struc-
ture. Through the interviews and user scenarios it was clear
that most teachers created weekly folders to contain the con-
tent and activities specific to the topic being covered in that
week. Therefore the “week” was defined as the most rele-
vant period for temporal analysis and the lens through which
to review student activity, allowing the teacher to link this
back to the course structure and schedule.
An additional request, also within the temporal dimen-
sion, made by six of the twelve interview participants was
the ability to see course and content access before and after
key instructional events, such as the weekly lecture, tutori-
als, assessment due dates, or the start and finish of a quiz.
As an example, one participant was interested in knowing
whether students were accessing prescribed pre-reading ma-
terial prior to the lecture and tutorial. Five other interview
participants were also interested in seeing resource access
(i.e., slides, notes, video recordings) prior to a quiz, and ob-
serving the impact this had on student grades. Temporal
“events” specified by the participants fell into the following
three categories:
Recurring Events
Events that occurred at the same time each week such
as a tutorial or a lecture.
Submission Events
Events which included the due dates for the submission
of assessment items and the dates on which an online
quiz was made available to students.
Single Events
Events that only occurred once in the semester, such
as a guest lecture or field trip.
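To make these categories concrete, a minimal sketch follows (hypothetical names; not code from the Loop tool) showing how the three event types could be represented and how a recurring event could be expanded into the dated markers a timeline visualisation would overlay.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum


class EventType(Enum):
    RECURRING = "recurring"    # e.g., weekly lecture or tutorial
    SUBMISSION = "submission"  # e.g., assignment due date, quiz window
    SINGLE = "single"          # e.g., guest lecture, field trip


@dataclass
class Event:
    label: str
    event_type: EventType
    when: date


def expand_recurring(label, first_occurrence, weeks):
    """Expand a weekly recurring event into one dated marker per week."""
    return [Event(label, EventType.RECURRING, first_occurrence + timedelta(weeks=w))
            for w in range(weeks)]


# Example: markers a timeline visualisation could overlay on access counts.
markers = expand_recurring("Lecture", date(2016, 3, 7), weeks=12)
markers.append(Event("Assignment 1 due", EventType.SUBMISSION, date(2016, 4, 15)))
markers.append(Event("Guest lecture", EventType.SINGLE, date(2016, 5, 2)))
```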
4.2 Comparative Analytics
Comparative analytics allow the teacher to observe pat-
terns or relations between two or more aspects of a course.
Each of the interview and user scenario participants requested
to be able to see how scheduled learning activities impacted
on student participation (and thereby gain some insight into
whether activities were achieving the desired learning design
objective). In addition, nine of the interview participants
wanted to be able to compare these activities and levels
of participation with each other over time. This type of
comparative analysis is only possible with a clear knowledge
of the course structure and activity scheduling within the
LMS. By being able to comparatively review activities and
students’ participation in them, activities can be identified
that may be candidates for redesign, for a greater level of
student scaffolding, or for other forms of intervention.
Comparative analytics in this context is defined as the
provision of analytics in a form that enables the teacher to
compare different types of activities that may occur within
the same time period as well as the same types of activities
occurring over different time periods. Examples include:
- the ability to compare access to content, communication and assessment tools over the duration of a week or the whole semester;
- the ability to compare access to each content, communication and assessment item by week;
- the ability to compare each student's course access by week.
Comparative analytics is not restricted to the temporal
domain, and is equally applicable to social network, content
or discourse analyses. For example, a social network dia-
gram that describes the online discussion flows of a single
small group could be the subject of comparative analytics,
showing how the involvement of the group members shifts
and changes over time and in relation to other components
of the course. In this way, comparative analytics provides a
lens through which the structure and sequence of designed
activities within the curriculum, implemented through the
use of LMS tools, can be evaluated.
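As an illustration of what such a week-by-week comparison might look like in practice, the sketch below assumes a simple access log with user, item-category and timestamp columns (the column names and data are illustrative, not drawn from the study) and pivots it into a weeks-by-category table of access counts.

```python
import pandas as pd

# Illustrative access log: one row per page view.
log = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 2, 1],
    "category": ["content", "communication", "content",
                 "assessment", "content", "assessment"],
    "timestamp": pd.to_datetime([
        "2016-03-07", "2016-03-09", "2016-03-10",
        "2016-03-15", "2016-03-16", "2016-03-21"]),
})

# Label each access with the teaching week it falls in (week 1 starts 2016-03-07).
start = pd.Timestamp("2016-03-07")
log["week"] = ((log["timestamp"] - start).dt.days // 7) + 1

# Weeks x item-category table of access counts, e.g. to compare
# communication vs. assessment activity across the semester.
by_week = log.pivot_table(index="week", columns="category",
                          values="user_id", aggfunc="count", fill_value=0)
print(by_week)
```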
4.3 Cohort Dynamics
Seven of the participants in the interviews expressed the
desire to view which students had accessed or not accessed
a specific item of content or tool. Similar to the request
for course access statistics (as described in the Temporal
Analytics section), this requirement on the surface seems
relatively simple. However, while LMS reports provide sim-
ple access logs, typically they require extensive processing
to be easily used and interpreted. Moreover, teachers would
be required to drill-down and extensively filter data to gain
insight into individual students' patterns of access and usage
for content items. All of the interview and user scenario par-
ticipants requested the ability to view tool specific metrics
displayed by student. For example, they wanted the abil-
ity to view the number of quiz attempts, access to lecture
recordings and forum posts made by a particular student.
The ability to view students’ access to course items also
relates to the ability to identify student pathways and the
impact their patterns of activity may have had on learning
outcomes. While the request for access to view students
who had both accessed and not accessed course content and
tools could be regarded as relatively simple, several partic-
ipants requested the ability to analyse the access pathways
that relate access patterns to student performance on assess-
ment. As an example, one participant requested the ability
to compare the performance of students who attended lec-
tures and accessed online lecture recordings with students
who only viewed the lecture recording. There was an ex-
pectation that there would be common access patterns for
particular groups of students and that identifying these stu-
dent groups would allow teachers to better understand the
cohort dynamics. Another example of this was a request to
compare the access patterns of successful and unsuccessful
students, based on grades, to identify differences and advise
the underperforming students on more effective approaches
to study.
An understanding of the cohort dynamics can be helpful
in determining how different groups of students interact and
engage with overarching curriculum structures and the spe-
cific learning designs of elements of the course (e.g., content,
assessments, discussion). If different types of interaction
patterns manifested for different groups of students, these
may result in different learning outcomes. The implica-
tion is that the learning design may be more “accessible” by
particular groups of students, resulting in different levels of
success. This could help to highlight where learning activ-
ities may need to be scaffolded or interventions may need
to be considered for particular cohorts. Cohort dynamics
may also differ between course offerings with each semester
bringing a different student dynamic that must be under-
stood in real time in order for learning designs to be adapted
to better meet student requirements [20].
4.4 Tool Specific Analytics
While temporal analytics relate to all content and tools
implemented in the LMS course sites, interview and user
scenario participants also identified the need for analytics
that were specifically tailored to the particular LMS tools
they were using. Simple requests included the analysis of
quiz scores, quiz attempts, and counts for discussion forum
posts. More advanced analytics were requested for quizzes,
such as, a need for quiz item analysis that can be more
easily interpreted than what the LMS currently provides.
The ability to map questions to concepts and provide ag-
gregate reporting across these concepts was also requested.
One participant suggested analytics based on content analy-
sis would be useful to identify the topics that students were
exploring in a discussion forum. Six of the interview
participants were interested in visualisations able to show
the networks that were forming within collaborative activi-
ties involving forums. Sociograms, as illustrated by Lockyer,
Heathcote and Dawson [16], provide a way to identify devi-
ations between observed and anticipated interactions based
on the learning design. There was also a need identified by
four of the interview participants for the analysis of learning
activities occurring outside the LMS with participants using
a variety of social media platforms (e.g., Facebook, Twitter,
blogs). Interview participants who utilise lecture recordings
or media resources in their learning design were interested
in knowing whether the media was streamed or downloaded,
and what portions were being played.
Tool specific analytics can clearly be used in conjunction
with comparative analytics and cohort dynamics. In partic-
ular, there is potential to compare an emerging social net-
work by week and include any specific tool related metrics as
features to a clustering algorithm. The addition of tool spe-
cific metrics would allow additional types of student groups
based upon similarity to be discovered.
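As a rough sketch of how such a sociogram-style analysis could be computed (the reply data and field names are hypothetical, not taken from the study data), a forum reply network can be built and summarised with networkx.

```python
import networkx as nx

# Hypothetical forum data: (post author, author of the post being replied to).
replies = [("alice", "bob"), ("carol", "bob"), ("bob", "alice"),
           ("dave", "carol"), ("alice", "carol")]

G = nx.DiGraph()
G.add_edges_from(replies)

# Simple participation metrics that could feed a sociogram, or serve as
# features for a clustering algorithm alongside other tool-specific metrics.
in_degree = dict(G.in_degree())      # how often each student is replied to
centrality = nx.degree_centrality(G)

for student in G.nodes:
    print(student, in_degree.get(student, 0), round(centrality[student], 2))
```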
4.5 Contingency and Intervention
Support Tools
Eleven of the twelve interview participants highlighted the
value of identifying and intervening when students were de-
termined to be potentially “at risk” (because they did not
access crucial content or achieve a pass score on a quiz or
assessment). In most cases, the intervention proposed was
sending an email to identified students alerting them to the
fact they were falling behind and providing them with advice
on the areas on which they should concentrate. Participants
indicated that currently the identification and selection of
students who were at risk was often a labour intensive and
manual task. Groups of students were often determined by
filtering a spreadsheet of data based on quiz scores, and then
emailing specific feedback and guidance to students whose
score fell within a certain range.
The contingency dimension seeks to address this issue by
providing tools to help teachers identify and select an indi-
vidual, a group, or multiple groups of students - based on
some determined parameters - for the purpose of providing
appropriate intervention and guidance. The contingency di-
mension is associated with the cohort dynamics dimension,
in that patterns established on the basis of student similarity
discovery algorithms will allow teachers to effectively select
and identify student groups for some kind of intervention.
Contingency is, however, seen as an outcome of cohort dy-
namics - only after the teacher has understood why students
in a particular group or cohort were similar (e.g., they may
not have participated in key activities or grasped a core con-
cept) can he or she provide appropriate advice to both sup-
port students’ understanding and scaffold their approaches
to learning activities.
Contingency also requires a productivity element, in that
tools to facilitate communication with students are needed:
in particular, the ability to select and email multiple students,
or to send template messages with custom fields that person-
alise emails with student details and scores (i.e., mail merge
features). The provision of tools that simplify workload in
terms of performing interventions is a necessity, potentially
increasing the chance for teachers to adopt the use of tech-
nology in their teaching practice [8].
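A minimal sketch of this kind of contingency support follows, assuming a flat record of quiz scores and a plain-text message template; the names, pass threshold and template fields are illustrative rather than drawn from any of the pilot courses.

```python
from string import Template

students = [
    {"name": "Alex", "email": "alex@example.edu", "quiz_score": 42},
    {"name": "Sam", "email": "sam@example.edu", "quiz_score": 78},
    {"name": "Jo", "email": "jo@example.edu", "quiz_score": 35},
]

# Select the group to contact: students below an assumed pass threshold.
at_risk = [s for s in students if s["quiz_score"] < 50]

# Mail-merge style template with custom fields for name and score.
template = Template(
    "Dear $name,\n\n"
    "Your score on this week's quiz was $quiz_score%. Please review the "
    "readings for this topic and attempt the practice questions before the tutorial.\n"
)

for s in at_risk:
    body = template.substitute(name=s["name"], quiz_score=s["quiz_score"])
    # In a real tool this would be handed to an email backend; here we just print.
    print("To:", s["email"])
    print(body)
```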
5. THE LEARNING ANALYTICS FOR
LEARNING DESIGN CONCEPTUAL
FRAMEWORK
Figure 1 illustrates the proposed Learning Analytics for
Learning Design Conceptual Framework. The proposed con-
ceptual framework aims to transform learning design into a
teacher-led enquiry-based practice. In the framework, the
teacher plays a central role in bringing contextual knowl-
edge to the review and analysis of the learning analytics
and then in making decisions in relation to contingency.
The framework incorporates the different analytical di-
mensions that emerged from the review of existing learning
analytics tools, thematic analysis of the semi-structured in-
terviews conducted with the study participants and the user
scenarios, namely temporal analytics, tool specific analyt-
ics and cohort dynamics. Each analytical dimension can
be reviewed and analysed through a comparative approach
by the teacher. Comparative analytics and related support
visualisations are relevant across all other analytical dimen-
sions and contribute to the teacher’s ability to make sense
of the data. Comparative analysis therefore provides the
lens through which the teacher evaluates the learning de-
sign implementation within the LMS in direct relation to the
activity scheduling and sequencing dictated by the course
design.
The teacher plays a key role within the proposed concep-
tual framework in bringing teaching and learning context to
the analysis. Firstly, the teacher must use their tacit domain
knowledge and understanding of the macro (i.e., the course
structure and higher level curriculum design) and micro (i.e.,
implementation of activities within the LMS) level learning
designs while analysing and reviewing the learning analyt-
ics and visualisations. This view is supported by Lockyer,
Heathcote and Dawson [16], who state that “The interpre-
tation of visualizations also depends heavily on an under-
standing of the context in which the data were collected and
the goals of the teacher regarding in-class interaction.” (p.
1446). Secondly, the teacher uses the insight gained from
the analytics and contextual knowledge to make decisions
on improving the delivery of the learning objectives, thereby
adapting the learning design (i.e., contingency). Persico and
Pozzi [20] see the teacher drawing analogies with similar ac-
tivities and carrying out comparative evaluation. Pardo, El-
lis, and Calvo [19] argue that analysis of the digital footprint
cannot alone lead to informed learning activity redesign and
that qualitative data on why the students have engaged in
different ways leads to improved interpretation. The teacher
is therefore crucial in being able to collate the required qual-
itative data and incorporate these findings in the learning
design adaptation decision-making process.
In the proposed conceptual framework, contingency oc-
curs as the output and is the result of learning design deci-
sions made by the teacher. Contingency can take the form of
restructured or scaffolded learning activities or recommen-
dations and feedback provided to distinct student groups.
Contingency requires the teacher to understand the differ-
ent types of analytics and interpret the patterns emerging
from the cohort dynamics dimension. Contingency also re-
lies on the availability of tools to identify, select, filter and
communicate feedback to students.
The cohort dynamics dimension needs to be supported
by algorithms that are able to discover and provide inter-
pretable student usage and similarity patterns to the teacher.
Yi, et al. [24], define pattern detection as a “means to find
specific distributions, trends, frequencies, outliers, or struc-
ture in the dataset” and go on to say that by using pattern
detection, “people may not only find what they have been
looking for but also discover new knowledge that they did
not expect to find”. Manual pattern discovery may however
be a difficult and time consuming task, given the number of
features to be evaluated. There are two types of machine
learning algorithms that are suitable for inclusion in the co-
hort dynamics dimension:
Sequential Pattern Mining
Sequential pattern mining algorithms are able to find
not only what content and/or tools students are ac-
cessing but also whether there was an order in the way
that groups of students accessed the content and/or
tools [21]. The page flow visualisation [1] represents
a useful way to visualise the output of a sequential
mining algorithm.
Figure 1: The Learning Analytics for Learning Design Conceptual Framework
Unsupervised Clustering
Unsupervised clustering algorithms are capable of find-
ing similar students based on what the student ac-
cessed, when they accessed the items, and any other
metrics such as quiz scores, forum contributions, and
forum post vocabulary. Example algorithms include
k-means clustering, dimension reduction, non-negative
matrix factorisation and nearest-neighbour classifiers.
These algorithms are commonly used in the educa-
tional data mining and learning analytics literature for
student profiling [13, 17].
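As an illustration of the unsupervised clustering route (a sketch only, not the Loop tool's implementation; the feature values and the choice of two clusters are arbitrary), per-student weekly metrics could be standardised and grouped with k-means.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows = students; columns = weekly content accesses (weeks 1-3),
# forum posts, and quiz score (illustrative values).
features = np.array([
    [12, 10, 14, 5, 80],
    [ 2,  1,  0, 0, 40],
    [11,  9, 13, 4, 75],
    [ 3,  0,  1, 1, 45],
    [ 8,  7,  9, 2, 65],
], dtype=float)

# Standardise so that access counts and scores are on a comparable scale.
X = StandardScaler().fit_transform(features)

# Group students into two candidate cohorts for the teacher to interpret.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```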
In Table 2, the specific types of analytics, metrics and
visualisations that relate to each of the dimensions in the
framework are included. Table 2 can be viewed as a blueprint
for the types of analytics that need to be included in order
to use learning analytics to inform enquiry-based learning
design improvements.
6. THE LOOP TOOL - A REFERENCE IM-
PLEMENTATION OF THE PROPOSED
CONCEPTUAL FRAMEWORK
The open source Loop tool is the reference implementation
for the Learning Analytics for Learning Design Conceptual
Framework (for more detail on the components of the tool
see [4]). The Loop tool is programmed in Python and uses
the Django web application framework. The Loop tool is
made up of two components: (1) a data warehouse and (2)
a web interface to display metrics and visualisations. The
Loop tool supports the Blackboard and Moodle Learning
Management Systems and is a partial implementation of the
proposed conceptual framework. The Loop tool currently
implements the temporal and comparative dimensions. The
cohort dynamics, tool specific and contingency dimensions
are currently being implemented.
6.1 Course Data Preprocessing
In order to perform the access-related analytics by person
(i.e., teacher, tutor or student), by week, and by content
item or tool, both a course access log and the course struc-
ture are required. In older versions of Moodle, the course
export format contained both the logs and the hierarchi-
cal course structure manifest. In more recent versions of
Moodle, the course export zip file and a csv export from
the log tracking database table need to be processed. For
Blackboard courses the IMS-CP archive format [9] (which
includes forum posts) and an export of the Accumulator
database table need to be processed. The IMS-CP archive
format contains the manifest file in XML format that pro-
vides the course structure hierarchy.
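Manifest layouts differ between LMS exports, but as a rough sketch (assuming the standard imsmanifest.xml file with nested <item> elements, as in IMS-CP; everything else is illustrative), the course structure hierarchy could be walked with the standard library XML parser.

```python
import xml.etree.ElementTree as ET


def walk_items(item, depth=0, ns=""):
    """Recursively print the course structure hierarchy from an IMS-CP manifest."""
    title_el = item.find(f"{ns}title")
    title = title_el.text if title_el is not None else item.get("identifier", "")
    print("  " * depth + title)
    for child in item.findall(f"{ns}item"):
        walk_items(child, depth + 1, ns)


tree = ET.parse("imsmanifest.xml")  # manifest extracted from the course archive
root = tree.getroot()

# IMS-CP manifests are namespaced; reuse the root element's namespace if present.
ns = root.tag.split("}")[0] + "}" if "}" in root.tag else ""

for organization in root.iter(f"{ns}organization"):
    for item in organization.findall(f"{ns}item"):
        walk_items(item, ns=ns)
```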
Within the data warehouse, multidimensional cubes are
created using a star schema architecture [2] commonly found
in Business Intelligence applications. The star schema in-
cludes a fact table (i.e., a table with a timestamped entry
for each course item accessed) with related dimension tables
that store the dates (i.e., a table containing the day,
month, year and day of week), the course items by type and
the users (students and teaching staff). The star schema
allows aggregates calculated by week, tool and
student to be stored. An additional categorisation for course
items is performed to group items for analysis based upon
whether the item is a content item, a communication tool
or an assessment tool.
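A stripped-down sketch of the star schema idea follows; the table and column names are illustrative rather than the Loop tool's actual schema. It shows a fact table of timestamped accesses joined to date, item and user dimensions, with the kind of weekly aggregate query the dashboards rely on (the tables would be populated from the processed LMS export).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day INTEGER, month INTEGER,
                       year INTEGER, day_of_week INTEGER, week INTEGER);
CREATE TABLE dim_item (item_id INTEGER PRIMARY KEY, title TEXT,
                       category TEXT);  -- content, communication or assessment
CREATE TABLE dim_user (user_id INTEGER PRIMARY KEY, name TEXT, role TEXT);
CREATE TABLE fact_access (access_id INTEGER PRIMARY KEY, date_id INTEGER,
                          item_id INTEGER, user_id INTEGER, ts TEXT);
""")

# Weekly access counts per item category: the kind of aggregate that
# the weekly dashboards are built on.
query = """
SELECT d.week, i.category, COUNT(*) AS accesses
FROM fact_access f
JOIN dim_date d ON f.date_id = d.date_id
JOIN dim_item i ON f.item_id = i.item_id
GROUP BY d.week, i.category
ORDER BY d.week;
"""
for row in conn.execute(query):
    print(row)
```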
6.2 Dashboards
The Loop tool provides a dashboard for each week in a
semester. The "week" was determined, through the interviews and user scenarios, to be a good period of activity measurement in terms of the way topics and activities are designed within higher education.
Table 2: Metrics and Visualisations for the Conceptual Framework

Temporal Analysis
- Access by day of week and total access by week
- Access per course structure item
- Session duration and average session duration
- Unique page views
- Ability to visualise the impact of assessment and other events on activity
- Types of analytics/visualisation: timelines; event markers for recurring, submission and once-off events

Tool Specific Analysis
- The inclusion of metrics, analytics and visualisations specific to the type of LMS and social media tools being used
- Quizzes: quiz scores; number of quiz attempts; quiz item analysis
- Forums: number of forum posts; topics being discussed (i.e., topic modelling); social network analysis; discourse analysis

Cohort Dynamics/Patterns
- Inclusion of weekly metrics, overall semester metrics and tool specific metrics as features for pattern discovery algorithms
- Analysis of student sequential access for patterns
- Session duration and average session duration
- Types of analytics/visualisation: finding students who accessed content/tools and those who did not; clustering student features by week and across the whole of the course; allowing teachers to search for similar students; allowing teachers to search for students with specific attributes (e.g., quiz score lower than a threshold)

Comparative Analysis
- Need to compare the impact of different learning activities
- Week by week comparison of content and tool access
- Week by week comparison of student course content item access
- Comparison of access and usage of content, communication and assessment within the LMS course
- Comparative analysis with previous cohorts (i.e., group dynamics might change and the learning design needs to adapt to it)
- Types of analytics/visualisation: timelines and event markers; correlations between activities and measures of engagement (e.g., correlation between communication tool access and session time or quiz scores); display of an expandable hierarchical course structure tree with column counts for each week

Contingency and Decision Support Tools
- Inclusion of weekly metrics, overall semester metrics and tool specific metrics as features for pattern discovery algorithms
- Ability for teachers to easily recommend strategies and provide feedback to students
- Types of contingency tools: finding students who accessed content/tools and those who did not; allowing teachers to search for similar students; allowing teachers to search for students with specific attributes (e.g., quiz score lower than a threshold); allowing teachers to email groups of students using templates (e.g., mail merge)
Each weekly dashboard
includes content, communication tool and assessment tool
access graphs for each day of the week. Recurring, sub-
mission and single events can also be defined. These are
displayed on the timeline graphs to allow teachers to see the
pre and post event course access changes.
Figure 2 shows the weekly dashboard (temporal dimen-
sion). Summary statistics for session duration and average
session length are included as sparklines. Top users and
items accessed during the week are also displayed on the
dashboard.
The Loop tool uses bubble charts to illustrate the percent-
age of activity that has occurred before and after an event
(Figure 3). The teacher is able to define multiple events
which can then be selected to produce the pre and
post event visualisation.
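Conceptually, the pre/post split behind this visualisation reduces to a simple proportion. The sketch below uses made-up access timestamps and an assumed three-day window around the event to compute the percentage of access before and after it.

```python
import pandas as pd

accesses = pd.to_datetime([
    "2016-04-12 09:00", "2016-04-13 14:30", "2016-04-14 08:10",
    "2016-04-15 19:45", "2016-04-16 11:20",
])
event = pd.Timestamp("2016-04-14 12:00")  # e.g., the time a quiz opens

# Only consider activity in a window around the event (here +/- 3 days).
window = pd.Series(accesses)
window = window[(window >= event - pd.Timedelta(days=3)) &
                (window <= event + pd.Timedelta(days=3))]

before = (window < event).mean() * 100
after = 100 - before
print(f"{before:.0f}% of access before the event, {after:.0f}% after")
```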
6.3 Comparative Visualisations
A tree table is used to display hierarchical course content
access by week (see Figure 4). The inclusion of the tree table
visualisation was inspired by Loco-Analyst, a tool identified
in the literature review which integrated analytics with the
course structure [11]. Weekly forum post counts, assessment
item attempts and average scores are also shown by week.
6.4 Contingency
For each course item a teacher is able to see the students
who accessed the item and those who have not accessed the
item. Filtering by date is also provided. This is the only
basic form of contingency that the Loop tool currently in-
cludes. In future versions of the Loop tool, the functionality
for teachers to easily identify similar groups of students as
well as recommend strategies and provide feedback to stu-
dents will be provided.
6.5 Individual Course Item and Student Views
The Loop tool allows teachers to drill down to view indi-
vidual student access statistics and specific tool metrics (i.e.,
temporal and comparative dimensions). Figure 5 shows a
timeline of total student access for a specific course item.
Assessment attempts, scores and number of forum posts are
also included (i.e., tool specific dimension).
6.6 Future Directions
This paper has detailed the progress made in Phases 1
and 2 of an Australian Government Office of Learning and
Teaching funded project called “Completing the Loop” [12].
The first two phases involved collection of data to inform
the development of the framework and design of the tool
(as outlined above in the methodology section). Phase 3 of
the project has commenced and a trial of the Loop tool is
underway. Results from the trial using courses from Black-
board and Moodle will be used to validate and extend the
proposed conceptual framework.
Additional functionality that supports the cohort dynam-
ics and contingency dimensions is currently being inves-
tigated for inclusion in the Loop tool. Figure 6 shows the
result of using the t-SNE dimension reduction algorithm [23]
to cluster students using weekly course metrics (i.e.,
content access, communication tool access, assessment at-
tempts and scores, forum posts and session duration in each
week within the semester) as features. The aim is to pro-
vide visualisations that allow teachers to identify clusters of
students and then provide insight on why students in each
cluster are similar to help teachers interpret student groups
and provide appropriate feedback and intervention.
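A hedged sketch of this t-SNE step is shown below; synthetic random counts stand in for the weekly course metrics, and the perplexity value is an arbitrary choice that simply has to be smaller than the number of students.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Synthetic stand-in for per-student weekly features: 40 students x
# (12 weeks x content access, communication access, assessment activity).
weekly_features = rng.poisson(lam=5, size=(40, 36)).astype(float)

# Project to 2D for plotting; each point is a student, and nearby points
# are students with similar weekly activity profiles.
embedding = TSNE(n_components=2, perplexity=10,
                 random_state=0).fit_transform(weekly_features)
print(embedding.shape)  # (40, 2)
```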
7. CONCLUSION
In this paper, we have proposed a learning analytics con-
ceptual framework for learning design. The framework was
informed by an understanding of the types of learning an-
alytics that would be useful to support the evaluation of
learning designs. While clear dimensions for the types of
analytics required emerged from interviews with teachers, it
was evident that the teacher played crucial roles in bringing
the learning and teaching context into the interpretation of
the analytics and also in making decisions based upon the
analytics. Five main types of analytics, namely temporal,
comparative, tool specific, cohort dynamics and contingency,
were identified.
The proposed framework makes a useful theoretical contri-
bution, bridging the gap between learning design and learn-
ing analytics while establishing a platform to support enquiry-
based evaluation and scaffolding of learning activities. The
utility of the proposed framework, however, lies in its abil-
ity to direct the types of analytics and contingency support
tools that are essential to support the learning design pro-
cess. As illustrated in this paper, each dimension in the con-
ceptual framework leads to clear analytics and visualisation
requirements, which in turn were able to be implemented
within the Loop tool. The Loop tool is currently being
trialled and enhanced to incorporate analytics and tools for the
cohort dynamics and contingency dimensions. The evalua-
tion results of the Loop tool will be used to further develop,
refine and validate the proposed framework.
8. ACKNOWLEDGEMENTS
Support for this project has been provided by the Aus-
tralian Government Office for Learning and Teaching. The
views in this project do not necessarily reflect the views of
the Australian Government Office for Learning and Teach-
ing. The study discussed in this paper was approved by the
University of Melbourne’s Melbourne Graduate School of
Education Human Ethics Advisory Group (MGSE HEAG),
approval number 1339454.
Figure 2: Weekly Course dashboard included in the Loop Tool (Temporal dimension): A = Daily page views; B = Critical
learning events; C = Week metrics; D = Top accessed content.
Figure 3: Individual course item access (Temporal dimen-
sion)
Figure 4: A tree table visualisation (Comparative dimen-
sion)
Figure 5: Individual course item access (Temporal dimen-
sion)
Figure 6: Student groups discovered by t-SNE dimension reduction
9. REFERENCES
[1] Introducing flow visualization: visualizing visitor flow.
http://analytics.blogspot.com.au/2011/10/
introducing-flow-visualization.html. Accessed:
2015-10-12.
[2] Star schema.
https://en.wikipedia.org/wiki/Star_schema. Accessed:
2015-10-12.
[3] S. Agostinho, B. M. Harper, R. Oliver, J. Hedberg,
and S. Wills. A visual learning design representation
to facilitate dissemination and reuse of innovative
pedagogical strategies in university teaching. 2008.
[4] L. Corrin, G. Kennedy, P. de Barba, A. Bakharia,
L. Lockyer, D. Gasevic, D. Williams, S. Dawson, and
S. Copeland. Loop: A learning analytics tool to
provide teachers with useful data visualisations. In
Globally connected, digitally enabled. Proceedings
ascilite Perth 2015, pages 409–413, 2015.
[5] D. Gašević, S. Dawson, T. Rogers, and D. Gasevic.
Learning analytics should not promote one size fits all:
The effects of instructional conditions in predicting
academic success. The Internet and Higher Education,
2016.
[6] D. Gašević, S. Dawson, and G. Siemens. Let's not
forget: Learning analytics are about learning.
TechTrends, 59(1):64–71, 2015.
[7] P. Goodyear. Teaching, technology and educational
design: The architecture of productive learning
environments. The Australian Learning and Teaching
Council, 2009.
[8] M. S.-J. Gregory and J. M. Lodge. Academic
workload: the silent barrier to the implementation of
technology-enhanced learning strategies in higher
education. Distance Education, pages 1–21, 2015.
[9] IMS Global Learning Consortium. Content packaging
specification.
https://www.imsglobal.org/content/packaging.
Accessed: 2015-10-12.
[10] IMS Global Learning Consortium. Learning design
specification.
http://www.imsglobal.org/learningdesign. Accessed:
2015-10-12.
[11] J. Jovanovic, D. Gasevic, C. Brooks, V. Devedzic,
M. Hatala, T. Eap, and G. Richards. Loco-analyst:
semantic web technologies in learning content usage
analysis. International journal of continuing
engineering education and life long learning,
18(1):54–76, 2008.
[12] G. Kennedy, L. Corrin, L. Lockyer, S. Dawson,
D. Williams, R. Mulder, S. Khamis, and S. Copeland.
Completing the loop: returning learning analytics to
teachers. In Rhetoric and Reality: Critical perspectives
on educational technology. Proceedings ascilite
Dunedin 2014, pages 436–440, 2014.
[13] V. Kovanović, D. Gašević, S. Joksimović, M. Hatala,
and O. Adesope. Analytics of communities of inquiry:
Effects of learning technology use on cognitive
presence in asynchronous online discussions. The
Internet and Higher Education, 27:74–89, 2015.
[14] L. Lockyer, S. Agostinho, and S. Bennett. Design for
e-learning. In C. Haythornthwaite, R. Andrews,
J. Fransman, and E. M. Meyers, editors, The SAGE
Handbook of E-learning Research. Sage, Thousand Oaks, CA, in
press.
[15] L. Lockyer, S. Bennett, S. Agostinho, and B. Harper.
Handbook of research on learning design and learning
objects: issues, applications, and technologies (2
volumes). IGI Global, Hershey, PA, 2009.
[16] L. Lockyer, E. Heathcote, and S. Dawson. Informing
pedagogical action: Aligning learning analytics with
learning design. American Behavioral Scientist,
57(10):1439–1459, 2013.
[17] G. Lust, J. Elen, and G. Clarebout. Regulation of
tool-use within a blended course: Student differences
and performance effects. Computers & Education,
60(1):385–395, 2013.
[18] Y. Mor, R. Ferguson, and B. Wasson. Editorial:
Learning design, teacher inquiry into student learning
and learning analytics: A call for action. British
Journal of Educational Technology, 46(2):221–229,
2015.
[19] A. Pardo, R. A. Ellis, and R. A. Calvo. Combining
observational and experiential data to inform the
redesign of learning activities. In Proceedings of the
Fifth International Conference on Learning Analytics
And Knowledge, pages 305–309. ACM, 2015.
[20] D. Persico and F. Pozzi. Informing learning design
with learning analytics to improve teacher inquiry.
British Journal of Educational Technology,
46(2):230–248, 2015.
[21] P. Reimann, L. Markauskaite, and M. Bannert.
e-research and learning theory: What do sequence and
process mining methods contribute? British Journal
of Educational Technology, 45(3):528–540, 2014.
[22] B. Rienties, L. Toetenel, and A. Bryan. Scaling up
learning design: impact of learning design activities on
lms behavior and performance. In Proceedings of the
Fifth International Conference on Learning Analytics
And Knowledge, pages 315–319. ACM, 2015.
[23] L. van der Maaten and G. Hinton. Visualizing data
using t-SNE. Journal of Machine Learning Research,
9:2579–2605, 2008.
[24] J. S. Yi, Y. Kang, J. T. Stasko, and J. A. Jacko.
Understanding and characterizing insights: how do
people gain insights using information visualization?
In Proceedings of the 2008 Workshop on BEyond time
and errors: novel evaLuation methods for Information
Visualization, page 4. ACM, 2008.
... The first section of the Cohort Snapshot lists the most common programs that the students in the unit are enrolled in and the corresponding progress rate (Figure 1). This allows academics to ensure that resources are relevant to create an inclusive environment and increasing engagement (Bakharia et al., 2016;Sanger & Gleason, 2020). Knowing who is enrolled in a unit is of particular importance for foundational units where you have large cohorts of students from many programs, with some programs having higher progress rates than others as seen in Figure 1. ...
Full-text available
Article
It is known that due to the large, diverse sets of data captured in learning analytics, clear display of information is crucial to its success. Here, we describe a concise learning analytics resource, the Cohort Snapshot, that has been developed using a human centred approach, with input from experienced academics from across the institution. Meaningful data is presented as a unit level summary and includes program enrolment, student demographics, grade distributions and LMS activity. The data that was used to develop the resource was accessed via the university’s data warehouse which was synthesised using the R programming language. The Cohort Snapshot provides teaching academics, including sessional staff, access to data, to allow the adaption of teaching pedagogy to meet the needs of increasingly diverse student cohorts in a regional Australian university.
... Designing courses requires educators to define learning objectives and pedagogical intent to achieve desired learning outcomes (Mangaroska & Giannakos, 2019). From the learner's perspective, a well-designed course provides insights into the instructors' pedagogical expectations and favourable conditions for effective learning, contributing to learner satisfaction (Bakharia et al., 2016;Lockyer et al., 2013;O'Mahony et al., 2012;Martin et al., 2019, Matcha, et al., 2020. As such, course designers and instructors spend a considerable amount of time developing and re-developing online courses to provide high-quality learning and teaching experiences. ...
Full-text available
Article
Well-designed online courses enhance learning experiences and allow effective development of learners' skills and knowledge. A critical factor contributing to the design of online courses in the higher education settings are well-defined learning objectives that align with course assessments and learning activities. While there are several introspective instruments to evaluate course designs, with the broader adoption of educational technologies and digital tools, there is a wealth of data that offers insights on the alignment of learning objectives to assessments. Such data has paved the way for evidence-based methods of investigating course effectiveness within higher education. This study outlines a methodology for designing and evaluating the alignment between course learning objectives and assessment activities at scale, utilising a combination of learning analytics and measurement theory approaches, more specificially exploratory multi-dimensional item response theory (MIRT) models. We demonstrate the proposed methodology within a professional development MOOC on leadership skills development, where we evaluate the alignemnt between course objectives and reflective writing assessments activities. Our results suggested that the alignment of the existing course objectives to assessment activities can be improved, showing the practical value of the proposed approach. The theoretical and practical implications of this research are further illustrated.
... LA applications can track the learning behaviors for cognitive, metacognitive and psychomotor learning tasks (Mor et al., 2015;Park et al., 2017). Nevertheless, in all these LA techniques and procedures, clear guidelines for aligning the collected data with the pedagogical models and acquiring substantial results are still deficient (Bakharia et al., 2016;Macfadyen et al., 2020). More specifically, LA can track a large amount of data relating to teachers' and learners' activities, but it is still scarce concerning the methods to identify relevant LA indicators that can support teachers and learners using tracked datasets (Ferguson, 2012). ...
Full-text available
Conference Paper
In recent years, Learning Analytics (LA) has become a very heterogeneous research field due to the diversity in the data generated by the Learning Management Systems (LMS) as well as the researchers in a variety of disciplines, who analyze this data from a range of perspectives. In this paper, we present the evaluation of a LA tool that helps course designers, teachers, students and educational researchers to make informed decisions about the selection of learning activities and LA indicators for their course design or LA dashboard. The aim of this paper is to present Open Learning Analytics Indicator Repository (OpenLAIR) and provide a first evaluation with key stakeholders (N=41). Moreover, it presents the results of the prevalence of indicators that have been used over the past ten years in LA. Our results show that OpenLAIR can support course designers in designing LA-based learning activities and courses. Furthermore, we found a significant difference between the relevance and usage of LA indicators between educators and learners. The top rated LA indicators by researchers and educators were not perceived as equally important from students' perspectives.
Article
Background: The use of crowdsourcing in a pedagogically supported form to partner with learners in developing novel content is emerging as a viable approach for engaging students in higher-order learning at scale. However, how students behave in this form of crowdsourcing, referred to as learnersourcing, is still insufficiently explored. Objectives: To contribute to filling this gap, this study explores how students engage with learnersourcing tasks across a range of course and assessment designs. Methods: We conducted an exploratory study on trace data of 1279 students across three courses, originating from the use of a learnersourcing environment under different assessment designs. We employed a new methodology from the learning analytics (LA) field that aims to represent students' behaviour through two theoretically-derived latent constructs: learning tactics and the learning strategies built upon them. Results: The study's results demonstrate students use different tactics and strategies, highlight the association of learnersourcing contexts with the identified learning tactics and strategies, indicate a significant association between the strategies and performance and contribute to the employed method's generalisability by applying it to a new context. Implications: This study provides an example of how learning analytics methods can be employed towards the development of effective learnersourcing systems and, more broadly, technological educational solutions that support learner-centred and data-driven learning at scale. Findings should inform best practices for integrating learnersourcing activities into course design and shed light on the relevance of tactics and strategies to support teachers in making informed pedagogical decisions. Journal of Computer Assisted Learning© 2022 The Authors. Journal of Computer Assisted Learning published by John Wiley & Sons Ltd.
Article
Öğrenme Tasarımı, öğrenci hangi aktiviteyi, ne zaman, ne kadar sürede ve hangi sırada yaparsa daha iyi öğrenebilir sorusunun yanıtlanması için bir topluluk etkileşimini tanımlamaktadır. Öğrenme tasarımının iyileştirilmesi için öğrenme analitikleri kanıta dayalı öngörü oluşturulması yönünden önemlidir. Bu öngörülerin farklı durumlara transfer edilebilmesi için öğrenme analitiklerin hangi öğrenme tasarımı bağlamında kullanıldığına daha fazla odaklanılması gereksinimi ortaya çıkmaktadır. Bu çalışmada, öğrenme analitikleri sürecinin niçin öğrenme tasarımı ile çevrelenmesi gereksiniminden ve alanyazındaki çerçevelerin sunduğu geniş bakış açılarından yola çıkarak; öğrenme analitikleri öngörülerinin daha işlevsel olması için, öğrenme analitiklerinin hangi bağlamda ele alındığını kolaylaştıracak çerçeveler özetlenmiş ve daha işlevsel bulunanlar tartışılmıştır. E-öğrenme için öğrenme türleri ve etkinlik tasarımı olarak önerilen öğrenme tasarımı çerçeveleri, Öğrenme Yönetim Sistemi (ÖYS) içerisinde online derslerin tasarımında kolaylıkla kullanılabilecek sınıflamalar içermektedir. Analitik Katmanları Çerçevesi bir öğrenme analitiği uygulamasında hangi analitiklere odaklanılacağı konusunu çok boyutlu bir perspektiften örneklendirmektedir. Tartışılan çerçevelerin gelecekteki çalışmalar için temel alınması, öğrenme tasarımı ve öğrenme analitikleri etkileşiminden doğan öngörülerin farklı bağlamlar için güncellenerek uygulanmasını mümkün hale getirebilir.
Full-text available
Article
Lay Description What is already known about this topic? Learning design (LD) is the pedagogic process used in teaching/learning that leads to the creation and sequencing of learning activities and the environment in which it occurs. Learning analytics (LA) is the measurement, collection, analysis & reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. There are multiple studies on the alignment of LA and LD but research shows that there is still room for improvement. What this paper adds? To achieve better alignment between LD and LA. We address this aim by proposing a framework, where we connect the LA indicators with the activity outcomes from the LD. To demonstrate how learning events/objectives and learning activities are associated with LA indicators and how an indicator is formed/created by (several) LA metrics. We address this aim in our review. This article also aims to assist the LA research community in the identification of commonly used concepts and terminologies; what to measure, and how to measure. Implications for practice and/or policy This article can help course designers, teachers, students, and educational researchers to get a better understanding on the application of LA. This study can further help LA researchers to connect their research with LD.
Article
Background: Data-driven educational technology solutions have the potential to support teachers in different tasks, such as the design and orchestration of collaborative learning activities. When designing, such solutions can improve teacher understanding of how learning designs impact student learning and behaviour, and guide them to refine and redesign future learning designs. When orchestrating educational scenarios, data-driven solutions can support teacher awareness of learner participation and progress and enhance real-time classroom management. Objectives: The use of learning analytics (LA) can be considered a suitable approach to tackle both problems. However, it is unclear whether the same LA indicators can satisfactorily support both the design and orchestration of activities. This study investigates the use of the same LA indicators for supporting multiple teacher tasks (design, redesign and orchestration), a gap in the existing literature that requires further exploration. Methods: We first review previous work on the use of different LA to support both tasks. We then analyse the nature of the two tasks, focusing on a case study that uses the same collaborative learning tool with LA to support both. Implications: The study findings led to design considerations on LA support for teachers' design and orchestration tasks.
Article
Teaching with online learning platforms should simplify the monitoring of students' activity, particularly when evaluating student dropout. Popular learning environments such as Moodle should offer visual analytic tools that facilitate such tasks; nevertheless, institutions are often reluctant to incorporate them. This paper presents UBUMonitor, a desktop application for visualising students' activity data, extended as a proof of concept with a module for dropout tracking. Using UBUMonitor, teachers can easily visualise their students' engagement with a subject, enabling earlier and more effective intervention to prevent dropout.
Article
There is a huge and growing amount of data already captured in the many diverse digital tools that support learning. Yet this learning data is often inaccessible to teachers, or served in a manner that fails to support or inform their teaching and design practice. We need systematic, learner-centred ways for teachers to design the learning data that supports them. Drawing on decades of Artificial Intelligence in Education (AIED) research, we show how to make use of important AIED concepts: (1) learner models; (2) Open Learner Models (OLMs); (3) scrutability and (4) ontologies. We show how these concepts can be used in the design of OLMs, interfaces that enable a learner to see and interact with an externalised representation of their learning progress. We extend this important work by demonstrating how OLMs can also drive a learner-centred design process for learning data. We draw on the work of Biggs on constructive alignment (Biggs, 1996, 1999, 2011), which has been highly influential in education. Like Biggs, we propose a way for teachers to design the learning data in their subjects, and we illustrate the approach with case studies. We show how teachers can use this approach today, essentially integrating the design of learning data with the learning design for their subjects. We outline a research agenda for designing the collection of richer learning data. There are three core contributions of this paper. First, we present the terms OLM, learner model, scrutability and ontologies as thinking tools for the systematic design of learning data. Second, we show how to integrate this into the design and refinement of a subject. Finally, we present a research agenda for making this process both easier and more powerful.
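To ground the concepts the abstract names, the sketch below shows one way a scrutable learner model entry might be represented: each learning outcome from the subject design carries the evidence behind its mastery estimate, so the estimate can be explained back to the learner. This is an assumed illustration of the general idea, not the paper's design; outcome, class and field names are invented.

```python
# Minimal sketch (assumed, not from the paper): a learner model keyed by
# learning outcomes, keeping the evidence behind each estimate so the model
# stays scrutable, i.e. a learner can ask why the system believes what it does.
from dataclasses import dataclass, field

@dataclass
class Evidence:
    source: str        # e.g. "quiz 3, item 2"
    weight: float      # contribution to the mastery estimate
    correct: bool

@dataclass
class OutcomeModel:
    outcome: str                      # a learning outcome from the subject design
    evidence: list[Evidence] = field(default_factory=list)

    def mastery(self) -> float:
        # Weighted proportion of correct evidence; a stand-in for a real estimator.
        total = sum(e.weight for e in self.evidence)
        return sum(e.weight for e in self.evidence if e.correct) / total if total else 0.0

    def explain(self) -> list[str]:
        # Scrutability: expose the evidence behind the estimate.
        return [f"{e.source}: {'correct' if e.correct else 'incorrect'} (weight {e.weight})"
                for e in self.evidence]
```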
Article
Revue Sciences et Techniques de l'Information et de la Communication pour l'Éducation et la Formation
Conference Paper
One of the great promises of learning analytics is the ability of digital systems to generate meaningful data about students' learning interactions that can be returned to teachers. If provided in appropriate and timely ways, such data could be used by teachers to inform their current and future teaching practice. In this paper we showcase the learning analytics tool, Loop, which has been developed as part of an Australian Government Office of Learning and Teaching project. The project aimed to develop ways to deliver learning analytics data to academics in a meaningful way to support the enhancement of teaching and learning practice. Elements of the tool are described, and the paper concludes with an outline of the next steps for the project, including the evaluation of the tool's effectiveness.
Article
This study examined the extent to which instructional conditions influence the prediction of academic success in nine undergraduate courses offered in a blended learning model (n = 4134). The study illustrates the differences in predictive power and significant predictors between course-specific models and generalized predictive models. The results suggest that it is imperative for learning analytics research to account for the diverse ways technology is adopted and applied in course-specific contexts. The differences in technology use, especially those related to whether and how learners use the learning management system, require consideration before the log data can be merged to create a generalized model for predicting academic success. A lack of attention to instructional conditions can lead to an over- or underestimation of the effects of LMS features on students' academic success. These findings have broader implications for institutions seeking generalized and portable models for identifying students at risk of academic failure.
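The contrast this study draws can be made concrete with a small modelling sketch: the same LMS features are used once in a pooled ("generalised") model and once in per-course models, and the two are compared on cross-validated AUC. This is a hedged illustration under assumed feature and column names (course_id, success, clicks, forum_posts, quiz_attempts), not the study's actual modelling code.

```python
# Hedged sketch: compare one pooled model against course-specific models
# built from the same LMS features. Column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FEATURES = ["clicks", "forum_posts", "quiz_attempts"]

def compare_models(df: pd.DataFrame) -> dict:
    aucs = {}
    # Generalised model: all courses pooled together.
    pooled = LogisticRegression(max_iter=1000)
    aucs["pooled"] = cross_val_score(pooled, df[FEATURES], df["success"],
                                     cv=5, scoring="roc_auc").mean()
    # Course-specific models: one per course, fitted on that course only.
    for course, sub in df.groupby("course_id"):
        model = LogisticRegression(max_iter=1000)
        aucs[course] = cross_val_score(model, sub[FEATURES], sub["success"],
                                       cv=5, scoring="roc_auc").mean()
    return aucs
```

Large gaps between the pooled AUC and the per-course AUCs would indicate that instructional conditions matter for prediction, which is the study's central point.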
Conference Paper
While substantial progress has been made in terms of predictive modeling in the Learning Analytics and Knowledge (LAK) community, one element that is often ignored is the role of learning design. Learning design establishes the objectives and pedagogical plans which can be evaluated against the outcomes captured through learning analytics. However, no empirical study is available linking the learning designs of a substantial number of courses with usage of Learning Management Systems (LMS) and learning performance. Using cluster and correlation analyses, in this study we compared how 87 modules were designed, and how this impacted (static and dynamic) LMS behavior and learning performance. Our findings indicate that academics seem to design modules with an "invisible" blueprint in their mind. Our cluster analyses yielded four distinctive learning design patterns: constructivist, assessment-driven, balanced-variety and social constructivist modules. More importantly, learning design activities strongly influenced how students were engaging online. Finally, learning design activities seem to have an impact on learning performance, in particular when modules rely on assimilative activities. Our findings indicate that learning analytics researchers need to be aware of the impact of learning design on LMS data over time, and on subsequent academic performance.
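As a hedged illustration of the two analyses mentioned above (clustering modules on learning design patterns, then correlating design with LMS behaviour), the sketch below clusters modules by the proportion of time allocated to each activity type and correlates those proportions with an engagement measure. The activity categories and the lms_minutes_per_week column are placeholders, not the study's actual coding scheme.

```python
# Hedged sketch: cluster modules by learning design activity mix, then
# correlate the mix with LMS engagement. Categories and columns are assumed.
import pandas as pd
from scipy.stats import spearmanr
from sklearn.cluster import KMeans

ACTIVITY_TYPES = ["assimilative", "productive", "communicative", "assessment"]

def cluster_and_correlate(modules: pd.DataFrame, n_clusters: int = 4):
    # Proportion of design time per activity type, per module.
    design_mix = modules[ACTIVITY_TYPES].div(modules[ACTIVITY_TYPES].sum(axis=1), axis=0)

    # Cluster analysis: group modules with similar learning design patterns.
    modules = modules.assign(design_cluster=KMeans(n_clusters=n_clusters, n_init=10,
                                                   random_state=0).fit_predict(design_mix))

    # Correlation analysis: how each activity type relates to LMS engagement.
    correlations = {a: spearmanr(design_mix[a], modules["lms_minutes_per_week"])
                    for a in ACTIVITY_TYPES}
    return modules, correlations
```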
Article
This paper describes a study that looked at the effects of different technology-use profiles on educational experience within communities of inquiry, and how they are related to students' levels of cognitive presence in asynchronous online discussions. Through clustering of students (N=81) in a graduate distance education engineering course, we identified six different profiles: 1) task-focused users, 2) content-focused no-users, 3) no-users, 4) highly intensive users, 5) content-focused intensive users, and 6) socially-focused intensive users. The identified profiles differ significantly in their use of the learning platform and in their levels of cognitive presence, with large multivariate η² effect sizes of 0.54 and 0.19, respectively. Given that several profiles are associated with higher levels of cognitive presence, our results suggest multiple ways for students to be successful within communities of inquiry. Our results also emphasize a need for different instructional support and pedagogical interventions for different technology-use profiles.
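The analysis pattern in this study can be sketched as follows: cluster students into technology-use profiles from platform usage counts, then test whether cognitive presence differs across the profiles. The column names below are assumed, and a univariate ANOVA with an eta-squared effect size stands in for the multivariate test the study reports; this is an illustration, not the authors' code.

```python
# Hedged sketch: technology-use profiles via clustering of standardised
# usage counts, then an ANOVA on cognitive presence across profiles.
import pandas as pd
from scipy.stats import f_oneway
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

USAGE = ["content_views", "discussion_posts", "quiz_attempts", "logins"]

def profile_and_compare(students: pd.DataFrame, n_profiles: int = 6):
    usage_z = StandardScaler().fit_transform(students[USAGE])
    students = students.assign(
        profile=KMeans(n_clusters=n_profiles, n_init=10, random_state=0).fit_predict(usage_z))

    groups = [g["cognitive_presence"].to_numpy() for _, g in students.groupby("profile")]
    f_stat, p_value = f_oneway(*groups)

    # Eta squared: between-group variability over total variability.
    grand_mean = students["cognitive_presence"].mean()
    ss_between = sum(len(g) * (g["cognitive_presence"].mean() - grand_mean) ** 2
                     for _, g in students.groupby("profile"))
    ss_total = ((students["cognitive_presence"] - grand_mean) ** 2).sum()
    return students, f_stat, p_value, ss_between / ss_total
```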
Conference Paper
This paper provides an outline of an Australian Government Office of Learning and Teaching project that aims to investigate and then develop ways in which learning analytics data can be more usefully harnessed by academic teachers in higher education. Fundamental to this project is linking the learning design of online tasks provided to students with the learning analytic affordances of the technology-based tools that support them. The paper outlines the background to the project, including its conceptual underpinnings, and sets out the program of research and development. The expected outcomes of the project are discussed.
Article
This special issue deals with three areas. Learning design is the practice of devising effective learning experiences aimed at achieving defined educational objectives in a given context. Teacher inquiry is an approach to professional development and capacity building in education in which teachers study their own and their peers' practice. Learning analytics use data about learners and their contexts to understand and optimise learning and the environments in which it takes place. Typically, these three—design, inquiry and analytics—are seen as separate areas of practice and research. In this issue, we show that the three can work together to form a virtuous circle. Within this circle, learning analytics offers a powerful set of tools for teacher inquiry, feeding back into improved learning design. Learning design provides a semantic structure for analytics, whereas teacher inquiry defines meaningful questions to analyse.
Article
The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational data mining. This paper first introduces the field of learning analytics and outlines the lessons learned from well-known case studies in the research literature. The paper then identifies the critical topics that require immediate research attention for learning analytics to make a sustainable impact on the research and practice of learning and teaching. The paper concludes by discussing a growing set of issues that if unaddressed, could impede the future maturation of the field. The paper stresses that learning analytics are about learning. As such, the computational aspects of learning analytics must be well integrated within the existing educational research.
Article
The effect of technology-enhanced learning (TEL) strategies in higher education has arguably been transformative despite the not-insignificant barriers existing in this context. Throughout the discourse very little attention has been paid to those primarily responsible for this implementation: academic teaching staff. This paper aims to highlight the impact of academic workload allocations, an often silent barrier to the uptake of TEL strategies in higher education. We discuss the effects of academic identity and culture, preferential time allocation to associative activities, academic technological capacity, university policies, and workload and funding models on the uptake and implementation of TEL in higher education. Our aim is to highlight the risks to staff, students and institutions should these concerns not be addressed, and to propose a model for use by all staff responsible for implementing flexible workload models supportive of further implementation of TEL strategies across the sector.
Conference Paper
A main goal for learning analytics is to inform the design of a learning experience in order to improve its quality. The increasing presence of solutions based on big data has even questioned the validity of current scientific methods. Is this going to happen in the area of learning analytics? In this paper we postulate that if changes are driven solely by a digital footprint, there is a risk of focusing only on factors that are directly connected to numeric methods. However, if the changes are complemented with an understanding of how students approach their learning, the quality of the evidence used in the redesign is significantly increased. This reasoning is illustrated with a case study in which an initial set of activities for a first-year engineering course was shaped based only on the students' digital footprint. These activities were significantly modified after collecting qualitative data about the students' approaches to learning. We conclude by arguing that the interpretation of learning analytics is improved when it is combined with qualitative data revealing how and why students engaged with the learning tasks in qualitatively different ways; together these provide a more informed basis for designing learning activities.
Article
This paper proposes an analysis of current research in learning design (LD), a field aiming to improve the quality of educational interventions by supporting their design and fostering the sharing and reuse of innovative practices among educators. This research area currently focuses on three main strands: the representations that can be used as a common language to communicate about pedagogical plans and other half-fabricates of the design process, the methodological approaches to LD, and the tools that support the LD process. For each of the three strands, the current landscape is discussed, pointing to open issues and indicating future research perspectives, with particular attention to the contribution that learning analytics can make to transform LD from a craft, based on experience, intuition and tacit knowledge, into a mature research area, grounded in data concerning the learning process and hence supporting enquiry while teachers design, run and evaluate the learning process.