NOTE: This is the accepted version of the Contribution (version 2). The published Contribution
(version 3) appeared as:
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: aligning learning
analytics with learning design. American Behavioral Scientist, 57(10), 1439-1459. DOI:
10.1177/0002764213479367
Informing Pedagogical Action: Aligning Learning Analytics
with Learning Design
Lori Lockyer, Macquarie University, Australia
Elizabeth Heathcote, University of Adelaide, Australia
Shane Dawson, University of South Australia, Australia
Abstract
This paper considers the developing field of learning analytics and argues that in order to move from
small-scale practice to broad scale applicability, there is a need to establish a contextual framework that
helps teachers interpret the information that analytics provides. The paper presents learning design as a
form of documentation of pedagogical intent that can provide the context for making sense of diverse
sets of analytic data. We investigate one example learning design to explore how broad categories of
analytics – which we call checkpoint and process analytics – can inform the interpretation of outcomes
from a learning design, and facilitate pedagogical action.
Introduction
This paper examines two relatively new concepts within education, learning analytics, i.e., the
collection, analysis and reporting of data associated with student learning behaviour, and learning
design, i.e., the documented design and sequencing of teaching practice, and how together these may
serve to improve understanding and evaluation of teaching intent and learner activity. Learning
analytics offers a passive method of gathering information on how learners are interacting with learning
resources, each other, and their teacher(s). Unlike traditional surveys or focus groups, which rely on
participants both opting to provide feedback and accurately remembering and reporting past events,
learning analytics captures data on specific, observable behaviour in real-time. While this overcomes
data accuracy difficulties, the challenge posed by learning analytics is interpreting the resulting data
against pedagogical intent and the local context in order to evaluate the success or otherwise of a particular
learning activity (Dawson, Bakharia, Lockyer, & Heathcote, 2010). Learning designs, which document
pedagogical intent and plans, potentially provide the context to make sense of learning analytics data.
Essentially, learning design establishes the objectives and pedagogical plans, which can then be
evaluated against the outcomes captured through learning analytics.
In this paper, we explore how learning design might provide the framework for interpreting learning
analytics results and apply the concept to a sample learning design. Taking the example of design for
Case-based Learning, we investigate how learning analytics can help to evaluate whether a learning
design is achieving its intended purpose. Using a framework we call checkpoint and process analytics,
we consider how expected learner behaviours and interactions can be compared with the intended outcomes of the learning
design. We argue that the resulting information allows a learning design to be evaluated in context,
with a rich set of real-time, behaviour-based data on how learners are currently interacting within the
learning environment.
We begin with definitions and a discussion of the importance of learning analytics and learning design
as fields that inform educational practice, and then turn to an example learning design that illustrates the
potential for juxtaposition of learning design and learning analytics.
Overview of Learning Analytics
Over the past few decades, there has been increasing government, public, and industry interest in
developing indicators of the quality of learning and teaching practices (Bloxham & Boyd, 2012;
Coates, 2005, 2010; King Alexander, 2000). Arguably, shrinking fiscal resources and the expansion of
a global competitive education market has fuelled this increasing pressure for educational
accountability. The offshoot of these economic drivers has been the development in the education
sector of standardised, scalable, real-time indicators of teaching and learning outcomes. However,
creating such standards is a complex task given the diversity of student engagements, systems, learning
outcomes and teaching practices that are enacted across any educational institution. Any attempt to
introduce wide scale educational analytics and accountability processes thus requires a thorough
understanding of the pedagogical and technical context in which the data is generated.
In universities, learning quality assurance data are generally derived from student experience surveys
alongside measures of attrition, progression and assessment scores. These data are commonly used
retrospectively by university administrators and teachers to improve future iterations of courses,
determine impact on learning outcomes, and to provide a benchmark on overall performance (Coates,
2005). The high adoption of education technologies, such as learning management systems (LMS), has
resulted in a vast set of alternate and accessible learning data (Greller & Drachsler, in press; Pardo &
Kloos, 2012). Student interactions with the course activities via the LMS are captured and stored. The
resulting digital footprints can be collected and analysed to establish indicators of teaching quality and
provide more proactive assessment of student learning and engagement. This area of applied research is
becoming known as learning analytics.
As defined for the first Learning Analytics and Knowledge Conference in 2011, the study of learning
analytics is the “measurement, collection, analysis and reporting of data about learners and their
contexts, for purposes of understanding and optimizing learning and the environments in which it
occurs” (https://tekri.athabascau.ca/analytics/). Research in learning analytics interrogates the data
associated with a learner’s online interactions to create predictive models for performance (Macfadyen
& Dawson, 2010) and attrition (Campbell & Oblinger, 2007), as well as more complex learning
dimensions such as dispositions and motivations (Buckingham Shum & Deakin Crick, 2012; Dawson,
Macfadyen, & Lockyer, 2009). These forms of data inform decisions about future learning and
teaching practice. The emergent field is multi-disciplinary and draws on methodologies related to
educational data mining, social network analysis, artificial intelligence, psychology, and educational
theory and practice.
Learning analytics integrates and analyses the ‘big data’ sets available in educational contexts in order
to gain a better understanding of student engagement, progression and achievement. Although the field
is still in its infancy, learning analytics can help teachers interpret learner- and instructor-centric data
for informing future pedagogical decisions. To date, learning analytics studies have tended to focus on
broad learning measures such as predictors of student attrition (Arnold, 2010), sense of community and
achievement (Fritz, 2010), and overall return on investment of implemented technologies (Norris,
Baer, Leonard, Pugliese, & Lefrere, 2008). However, learning analytics also provides additional and
more sophisticated measures of the student learning process that can assist teachers in designing,
implementing and revising courses. While the field has vast potential, much work remains to be done
to build the theoretical and empirical base that provides clear evaluative
procedures for matching observed student interaction behaviours with course and program level
learning goals and outcomes (Pardo & Kloos, 2012). These forms of analytics and the associated data
sets, tools and models for analysis can be increasingly important for informing teachers on the success
and outcomes of their design of learning experiences and activities alongside monitoring student
learning for direct support during the academic semester.
Overview of Learning Design
The field of learning design emerged in the early 2000s as researchers and educational developers saw
the potential to use the Internet to document and share examples of good educational practice. Here, we
use the term ‘learning design’, but work in the same vein has been carried out under such names as
‘pedagogical patterns,’ ‘learning patterns’ and ‘pattern language.’ Learning design describes the
sequence of learning tasks, resources and supports that a teacher constructs for students over part of, or
the entire, academic semester. A learning design captures the pedagogical intent of a unit of study.
Learning designs provide a broad picture of a series of planned pedagogical actions rather than detailed
accounts of a particular instructional event (as might be described in a traditional lesson plan). As
such, learning designs provide a model for intentions in a particular learning context that can be used as
a framework for design of analytics to support faculty in their learning and teaching decisions.
The learning design field has developed in part as a response to the discourse about teaching and
learning in higher education of the preceding decades. Higher education treatises of the early 1990s
(e.g. Laurillard, 1993; Ramsden, 1992) called for more effective teaching in higher education, a move
away from reliance on the traditional, didactic large group lecture, and an assumption that information
and communication technology would help revolutionise higher education pedagogy. Thus, the broad
field of learning design was underpinned by two main aims: to promote teaching quality and to
facilitate the integration of technology into teaching and learning.
A main premise of learning design has been reusability across educational contexts, based on the notion
that if good teaching practice in one educational context could be captured in a description, then that
description could be read, interpreted and adapted for re-use in another context. Research and
development work in this area has included the creation of online repositories of learning designs that
teachers could read, interpret and adapt to their own practice (e.g. Agostinho, Harper, Oliver, Hedberg,
& Wills, 2008; Conole & Culver, 2010), and the development of technical languages and tools
designed to make learning designs machine readable and adaptable (Koper, 2006; Masterman, 2009).
These efforts required researchers to identify and evaluate examples of good practice. In this field, the
notion of good practice manifests in such a way that most learning designs shared through repositories
focus on alternative pedagogies for higher education and most emphasise the use of technology (see, for
example, the Learning Designs site http://www.learningdesigns.uow.edu.au/ and the Pedagogical
Pattern Collector http://193.61.44.29:42042/ODC.html). This stems from the field's development as a
response to calls for quality teaching in higher education and assumptions that constructivist approaches to
teaching and learning would support quality designs and practices. The importance of engaging and
challenging the learner is among the underlying principles of good practice (Boud & Prosser, 2002).
As such, we find an emphasis in learning design on project-based, experiential, and inquiry-based pedagogies
that place importance on learner communication and interaction, often facilitated by technology.
Learning designs come in many forms and levels of detail. Some draw upon an architectural model
to describe solutions to common educational problems textually (McAndrew & Goodyear, 2007). Some
use common representations such as process diagrams, flowcharts, and tables (Falconer, Beetham,
Oliver, Lockyer, & Littlejohn, 2007), and others combine text descriptions with graphical
representations (Agostinho et al., 2008). Regardless of the format in which learning designs are
documented, essential elements include: identifying the key actors involved (teachers and students),
what they are expected to do (teaching and learning tasks), what educational resources are used to
support the activities, and the sequence in which the activities unfold. These essential elements may be
presented with great detail and provide a highly contextualised description of a particular unit, covering
specific topics. Or they may be presented more generically, free of the detail of any particular
implementation of the design. Learning designs also range in granularity from presenting a teaching
and learning process that might occur for an entire semester-long course to that which might occur in
only one class.
Considering this variability in presentation, detail and granularity, research in the field has focused on
the dissemination, adoption, use and usability of learning designs. The learning design approach has
been found to be useful for faculty to document their own practice, for instructional designers to
document the practices of those they may work with, and for both faculty and designers to interpret the
practices of others (Agostinho, 2011). In particular, graphical representations of learning designs have
been found to stimulate design ideas for teachers who are engaged in designing a course (Bennett,
Lockyer, & Agostinho, 2004; Jones, Bennett, & Lockyer, 2011). Whether they are using learning
designs documented by graphical representations and/or textual descriptions to stimulate ideas and
support their own design practices, teachers seem to find specific examples of learning designs – those
that retain information about the original context for the design – more valuable than generic designs
(Bennett et al., 2004). This suggests that teachers can use specific, detailed learning designs as
examples and are able to adapt the ideas to their own context.
The most easily understood and adapted common elements within all learning designs include:
• a set of resources for the student to access which could be considered to be prerequisites to the
learning itself (these may be files, diagrams, questions, web links, pre-readings etc.);
• tasks the learners are expected to carry out with the resources (prepare and present findings,
negotiate understanding etc.); and
• support mechanisms to assist in the provision of resources and the completion of the tasks.
These supports indicate how the teacher, other experts or peers might contribute to the learning
process (e.g., moderation of a discussion or feedback on an assessment piece) (Bennett et
al., 2004).
Figure 1 provides an example learning design visual representation showing three common categories
of resources, tasks and supports.
Figure 1: A learning design visual representation showing three common categories (resources, tasks and supports)
(adapted from http://www.learningdesigns.uow.edu.au).
While learning designs can provide a description of pedagogical intention, they do not identify how
students engage with that design during or after implementation. This is where learning analytics can
provide information for a more holistic perspective of the impact of learning activities.
Using Learning Analytics
Learning analytics has the potential to draw on a variety of data sources that are collected in a range of
institutional systems, including student information systems (Lauría, Baron, Devireddy, Sundararaju, &
Jayaprakash, 2012), library interactions (Bichsel, 2012), learning management systems (Dawson, 2010;
Liaqat, Hatala, Gašević, & Jovanović, 2012), admissions systems (Dawson, Macfadyen, Lockyer, &
Mazzochi-Jones, 2011) and grades (Macfadyen & Dawson, 2010). However, at present the
predominance of learning analytics research centres on the types of data available in institutional LMS.
Current commercial and open source LMS provide a level of student tracking data that can be made
available to teachers as reports and tables indicating, for example, student time spent online, page
views, or number of posts in a discussion forum. At present, this ubiquitous data is underutilised as an
indicator of student engagement and learner progress. This is largely due to the lack of conceptual
frameworks and resulting common understanding of how to use and interpret such data, and models
which can validly and reliably align such data with the learning and teaching intent (Ferguson, 2012;
Mazza & Dimitrova, 2007). The available LMS data is neither easily understood by teachers
in terms of how it aligns with individual and group student engagement behaviours (activity patterns), nor presented in
ways that provide easy interpretation. One approach that can assist teachers to interpret this data is via
visualizations (Dawson, McWilliam, & Tan, 2008). Various approaches to learning analytics, and
visualizations in particular, are discussed next.
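To make the nature of these reports concrete, the following is a minimal sketch of how a raw activity log might be aggregated into the per-student metrics described above. The event-log format, field names and values are invented for illustration and do not correspond to any particular LMS export.

```python
# A sketch of aggregating a hypothetical LMS event log into the kind of summary
# report described above (page views, forum posts, time online per student).
# The log format and values are illustrative only.
from collections import defaultdict

events = [  # (student_id, event_type, timestamp, minutes_active) -- invented data
    ("s01", "page_view",  "2012-03-05T10:02", 3),
    ("s01", "forum_post", "2012-03-05T10:20", 8),
    ("s02", "page_view",  "2012-03-06T09:15", 5),
]

summary = defaultdict(lambda: {"page_views": 0, "forum_posts": 0, "minutes_online": 0})
for student, event_type, _timestamp, minutes in events:
    record = summary[student]
    record["minutes_online"] += minutes
    if event_type == "page_view":
        record["page_views"] += 1
    elif event_type == "forum_post":
        record["forum_posts"] += 1

for student, record in sorted(summary.items()):
    print(student, record)
```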
A variety of learning analytics tools are available which summarise and visualise various elements of
student behaviour and activities (see Table 1).
Table 1: Examples of learning analytics tools and visualisations.

Visualisation | Available Tools | Description | Framework
Reports | BlackBoard, Moodle, Desire2Learn | Individual user tracking, course based | Individual and cohort monitoring
Social Network Analysis | SNAPP (Social Networks Adapting Pedagogical Practice) | Extracts and visualises student relationships established through participation in LMS discussions (Dawson, Bakharia, & Heathcote, 2010) | Social-constructivist models of learning
Student dashboards and monitoring | Student Activity Meter | Visualisations of student activity for promotion of self-regulated learning processes (Govaerts, Verbert, Duval, & Pardo, 2012) | Self-regulated learning – monitoring of individual behaviours and achievement to guide learning process
Individual and group monitoring | GLASS (Gradient's Learning Analytics System) | Visualisations of student and group online event activity (Leony et al., 2012) | Individual and cohort monitoring
Learning content interaction | LOCO-Analyst | Provides insight into individual and group interactions with the learning content (Jovanović et al., 2007) | Individual and cohort monitoring
Discourse analysis | Cohere | Supports and displays social and conceptual networks and connections (De Liddo, Buckingham Shum, Quinto, Bachler, & Cannavacciuolo, 2011) | Social learning and argumentation theory
One type of visualisation noted in Table 1 is social networks. These network diagrams can be applied
in education to depict teacher and learner online communication patterns. Tools, such as SNAPP
(Dawson, Bakharia, & Heathcote, 2010), draw upon data from the LMS to represent visually patterns
of user interactions. Figure 2 illustrates how such tools can present interaction data visually to the
teacher from within the LMS. A caveat is that, while visualisations offer effective ways of making
sense of large datasets, they still require familiarity and expertise to fully appreciate their results. Social
network diagrams in particular require some degree of literacy in interpreting the results, e.g., in
understanding the meaning of actor locations, and nuances associated with what data were used to draw
connections between actors.
Figure 2: SNAPP visualisation tool embedded in an LMS discussion page
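The data transformation that underlies such network visualisations can be illustrated with a minimal sketch. This is the general approach of reply-network tools such as SNAPP rather than any tool's actual implementation: forum reply records (invented here) are converted into a network, and a simple centrality measure gives a first indication of who the interaction flows through.

```python
# A sketch of the general approach behind reply-network tools: an edge connects
# the author of a post with the author of the post being replied to.
# The reply list below is invented for illustration.
import networkx as nx

replies = [  # (author_of_reply, author_replied_to)
    ("studentA", "instructor"),
    ("studentB", "instructor"),
    ("instructor", "studentA"),
    ("studentC", "instructor"),
]

graph = nx.Graph()
graph.add_edges_from(replies)

# Degree centrality indicates who the interaction is concentrated around.
for actor, centrality in sorted(nx.degree_centrality(graph).items(),
                                key=lambda item: -item[1]):
    print(f"{actor}: {centrality:.2f}")
```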
The interpretation of visualisations also depends heavily on an understanding of the context in which the
data were collected and the goals of the teacher regarding in-class interaction. Interpretation of the
analysis thus requires alignment with the original teaching context if it is to be useful as feedback on
whether the learning design has achieved its intent. Interpretation requires an understanding of the
relationship between technology functionality, observed interaction behaviours and educational theory
(Heathcote, 2006). It is the conceptual bridging and understanding between the technical and
educational domains that remains problematic for learning analytics (Dawson, Heathcote, & Poole,
2010). This leads to questions surrounding how analytics can begin to bridge the technical-educational
divide to provide just-in-time, useful and context-sensitive feedback on how well the learning design is
meeting its intended educational outcomes. Here we argue that a critical step for moving forward on
this agenda entails the establishment of methods for identifying and coding learning and teaching
contexts. This requires a marriage of the processes and methodologies associated with the fields of
learning analytics and learning design, one that is illustrated in the example discussed below.
Learning Analytics to Evaluate Learning Design
To explore the importance of understanding learning design (or pedagogical intent) for accurate
interpretation of social network analysis in learning contexts, we examine a particular instance of
interaction, as illustrated in the social network diagram presented in Figure 3.
Figure 3: Facilitator-centric social network pattern - ego network. Each node (bubble) represents a student or instructor.
Here, the central bubble is acting as ‘facilitator’ with most of the interactions being controlled through them. The bold-
outlined nodes illustrate people who are within the central node’s ‘ego-network’, i.e., they are in direct contact with the
central node.
The social network diagram shows a facilitator-centric pattern. Interaction in this discussion forum may
be seen to be dominated by a central participant – in this example, the central actor is the instructor.
Various learning designs would be expected to produce this pattern when successful (i.e., when the pattern reflects
achievement of the intended learning design). For example, if this diagram represents a question and
answer (Q&A) forum on course content, then the network is well aligned with the pedagogical
intentions. Q&A forums commonly represent situations where one-to-one relationships mediated by an
instructor are expected. If the instructor were absent from a configuration like this, it might indicate
either successful delegation of the answering of student queries to other students or, if the intent was
not to delegate answering responsibility, an absent instructor and potentially frustrated
students. Alternatively, if the network showed a pattern where one student facilitated their particular
topic, one would expect the central node to represent the facilitating student in the early phase of
discussion on the topic, where they mediate and clarify understanding for their peers. Conversely, if the
intent of the forum was to promote learner-to-learner interaction for co-construction of knowledge, then
the pattern seems polarised from the intended aim. The learning design and intent of the forum clearly
needs to be established before the analytics visualisation provides useful evaluative insight.
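As a simple illustration of this point, the sketch below shows how the same network statistic is read differently depending on the declared design intent. The data, the design labels, and the 0.5 threshold are invented for illustration and are not established values.

```python
# A minimal sketch: the same interaction statistic is interpreted against the
# declared learning design. The threshold and labels are illustrative only.
import networkx as nx

def interaction_share(graph: nx.Graph, actor: str) -> float:
    """Fraction of all ties in the forum that involve the given actor."""
    if graph.number_of_edges() == 0:
        return 0.0
    return graph.degree(actor) / graph.number_of_edges()

def interpret(graph: nx.Graph, instructor: str, design_intent: str) -> str:
    centric = interaction_share(graph, instructor) >= 0.5  # arbitrary cut-off
    if design_intent == "qa_forum":
        return "aligned with design" if centric else "instructor may be absent"
    if design_intent == "peer_discussion":
        return "instructor-dominated; consider intervening" if centric else "aligned with design"
    return "design intent unknown; pattern cannot be evaluated"

graph = nx.Graph([("studentA", "instructor"), ("studentB", "instructor"),
                  ("studentC", "instructor"), ("studentA", "studentB")])
print(interpret(graph, "instructor", "qa_forum"))          # aligned with design
print(interpret(graph, "instructor", "peer_discussion"))   # instructor-dominated; consider intervening
```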
Thus, to interpret the data that learner environments generate, it is important to combine learning
analytics with learning design. While learning designs provide theoretical, practice-based, and/or
evidence-based examples of sound educational design, learning analytics may allow us to test those
assumptions with actual student interaction data in lieu of self-report measures such as post hoc
surveys. In particular, learning analytics provides us with the necessary data, methodologies and tools
to support the quality and accountability that has been called for in higher education.
Aligning Learning Analytics with Learning Design
While theoretically learning designs and learning analytics may be seen to provide compatible
information, to be truly useful a framework is needed to align the two concepts. Our discussion of
learning design and learning analytics focuses on two broad categories of analytic applications. The
first relates to what we term checkpoint analytics: the snapshot data indicating that a student
has met the prerequisites for learning by accessing the relevant resources of the learning design. For
instance, checkpoint analytics would relate to metrics such as logins into the online course site,
downloads of a file for reading, or signing up to a group for a collaborative assignment. While these
forms of analytics may be valuable for providing lead indicators of student engagement, they do not, in
isolation from other data, provide insight into the learning process or understanding of how students are
learning and what they are learning. As checkpoint analytics exclusively measure access to the
resources included in a learning design, their value lies in providing teachers with broad insight into
whether or not students have accessed prerequisites for learning and/or are progressing through the
planned learning sequence (akin to attendance in a face-to-face class). Data on whether or not students
have accessed pre-readings or organised themselves into groups for upcoming assignments could be
considered checkpoints that indicate whether the foundations for learning have been established, and
thus checkpoint analytics concentrate on highlighting which students have completed these learning
prerequisites and which have not.
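A minimal sketch of this idea follows, assuming a hypothetical access log and an invented list of prerequisite resources; it simply flags which students have not yet reached each checkpoint.

```python
# A sketch of checkpoint analytics: flag students who have not accessed the
# resources the learning design treats as prerequisites. Data are invented.
enrolled = {"s01", "s02", "s03", "s04"}
prerequisites = {"pre_reading.pdf", "group_signup"}

access_log = [  # (student_id, resource_id)
    ("s01", "pre_reading.pdf"), ("s01", "group_signup"),
    ("s02", "pre_reading.pdf"),
    ("s03", "group_signup"),
]

accessed = {}
for student, resource in access_log:
    accessed.setdefault(student, set()).add(resource)

# Which students have not yet completed each learning prerequisite?
for resource in sorted(prerequisites):
    missing = sorted(s for s in enrolled if resource not in accessed.get(s, set()))
    print(f"{resource}: not yet accessed by {missing}")
```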
The second type of learning analytics we term process analytics. These data and analyses provide
direct insight into learner information processing and knowledge application (Elias, 2011) within the
tasks that the student completes as part of a learning design. For example, social network analysis of
student discussion activity on a discussion task provides a wealth of data that can offer insight into an
individual student’s level of engagement on a topic, their established peer relationships and therefore
potential support structures. The inclusion of content analysis adds further scope for determining the
level of understanding and learning models established.
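As one crude, illustrative proxy for such content analysis (an assumption on our part rather than an established method), posts might be scanned for concept keywords drawn from the focus questions of a learning design.

```python
# A rough, illustrative proxy for content analysis: count occurrences of concept
# keywords in each student's posts. Keywords and post data are invented.
from collections import Counter

concept_keywords = {"stakeholder", "scope", "risk"}   # illustrative terms

posts = [  # (student_id, post_text), invented data
    ("s01", "The main risk in this case was poor scope definition."),
    ("s02", "I agree, and the stakeholder analysis was also weak."),
]

coverage = {}
for student, text in posts:
    tokens = {token.strip(".,;:!?").lower() for token in text.split()}
    coverage.setdefault(student, Counter()).update(concept_keywords & tokens)

for student, counts in coverage.items():
    print(student, dict(counts))
```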
The articulation of the nature of support available within learning designs helps to interpret process
learning analytics. These supports give an indication of what roles we can expect to see learners and
teachers taking within collaborative spaces such as discussion forums (e.g., whether we would expect
exclusively student-to-student interactions in a group discussion on construction of a group assignment,
or facilitator-centric interactions in the Q&A portion of the forum). In this way they help to provide
an expected configuration based on what support was built into the learning design.
Learning Design and Analytics Investigation
The following investigates a theoretical scenario to illustrate the potential for leveraging learning
analytics in support of and for evaluation of learning design. The scenario uses a learning design drawn
from a repository established through an Australian project that identified, reviewed and documented
examples of university courses that effectively used technology to facilitate flexible learning
(Agostinho et al., 2008; http://www.learningdesigns.uow.edu.au/). The design selected for illustration
here comprises individual, small group and large group learning tasks, and use of online resources and
discussion forums.
In the following, we describe the selected learning design and then discuss the types of analytics that
may inform the teacher about (1) how students are learning during implementation of the design and
(2) how the design might be adapted or redesigned for further iterations.
Case-based Learning Design
Figure 4 provides a sequential representation and brief description of the learning design investigated
here. The central point of the diagram is the learning task (represented as a square
with green shading) that the students are expected to do, the associated content resources (represented
as a triangle with blue shading) and the teacher and/or peer support that facilitate the tasks (represented
as a circle with pink shading). Each step in the sequence of the learning design is associated with
potential analytics: checkpoints (represented as crosses) and processes (represented as open circles).
Figure 4: Case Based Learning Design adapted from Bennett (2002) available at
http://needle.uow.edu.au/ldt/ld/4wpX5Bun. The last column outlines potential learning analytics corresponding to stages
within the learning design. These are specified as process (represented by open circles) and checkpoint (represented by
stars) analytics.
[Figure 4 reproduces the full design record from the learning design tool: its keywords, a description of the case-based approach, the intended learning outcomes, the sequence of resources, tasks and supports across the six stages (individual case analysis; small-group case discussion; whole-class discussion of emerging concepts; project proposal development; project development with teacher feedback; individual and team reflection), notes on customisation and assessment, and an annotation of each stage with its potential checkpoint or process analytics.]
This semester-long learning design involves students working collaboratively on group projects
relevant to the students’ future professional practice. As such, this design is typically used in
professionally-focused programs such as architecture, business, teaching, and multimedia design. The
tasks may be carried out fully online, face-to-face, or a combination of both. The learning objectives
for this design are:
- to develop specific knowledge and skills related to the project
- to develop an understanding of how theory relates to practice
- to develop skills in case analysis and reflection
- to develop teamwork and project skills
The learning design is grounded in a case-based reasoning approach that helps students link theory to
practice through a series of case analyses and project tasks. First, students individually engage in a case
analysis task in which they explore and analyze a real-life case. Students may choose from a number
of cases which are similar to their later project task. The cases students examine are made
available online by the teacher; the cases provide detailed descriptions of realistic situations, problems
encountered, solutions used and their outcomes. As a group and/or whole class, students then discuss
the problems, solutions and the complex circumstances of their chosen cases. Next, the project task
involves students working in groups to develop a written project proposal and a relevant project output
(e.g., blueprint, business case, lesson plan, website). The project is designed to enable students to put
theoretical concepts identified in the cases into practice, learn or reinforce skills and deal with complex,
authentic situations. At the conclusion of the project students engage in a reflection task in which they
consider their experiences and extract lessons for future practice. The reflections are undertaken
individually and as a group and may take the form of a discussion and/or written submission.
How Analytics Can Support Implementation of a Learning Design
Stage 1 Case Analysis Task - Checkpoint Analytics: Learning analytics can generate reports of
student login behaviors and access to individual cases; these provide the teacher with indicators of
when students have commenced the learning sequence. The opportunity then exists to incorporate
automatic or teacher-generated reminder alerts that prompt late starters to initiate the learning
activity (e.g., Arnold, 2010).
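One possible form such an alert mechanism could take is sketched below, assuming a hypothetical record of first access to the case resources; the dates, grace period and message are illustrative only.

```python
# A sketch of the Stage 1 checkpoint: students with no recorded access to any case
# resource after a grace period are collected into a reminder list. Data invented.
from datetime import date, timedelta

release_date = date(2013, 3, 4)
grace_period = timedelta(days=5)
today = date(2013, 3, 11)

enrolled = {"s01", "s02", "s03"}
first_case_access = {  # student_id -> date of first access to a case resource
    "s01": date(2013, 3, 5),
    "s03": date(2013, 3, 10),
}

if today > release_date + grace_period:
    for student in sorted(enrolled - set(first_case_access)):
        print(f"Reminder to {student}: the case analysis task opened on {release_date}.")
```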
Stage 2 Case Analysis Discussion Task – Process Analytics: Once students analyze their case
individually, they then share their ideas with their project group members. They identify issues that
arose in these cases and consider how they may be applicable to the project they are about to undertake.
A network diagram generated from the online discussion forum can help the teacher identify the
effectiveness of each group’s interaction process.
Figure 5: Discussion dominated by one student
Figure 6: Equal distribution of student contribution in
discussion
For example, Figure 5 illustrates what a discussion dominated by a single student would look like using
social network analysis; Figure 6 shows an example of greater diversity of interaction. These forms of
expected interactions and learner-behavior patterns can be used to identify deviations between
interaction as anticipated from the learning design and the actual outcome. For example, if this learning
design called for a student leader to facilitate peers in sharing the ideas and analysis of the cases they
have considered, Figure 5 might demonstrate achievement of that design. However, if all students were
equally expected to share and comment on each other’s cases, Figure 6 might be expected.
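One simple, illustrative way to quantify the contrast between Figures 5 and 6 is sketched below using invented post counts. The 60% cut-off is arbitrary, and whether a "dominated" group is a problem depends on the learning design, as noted above.

```python
# A sketch of quantifying dominated versus balanced group discussions from
# per-student post counts. Data and the 0.6 cut-off are illustrative only.
group_posts = {
    "group1": {"s01": 12, "s02": 2, "s03": 1},   # one student dominates
    "group2": {"s04": 5, "s05": 4, "s06": 6},    # contributions roughly even
}

for group, posts in group_posts.items():
    total = sum(posts.values())
    top_student, top_count = max(posts.items(), key=lambda item: item[1])
    share = top_count / total
    label = "dominated" if share > 0.6 else "balanced"
    print(f"{group}: {label} (top contributor {top_student} wrote {share:.0%} of posts)")
```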
Stage 3 Whole Class Discussion Task – Process Analytics: After the project groups discuss their
case analyses, the learning design calls for the teacher to facilitate a whole-class discussion. If
successful, the social network analysis of discussion forum posts should illustrate the teacher as central
in the network. However, as student discussion increases, the teacher may expect to become less
dominant, with discussion increasing among students. Figures 7 and 8 illustrate the expected change in
instructor network position as the student discussion and facilitation process evolves.
Figure 7: Red central node represents the instructor –
typical social network visualization during the early
facilitation phase, where the instructor is mediating a
discussion.
Figure 8: Social network example indicating strong
student peer interaction. Instructor facilitation (red
node) reduced. This type of visualization would be
expected after a few weeks of semester for group
discussion activity, where the learning design
emphasizes student discussion.
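This expected shift could be monitored with a simple week-by-week calculation such as the sketch below, which assumes weekly reply lists extracted from the whole-class forum; the data are invented.

```python
# A sketch of tracking the expected change shown in Figures 7 and 8: instructor
# degree centrality should fall as peer interaction takes over. Data invented.
import networkx as nx

weekly_replies = {
    1: [("s01", "instructor"), ("s02", "instructor"), ("s03", "instructor")],
    4: [("s01", "s02"), ("s02", "s03"), ("s03", "s01"), ("s01", "instructor")],
}

for week, replies in sorted(weekly_replies.items()):
    graph = nx.Graph(replies)
    centrality = nx.degree_centrality(graph).get("instructor", 0.0)
    print(f"week {week}: instructor degree centrality = {centrality:.2f}")
```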
Stage 4 Project Proposal Task – Process Analytics: At this stage, students begin their project task. In
the first part of the project task, students work in a small group to collaborate on their project proposal.
If the task is completed within a discussion forum, a social network diagram could be used to indicate
the density and connections of participation as well as outliers or disconnected students
disengaged from the task. If the collaboration on the assignment occurs within a document-sharing tool
such as a wiki or Google Docs, the student contributions, their timing and document versions are available
for analysis. A checkpoint analytic may be included here, to indicate students participating or not
participating in the development of the shared group document.
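A minimal sketch of such an analysis follows, assuming a hypothetical revision-history export with an author and timestamp per revision; the checkpoint flags group members with no revisions, while revision counts give a rough process view of participation.

```python
# A sketch for Stage 4: summarise a shared document's revision history per group
# member. Export format and data are invented for illustration.
from collections import Counter

group_members = {"s01", "s02", "s03", "s04"}
revisions = [  # (author, timestamp), invented data
    ("s01", "2013-04-02T10:15"), ("s02", "2013-04-02T11:40"),
    ("s01", "2013-04-03T09:05"), ("s03", "2013-04-04T16:30"),
]

revision_counts = Counter(author for author, _ in revisions)

print("No contribution yet:", sorted(group_members - set(revision_counts)))
for member in sorted(group_members & set(revision_counts)):
    print(f"{member}: {revision_counts[member]} revisions")
```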
Stage 5 Project Development Task - Checkpoint or Process Analytics: As students work on their
project task, analytics might include checkpoints to verify that students have accessed the teacher’s
feedback on the proposal. Process analytics might visualize group collaboration on developing the
project after receiving teacher feedback.
Stage 6 Reflection Task - Checkpoint or Process Analytics: The final reflection task can be assessed
using both checkpoint and process analytics. The checkpoint is to verify whether the self-reflection
template has been accessed or uploaded with student changes. Additionally, further content analysis can be
undertaken to map student reflections and changes over an extended period of time. Self-reflection
requires strong metacognitive capacities that have been demonstrated to be essential for developing the
skills necessary for life-long learning (Butler & Winne, 1995).
How Analytics Support Implementation and Redesign
Overall, these kinds of instrumental checkpoint analytics and the more interpretive process analytics
provide the teacher with indicators of student engagement. This can be used both during the course and
after. During the delivery of a course, the teacher may use these analytics to intervene when learning
behavior does not match the theoretical expectations from the learning design. Such intervention may
involve the teacher sending reminders to students about the suggested progression through the task,
emailing students with prompting questions to promote deeper investigation of content, or moderating
a planned group discussion to stimulate more equal contribution. This is the kind of intervention that a
teacher would normally undertake during implementation of the course. Traditionally, this kind of
intervention relies on the teacher noticing the unanticipated or detrimental learning behavior. This
awareness of learner behavior is more difficult to achieve in the online environment than in the face-to-face
context, where teachers have visual cues to draw upon.
Analytics can also help with course redesign. Traditionally, educators draw upon their past experience
when they teach a course or when they are designing a new course. For many teachers this may be an
informal process that relies on recall, student surveys, and/or teacher notes recorded during or after the
teaching session. Revisiting the learning analytics collected during the course can support teachers
when they are planning to run the course again or when the learning design is being applied to a
different cohort or context.
A review of the checkpoint and process analytics for the Case-based Learning design discussed here
provides important data to assist a teacher in refining the overall design and its integration. Updated case
resources might account for examples of past student projects and the problems encountered (as
indicated by the analytics) or additional resources might provide students with guidance on strategies for
effective project team processes. The analytics may also help a teacher plan for points in the task
sequence in which they may need to provide additional support to students.
Conclusion
This paper argued that the evaluative potential of learning analytics would be significantly enhanced by
reference to the learning design which documents pedagogical intent. In addition, we looked at the
value that the accountability and quality agenda might gain from the ability to use learning analytics for
real-time evaluation of learning within a specific pedagogical design. An example of a learning design
was explored to outline how analytics tools of two types – checkpoint and process analytics – could
generate analytics that allow for comparison of expected behaviours and interactions with outcomes of
a learning design. Using the example of Case-based Learning Design, we saw how the resulting
learning analytics allows a learning design to be evaluated in light of its pedagogical intent, using a rich
set of real-time, behaviour-based data on learner interaction within the learning environment. The next
stages of research and development include several parallel directions: engaging teachers and students
in understanding and using visual patterns of interaction as a means to encourage learning activity;
scaling up to larger numbers of classes, providing a base for statistically comparing observed with
expected behaviours and interactions; and using results to provide meaningful feedback to
teachers on how their learning design is meeting their pedagogical goal, and to assist them in decisions
around design and pedagogical change in real-time.
As the field of learning analytics continues to evolve and the diversity of data sources increases, there
will be an associated rise in the number and accuracy of predictive models (logistic regressions,
decision trees, support vector machines, etc.) for student performance and progression. As these models
come into the mainstream there is an opportunity to leverage analytics tools and visualisations to
establish pedagogical recommendations. However, as noted above, any user interaction behaviour must
be analysed in its specific educational context, such as the learning design and course modality. An
understanding of the learning design context is imperative for establishing accurate predictive models
alongside pedagogical recommendations. Establishing a conceptual framework for typical learning
analytics patterns expected from particular learning designs can be considered an essential step in
improving evaluation effectiveness and in building the foundation for pedagogical recommender systems
in the future.
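As an indication of what such modelling might involve (a minimal sketch using invented engagement features, not a validated model), a logistic regression could be fitted to simple checkpoint and process counts and used to estimate which students in a current cohort are at risk.

```python
# A sketch of a simple predictive model over invented engagement features.
# Features per student: [logins, forum posts, prerequisite resources accessed].
from sklearn.linear_model import LogisticRegression

X_train = [[25, 14, 5], [3, 0, 1], [18, 9, 4], [5, 1, 2], [30, 20, 5], [2, 1, 0]]
y_train = [1, 0, 1, 0, 1, 0]          # 1 = satisfactory outcome, 0 = at risk

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

current_cohort = [[4, 1, 1], [22, 12, 5]]
# Column 0 of predict_proba corresponds to class 0 (at risk).
for features, risk in zip(current_cohort, model.predict_proba(current_cohort)[:, 0]):
    print(f"features={features} -> estimated probability of being at risk: {risk:.2f}")
```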
References
Agostinho, S. (2011). The use of a visual learning design representation to support the design process
of teaching in higher education. Australasian Journal of Educational Technology, 27(6), 961-
978.
Agostinho, S., Bennett, S., Lockyer, L., Kosta, L., Jones, J., & Harper, B. (2009). An examination of
learning design descriptions in an existing learning design repository. In R. Atkinson & C.
McBeath (Eds.), Conference of the Australasian Society for Computers in Learning in Tertiary
Education (pp. 11-19). Auckland, New Zealand: Auckland University of Technology and
ASCILITE.
Agostinho, S., Harper, B., Oliver, R., Hedberg, J., & Wills, S. (2008). A visual learning design
representation to facilitate dissemination and reuse of innovative pedagogical strategies in
university teaching. In L. Botturi & S. Stubbs (Eds.), Handbook of Visual Languages for
Instructional Design: Theories and Practices (pp. 380-393). Hershey PA: Information Science
Reference, IGI Global.
Arnold, K.E. (2010). Signals: Applying academic analytics. EDUCAUSE Quarterly Magazine, 33(1).
http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/Sign
alsApplyingAcademicAnalyti/199385
Bennett, S. (2002). A technology supported constructivist learning environment that uses real-life cases
to support collaborative project work. Retrieved from
http://www.learningdesigns.uow.edu.au/exemplars/info/LD1/index.html
Bennett, S., Lockyer, L. & Agostinho, S. (2004). Investigating how learning designs can be used as a
framework to incorporate learning objects. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R.
Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp.
116-122). Perth, 5-8 December.
http://www.ascilite.org.au/conferences/perth04/procs/bennett.html
Bichsel, J. (2012). Analytics in Higher Education: Benefits, Barriers, Progress and Recommendations.
Louisville, Colorado: Educause Center for Applied Research.
Bloxham, S., & Boyd, P. (2012). Accountability in grading student work: securing academic standards
in a twenty-first century quality assurance context. British Educational Research Journal,
38(4), 615-634.
Boud, D., & Prosser, M. (2002). Appraising new technologies for learning: a framework for
development. Educational Media International, 39(3-4), 237-245.
Buckingham Shum, S., & Deakin Crick, R. (2012). Learning dispositions and transferable
competencies: pedagogy, modelling and learning analytics. In Proceedings of the 2nd
International Conference on Learning Analytics and Knowledge, Vancouver, British Columbia,
Canada.
Butler, D.L., & Winne, P.H. (1995). Feedback and self-regulated learning: A theoretical synthesis.
Review of Educational Research, 65(3), 245-281.
Campbell, J., & Oblinger, D. (2007). Academic analytics. Retrieved 25th October, 2007, from
http://connect.educause.edu/library/abstract/AcademicAnalytics/45275
Coates, H. (2005). The value of student engagement for higher education quality assurance. Quality in
Higher Education, 11(1), 25-36.
Coates, H. (2010). Defining and monitoring academic standards in Australian higher education. Higher
Education Management and Policy, 22(1), 41-58.
Conole, G., & Culver, J. (2010). The design of Cloudworks: Applying social networking practice to
foster the exchange of learning and teaching ideas and designs. Computers & Education, 54(3),
679-692.
Dawson, S. (2010). 'Seeing' the learning community: An exploration of the development of a resource
for monitoring online student networking. British Journal of Educational Technology, 41(5),
736–752.
Dawson, S., Bakharia, A., & Heathcote, E. (2010). SNAPP: Realising the affordances of real-time SNA
within networked learning environments. In Networked Learning - Seventh International
Conference, Aalborg, Denmark.
Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). 'Seeing' networks: Visualising and
evaluating student learning networks. Final report. Canberra: Australian Learning and
Teaching Council Ltd. Retrieved from
http://research.uow.edu.au/content/groups/public/@web/@learnnet/documents/doc/uow115678.
pdf
Dawson, S., Heathcote, E., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of
ICT systems for enhancing the student learning experience. International Journal of
Educational Management, 24(2), 116-128.
Dawson, S., Macfadyen, L. & Lockyer, L. (2009). Learning or performance: Predicting drivers of
student motivation. In Atkinson, R. J. & McBeath, C. (Eds) (2009). Same places, different
spaces. Proceedings of the 26th Annual ascilite International Conference, (pp. 184-193).
Auckland: The University of Auckland, Auckland University of Technology, and Australasian
Society for Computers in Learning in Tertiary Education.
Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011). Using social network metrics
to assess the effectiveness of broad-based admission practices. Australasian Journal of
Educational Technology, 27(1), 16-27.
Dawson, S., McWilliam, E. & Tan, J.P.L. (2008). Teaching smarter: How mining ICT data can inform
and improve learning and teaching practice. In Hello! Where are you in the landscape of
educational technology? Proceedings ascilite Melbourne 2008.
http://www.ascilite.org.au/conferences/melbourne08/procs/dawson.pdf
De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M., & Cannavacciuolo, L. (2011). Discourse-
centric learning analytics. In proceedings of the 1st International Conference on Learning
Analytics and Knowledge, Banff, Alberta, Canada.
Elias, T. (2011). Learning Analytics: Definitions, Processes and Potential.
http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
Falconer, I., Beetham, H., Oliver, R., Lockyer, L., & Littlejohn, A. (2007). Mod4L final report:
Representing learning designs. Final project report for Joint Information Systems Committee
(JISC) Design for Learning program, Glasgow: Glasgow Caledonian University.
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges.
(Technical report KMI-12-01). Knowledge Media Institute, Open University UK
http://kmi.open.ac.uk/publications/techreport/kmi-12-01.
Fritz, J. (2010). Classroom walls that talk: Using online course activity data of successful students to
raise self-awareness of underperforming peers. Internet and Higher Education,
doi:10.1016/j.iheduc.2010.07.007.
Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter
for awareness and self-reflection. In Proceedings of the 2012 ACM annual conference extended
abstracts on Human Factors in Computing Systems Extended Abstracts, Austin, Texas, USA.
Greller, W., & Drachsler, H. (In press). Translating learning into numbers: A generic framework for
learning analytics. Educational Technology and Society.
Heathcote, E. (2006). Learning Design Templates - a Pedagogical Just-in-Time Support Tool. In G.
Minshull & J. Mole (Eds.), Designing for Learning: The proceedings of Theme 1 of the JISC
Online Conference: Innovating e-Learning (pp. 19-26). UK: JISC.
Jones, J., Bennett, S., & Lockyer, L. (2011). Applying a learning design to the design of a university
unit: A single case study. In T. Bastiaens & M. Ebner (Eds.), Proceedings of World Conference
on Educational Multimedia, Hypermedia and Telecommunications (pp. 3340-3349).
Chesapeake, VA: AACE.
Jovanović, J., Gasevic, D., Brooks, C., Devedzic, V., Hatala, M., Eap, T., & Richards, G. (2007). Using
Semantic Web Technologies to Analyze Learning Content. IEEE Internet Computing, 11(5),
45-53.
King Alexander, F. (2000). The Changing Face of Accountability: Monitoring and Assessing
Institutional Performance in Higher Education. The Journal of Higher Education, 71(4), 411-
431.
Koper, R. (2006). Current research in learning design. Educational Technology & Society, 9(1),
13-22.
Lauría, E.J.M., Baron, J.D., Devireddy, M., Sundararaju, V., & Jayaprakash, S.M. (2012). Mining
academic data to improve college student retention: an open source perspective. Paper
presented at the LAK 12: 2nd International Conference on Learning Analytics and Knowledge,
Vancouver, Canada.
Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational
technology. London: Routledge.
Leony, D., Pardo, A., de la Fuente Valentín, L., Sánchez de Castro, D., & Delgado Kloos, C. (2012).
GLASS: A learning analytics visualization tool. In Proceedings of the 2nd
International Conference on Learning Analytics and Knowledge, Vancouver, British Columbia,
Canada.
Liaqat, A., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a
learning analytics tool. Computers & Education, 58(1), 470-489.
Macfadyen, L., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for
educators: A proof of concept. Computers & Education, 54(2), 588-599.
Masterman, E. (2009). Activity theory and the design of pedagogic planning tools. In L. Lockyer, S.
Bennet, S. Agostinho & B. Harper (Eds.), Handbook of Research on Learning Design and
Learning Objects Issues Applications and Technologies (Vol. 1, pp. 209-227). Hershey, PA:
Information Science Reference.
Mazza, R., & Dimitrova, V. (2007). CourseVis: A graphical student monitoring tool for supporting
instructors in web-based distance courses. International Journal of Human-Computer Studies,
65(2), 125-139.
McAndrew, P., & Goodyear, P. (2007). Representing practitioner experiences through learning design
and patterns. Rethinking pedagogy for a digital age, 92-102.
Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. (2008). Action analytics: Measuring and
improving performance that matters in higher education. EDUCAUSE Review, 43(1), 42-67.
Pardo, A., & Kloos, C.D. (2012). Stepping out of the box. Towards analytics outside the Learning
Management System. In Proceedings of the 2nd International Conference on Learning Analytics
and Knowledge, Vancouver, British Columbia, Canada.
Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.