THE ROLE OF LEARNING ANALYTICS
IN FUTURE EDUCATION MODELS
CONTENTS
EXECUTIVE SUMMARY
1.0 INTRODUCTION
2.0 IMPORTANCE OF DATA-INFORMED APPROACHES FOR EDUCATION
3.0 LEARNING ANALYTICS OVERVIEW
3.1 What are learning analytics?
3.2 How can learning analytics help?
3.3 Learning analytics case studies and players
3.4 Openness and learning analytics
4.0 EXAMPLES AND IMPACT OF LEARNING ANALYTICS
5.0 THE CONTEXT FOR LEARNING ANALYTICS AND FUTURE EDUCATION MODELS
6.0 CONCLUSIONS
7.0 REFERENCES
EXECUTIVE SUMMARY
The use of data analysis to guide the design and deployment of learning experiences is finally reaching widespread adoption. Educational institutions at all levels are realising that, with the help of technology, they are collecting data that could be very valuable when properly analysed, aligned with learning outcomes, and integrated into a tighter feedback loop with stakeholders. But this vision is not exempt from hurdles. Educational institutions at all levels need to find ways to make the best use of data, integrate it into their day-to-day operations, and nurture a cultural change across their institutions. In this paper we provide an overview of current initiatives in the area of learning analytics, how these developments can be used in conventional learning experiences, and their potential benefits.
1.0 INTRODUCTION
Data is ubiquitous and is being used
in an increasing number of disciplines.
Education is not an exception.
On the contrary, the use of technology
to mediate interactions among
stakeholders allows access to detailed
lists of events occurring in a learning
experience. This wealth of data offers
an unprecedented opportunity to
achieve sustained improvements and
reflection on teaching and learning
practice. But there are numerous
barriers to realising this potential.
Leaders of educational institutions
need to be aware of the complex
ecosystem emerging around the
notion of analytics. Institutions need
to be aware of the right tools and
methods, and have the know-how to
apply them to detect how learning is
occurring and how it can be improved.
Learning analytics is opening a new
space for innovation that can provide
teachers, principals, parents and
students with fast feedback about
learning processes. This capacity is
the basis for reshaping educational
models, but for it to become a reality
we need to identify the precise role
of learning analytics and how it can
influence current and future
education models.
The rest of this document first
describes the importance of data-
informed approaches in the area of
education. It then details the current
state-of-the-art and emerging
initiatives in the area of learning
analytics. The paper concludes with
a description of models and initiatives
in the area of K-12 education, and the
possible models that will emerge in
the future.
2.0 IMPORTANCE OF DATA-INFORMED APPROACHES FOR EDUCATION
The concept of data for informing
teaching and learning practice is
not a new phenomenon for the
education sector. However, while
education has long been awash with
data, its strategic application for
understanding the student learning
process and developing scalable
personalised interventions has to
date been less effective. For instance,
there is a mass of data related to
benchmarking performance and
expenditure at the international (e.g.
OECD; PISA tests) and national (e.g.
NAPLAN) levels. Yet the relationship
between these data and findings
is often far removed and loosely
connected to the day-to-day teaching
operations that occur within a school.
These forms of data are essentially
targeted towards the development,
or assessment, of federal and state
education policy. This is not to say that
such benchmarking and standardised
assessment measures do not offer
broad value for individual schools.
Rather, these data and forms
of measurement need to be further
integrated and nuanced in order to
establish targeted insights that can
effectively promote and guide the
development of quality teaching and
learning practice. As the stakeholders
change so too do the types and
granularity of data required to identify
areas of high performance and
areas requiring further support and
development. Ultimately, the goal of any evaluation and quality assurance process is to improve practice – in this case, by demonstrating improvements in student learning or operational efficiencies. Without entering into much debate, let’s assume this is important.
Over the past decade there has
been a growing paradigm shift in
education, from the collection of
data for compliance purposes to the
application of data for continuous
improvement. This shift in emphasis
has been promoted through the
concept of data-informed or
evidence-based decision-making.
Data-informed decision-making in
education involves the collection
and analysis of data to provide
actionable insights into teaching and
learning practice (Mandinach, 2012;
Mandinach and Jackson, 2012). As
with many organisations embracing
data-informed practices, the difficulty
for the education sector lies not in
the availability of student or school
data, but how such data is made
available to key stakeholders to enable
decision-making for improving student
learning. The diversity and volume
of data and information required is
largely dependent on an individual’s
role and responsibilities in the school.
For instance, school leaders will
require aggregated data sets relating
to standardised assessment scores
commonly benchmarked against
similar schools or data that assesses
the impact of specific school-based
strategies. Data and information on
pedagogical approaches, curriculum,
student assessment, remedial
interventions, disciplinary actions,
professional development, and level of
integration of education technologies
alongside the cost-effectiveness
of support resources or technical
infrastructure all form a subset of
an education leader’s arsenal of
resources to help provide insight and
direction for setting school policy and
strategy. Similarly, teachers require
data related to student performance,
learning development and the
effectiveness of adopted curriculum
and learning strategies.
Although there is a growing emphasis
on data-informed approaches in
education, there is a long-held
perception that teachers commonly
make decisions based on their
past experience and intuition. The
notion is that effective teachers
are watchful and empathetic. They
can react quickly to the individual
and class cues that may suggest
understanding or misconceptions,
apprehension or over-confidence.
This ability is accrued through years of
teaching experience. But while these
perceptions may or may not fall into
the category of urban myth, there is
no debate regarding the impact of
quality teaching on student learning.
The applied use of student learning
data in this context can only further
assist teachers and help improve
practice. Developing into a quality teacher should not have to depend solely on accruing many years of experience. Knowledge
of data analytics and understanding
how student learning data manifests
through curriculum strategies and
relates to an individual’s learning
progression are fundamental skills
for the 21st Century teacher. Teaching
as a “science” through rigour and
purpose can only enhance the practice
of the “art”.
Despite the growing need for teachers
to adopt more data-driven approaches
to their practice, there is evidence
to suggest that the profession is not
equipped to make optimal use of the
full suite of data that is now available.
The sub-optimal application of data-
informed teaching and learning stems
from several important variables.
The promotion of data usage is influenced by the timeliness and perceived usefulness of the data itself to address an identified issue or question. When considering the types of data commonly collected in education, there is justifiable reason why teachers are reticent in their uptake.
Education data is commonly lag data.
When developing long-term strategic
goals for a school, a leader’s access
to lag data through standardised
tests, attrition or enrolment rates,
for example, can be considered
applicable and fit for purpose.
However, when considering student
learning progression, such data is no
longer of assistance. It is too late to
identify a difficulty in student learning
through the lens of a formative
assessment. When promoting student self-regulated learning, the immediacy, personalisation and scaffolding of feedback are critical. As such there is
a need to establish more appropriate
digitally-enabled processes and
practices that can provide teachers and
learners with more immediate data.
The growing uptake of education
technologies across all facets of
education, from K-12 to tertiary,
affords novel opportunities to
provide nuanced, contextualised
data sources that can address the
need for just-in-time feedback. The
field of learning analytics is galvanising these ingredients into a discipline in which multidisciplinary work from the areas of pedagogy, technology and learning is combined to provide a
much deeper insight into how learning
occurs and how it can be better
supported by all stakeholders in the
ecosystem, particularly the learner.
3.0 LEARNING ANALYTICS OVERVIEW
3.1 What are learning analytics?
The rapid adoption of education
technologies – such as learning
management systems, education
apps, social media, lecture capture,
student information systems and
mobile computing – has allowed for
collecting unprecedented amounts
of big data about learning, teaching
and institutional processes (Baer
and Campbell, 2012; Macfadyen
and Dawson, 2010; Siemens and
Long, 2011). As with any information
technology, user interactions with
the technology create digital traces
that can be recorded and stored in
databases. These traces can be about
the content students accessed in
the courses they took, the messages
posted or viewed on discussion
boards, classmates they took classes
with or interacted with on discussion
boards, or keywords used while
searching for learning resources in
institutional systems. These traces
can automatically be “mined” to
identify patterns in the educational
technology use by learners. The
patterns are then used to understand
the learning processes underlying
the use of technology and inform
teaching practice, emerging learning
models and institutional processes.
This process has been described as learning analytics.
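To make this mining step concrete, the short sketch below aggregates raw trace events into per-student interaction patterns. It is a minimal illustration only: the file name lms_events.csv, the column names and the event types are assumptions invented for the example, not a reference to any particular learning management system.

```python
import pandas as pd

# Minimal sketch: summarise raw LMS trace data into per-student activity
# patterns. Assumes a hypothetical export with one row per logged event:
# student_id, timestamp, event_type (e.g. "content_view", "forum_post").
events = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])

# Count each kind of interaction per student.
patterns = (
    events.groupby(["student_id", "event_type"])
          .size()
          .unstack(fill_value=0)
)

# Add a simple engagement indicator: number of distinct active days.
patterns["active_days"] = (
    events.groupby("student_id")["timestamp"]
          .apply(lambda ts: ts.dt.date.nunique())
)

print(patterns.head())
```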
More formally, the Society for Learning
Analytics Research (SoLAR) (Long,
Siemens, Conole and Gašević, 2011)
defines learning analytics as “the
measurement, collection, analysis and
reporting of data about learners and
their contexts, for purposes of
understanding and optimising learning
and the environments in which it
occurs”. This definition emphasises
the nature of learning analytics as an
evidence-based discipline – similar to
medicine. That is, learning analytics
use rigorous research methods to understand problems important for the practice of education in order to
produce evidence that can inform
the decision-making process of the
key stakeholders in the learning
ecosystem – from students and
parents to instructors, deans,
university presidents, governments
and other private and public
sector organisations.
3.2 How can learning analytics help?
The use of analytics and data analysis
is not new in education. In the past,
this was typically driven by the needs
of the education sector to support
data-driven decision-making and
planning (Baker and Yacef, 2009).
In this process, there have been
numerous reports of the use of
approaches from different disciplines
such as business intelligence, web
analytics, data mining, predictive
modelling, academic analytics and
educational data mining (Romero and
Ventura, 2010). More recently, with the growth of the field of learning analytics, its roots have become more deeply connected with established research areas (Ferguson, 2012) such as statistics, text mining, social network analysis, machine learning, human-computer interaction, the learning sciences, and educational, cognitive and social psychology.
The multidisciplinary nature of
learning analytics and the availability
of big data in education offer
numerous opportunities to answer old,
difficult questions about education
and offer solutions to the pressing
challenges of the education sector. For
example, timely feedback to learners has been shown to be one of the most powerful ways to enhance learning
and learners’ academic success
(Hattie and Timperley, 2007). However,
large classes – even with “only” tens
of students – could hardly allow for
individualised feedback for each
and every student. Much too often,
learners would need to wait for the
mid-term exam or first assessment
points to receive any feedback about
their progress. For many, mid-term is
already too late and the opportunity
to intervene is long gone. By tracing
and analysing the data about learners’
activities and visualising results
through actionable dashboards (Duval,
2011), learning analytics can offer
“real-time” feedback about learners’
progress and give the learners hints
if they are off track and tips on how
to enhance their learning.
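The kind of “off-track” check that might sit behind such a dashboard can be sketched in a few lines. This is an illustrative toy rather than any product’s logic: the activity table, its field names and the 50%-of-median threshold are all assumptions made for the example.

```python
import pandas as pd

# Illustrative sketch of an "off-track" check behind a progress dashboard.
# Assumes a hypothetical table of per-student weekly activity counts;
# the 50%-of-median threshold is an arbitrary choice for the example.
activity = pd.DataFrame({
    "student_id": ["s1", "s2", "s3", "s4"],
    "week_events": [42, 7, 55, 3],
})

median_events = activity["week_events"].median()

for row in activity.itertuples():
    if row.week_events < 0.5 * median_events:
        # In a real system this would feed a dashboard widget or a
        # notification, not a print statement.
        print(f"{row.student_id}: below typical activity "
              f"({row.week_events} events vs class median {median_events:.0f}); "
              "suggest revisiting this week's materials.")
```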
3.3 Learning analytics case studies
and players
Student retention is the best
established application of learning
analytics, with Purdue Signals as
probably the best-known showcase
and success story for the new field (Pistilli, Arnold and Bethune,
2012). Purdue Signals is a software
system that uses data about the interaction of learners with the Blackboard learning management system. These trace data are then analysed to build predictive models of the academic standing of learners, such as whether they are at risk of not completing the course successfully, whether they are on track with course expectations, or whether they need to study more or require additional instructional and/or institutional support. The system then applies the metaphor of traffic lights – green, yellow and red – to
present the predicted academic
standing to students and instructors.
Deployed at Purdue University since 2007, Purdue Signals has been confirmed as a sound solution for increasing student retention
(Arnold and Pistilli, 2012).
For example, students who took at
least one course in year one in 2007
with Purdue Signals had a retention
rate of 97%, whereas the retention
rate for those that did not use Purdue
Signals was 83%. This difference
was much bigger in year four in 2010,
when the students who took at least
one course with Purdue Signals had a
retention rate of 93%, whereas those
who did not had a retention rate of
69%. Moreover, 89% of the students
who used Purdue Signals reported an
overall positive experience, while 59%
of the students expressed their wish
to use the system in each course
they take.
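A toy version of the traffic-light indicator described above is sketched below. It is emphatically not Purdue Signals’ actual model: the features (logins, content views, forum posts), the tiny training set and the probability thresholds are invented for illustration, and a logistic regression merely stands in for whatever predictive model a real deployment would use.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sketch of a traffic-light risk indicator in the style described
# above. This is NOT Purdue Signals' actual model; features, training
# data and thresholds are all invented for illustration.
# Features per student: [logins, content_views, forum_posts].
X_train = np.array([[30, 120, 10], [25, 90, 6], [5, 10, 0],
                    [2, 8, 1], [18, 60, 4], [1, 3, 0]])
y_train = np.array([1, 1, 0, 0, 1, 0])  # 1 = passed, 0 = did not pass

model = LogisticRegression().fit(X_train, y_train)

def traffic_light(features):
    """Map predicted pass probability to a green/yellow/red signal."""
    p_pass = model.predict_proba([features])[0, 1]
    if p_pass >= 0.7:
        return "green"
    if p_pass >= 0.4:
        return "yellow"
    return "red"

print(traffic_light([4, 12, 0]))    # low activity: likely "red" or "yellow"
print(traffic_light([28, 100, 8]))  # high activity: likely "green"
```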
Changing students’ learning habits and
behaviour often requires much more
than just showing visual dashboards.
This was well supported in Purdue Signals, which prompted instructors to send personalised emails to the students at risk (Tanes, Arnold, King
and Remnet, 2011). Similarly, the
University of Michigan developed
a system called E2Coach (Expert Electronic Coaching) by building on
the experience of behavioural change
accumulated in public health research
(Wright, McKay, Hershock, Miller and
Tritz, 2014). E2Coach takes data about
students’ grades on previous courses,
survey data about the students’
intentions behind the enrolment in a
course and other socio-psychological
factors. These data are combined
with the data about the students’
interaction with online learning
systems and progression throughout
a semester. The students then receive
tailored messages with advice on how
to proceed further with their learning.
Unlike other solutions that focus
on students at risk, E2Coach offers
advice to advanced (grade A) students
on how to enhance their learning.
Studies show that students’ grades improve with increased use of E2Coach.
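The message-tailoring idea can be illustrated with a few simple rules, as in the sketch below. The rules, the student fields (goal_grade, current_grade) and the wording are invented for the example; E2Coach’s actual tailoring logic is richer and grounded in behavioural-change research.

```python
# Illustrative sketch of rule-based message tailoring in the spirit of
# the coaching systems described above. The rules, fields and wording
# are invented for the example, not taken from E2Coach.
def tailor_message(student):
    goal = student["goal_grade"]
    current = student["current_grade"]
    if current >= goal and current >= 3.7:
        return ("You're ahead of your goal. Try the extension problems "
                "to stretch further.")
    if current < goal - 0.5:
        return ("You're below the grade you said you wanted. Students in "
                "your position often benefit from weekly review sessions.")
    return "You're on track. Keep up your current study routine."

student = {"goal_grade": 4.0, "current_grade": 3.2}
print(tailor_message(student))
```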
Learning analytics and big data go
beyond assisting students within a
single course. Many institutions are
trying to find solutions to improve
career planning of students by
offering timely advice. For example,
eAdvisor1 is a system implemented
at Arizona State University to offer
personalised support to students in
course selection and planning their
degree programs. eAdvisor allows
students to get insight into possible
paths they may take throughout
their degree majors along with the
main requirements they must meet
to be able to graduate. Whenever a
student enrols in a particular course,
eAdvisor evaluates the progression
towards their degree and generates
feedback about the implications of
such a decision (e.g. it will take
them more time to complete their
degree). The use of eAdvisor reportedly increased the number of students on track in their degree program from 22% in 2007 to 91% in 2010 (Jarret, 2013). Another outcome of the implementation of eAdvisor was an 8% increase in retention in first year.
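The degree-progress check described above can be caricatured in a few lines. The requirement set, the per-semester course load and the feedback wording below are assumptions made for the example, not eAdvisor’s actual audit logic.

```python
# Toy sketch of the kind of degree-progress check described above.
# Requirement names and the completed-course set are invented; this is
# not eAdvisor's actual logic.
REQUIRED = {"MATH101", "MATH102", "CS101", "CS201", "STAT210"}

def progress_feedback(completed, per_semester=2, semesters_left=3):
    remaining = REQUIRED - set(completed)
    needed_semesters = -(-len(remaining) // per_semester)  # ceiling division
    if needed_semesters > semesters_left:
        return (f"{len(remaining)} requirements remain; at {per_semester} "
                "per semester you will need longer than planned to graduate.")
    return f"On track: {len(remaining)} requirements remain."

print(progress_feedback({"MATH101", "CS101"}))
```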
Desire2Learn’s Degree Compass2 is another example similar to eAdvisor. Degree Compass recommends courses for students to take in future semesters by using predictive models built from historical data about previous
course completion. According to
Desire2Learn, an implementation
of Degree Compass at Austin Peay
State University showed that Degree
Compass was able to predict 92% of
the total grades accurately and 90%
of the passing grades (Desire2Learn,
2012, p. 2). Moreover, in the period Fall
2010 – Fall 2012, an increase in grades
A, B and C of 1.4% was reported
(i.e. a remarkable shift of 5.3
standard deviations).
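A heavily simplified sketch of grade-based course recommendation follows. The toy data and the “mean historical grade” heuristic are assumptions for illustration; Degree Compass builds far richer predictive models from each student’s full record.

```python
import pandas as pd

# Toy sketch of grade-based course recommendation in the spirit of
# systems like Degree Compass. The data and the "mean historical grade"
# heuristic are invented for illustration.
history = pd.DataFrame({
    "student": ["a", "a", "b", "b", "c", "c"],
    "course":  ["CS101", "CS201", "CS101", "STAT210", "CS201", "STAT210"],
    "grade":   [3.5, 3.0, 2.5, 3.8, 3.2, 2.9],
})

def recommend(candidate_courses, top_n=2):
    # Predict each candidate course's grade as its historical mean grade,
    # then recommend the courses with the highest predicted grade.
    predicted = (history[history["course"].isin(candidate_courses)]
                 .groupby("course")["grade"].mean())
    return predicted.sort_values(ascending=False).head(top_n)

print(recommend(["CS201", "STAT210"]))
```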
The recognition of the importance of data-driven decision-making and planning by the education sector worldwide has created a marketplace for
learning analytics solutions. It has
become a standard requirement for
the vendors of learning management
systems to provide some form of
learning analytics. Most leading vendors of learning management systems have solutions – such as Blackboard Analytics3 and Desire2Learn Insights4 – that offer learning analytics based on the students’ interaction with their systems.
Moreover, specialised learning
analytics providers – such as Civitas
Learning5 and Knewton6 – have been
founded recently. A good example of
the impact of the availability of such
solutions is the partnership between
Arizona State University and Knewton.
By using Knewton’s Adaptive Learning
Platform, they created a customised
learning experience for about 5,000 students who were taking a remedial mathematics course (Upbin, 2012).
After one semester, the effect of this
solution was the reduction in the
dropout rates from 13% to 6%, and
the increase in the number of passing
students from 66% to 75%, with half of
the students successfully completing
the course almost a month early.
3.4 Openness and learning analytics
There are considerable efforts to promote openness in the field of learning analytics. Open platforms for learning analytics and learning analytics solutions for open educational resources are at the centre of these efforts. The Society for
Learning Analytics Research (SoLAR)
published a white paper (Siemens et
al., 2011), which outlined an integrated
and modularised platform for open
learning analytics (OLA). The primary
goal of the SoLAR OLA platform is to
provide a set of requirements to guide the design, implementation
and evaluation of open platforms,
which can integrate heterogeneous
learning analytics techniques. The
core components of the SoLAR
OLA platform consist of a learning
analytics engine, adaptive content
engine, intervention engine (supports
recommendations and automated
feedback provisioning) and reporting
(dashboards and visualisation). The
platform has received considerable
attention. A good example is edX Insights7 – a learning analytics framework from edX, a leading MOOC provider – which offers an open-source solution that can easily be integrated with existing online learning systems. In this way, opportunities for adoption of learning analytics solutions are
increased for institutions that either
prefer open-source or cannot afford
commercial solutions. Moreover, it allows institutions more flexibility in creating customised learning analytics solutions that best fit their needs.
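The modular intent of such a platform can be sketched structurally, as below. The component names echo the SoLAR white paper’s list, but the Python interfaces and the trivial implementations are invented for illustration and cover only two of the four components (the analytics and intervention engines).

```python
from typing import Protocol

# Structural sketch of the modular open-platform idea. Only the analytics
# and intervention engines are shown; the interfaces are invented for
# illustration and are not part of the SoLAR specification.
class AnalyticsEngine(Protocol):
    def analyse(self, events: list[dict]) -> dict: ...

class InterventionEngine(Protocol):
    def intervene(self, analysis: dict) -> list[str]: ...

class SimpleAnalyticsEngine:
    def analyse(self, events: list[dict]) -> dict:
        # Reduce raw trace events to a per-student activity count.
        counts: dict[str, int] = {}
        for event in events:
            counts[event["student"]] = counts.get(event["student"], 0) + 1
        return counts

class SimpleInterventionEngine:
    def intervene(self, analysis: dict) -> list[str]:
        # Suggest contacting students whose activity falls below a toy threshold.
        return [f"Contact {student}" for student, n in analysis.items() if n < 3]

# Engines are interchangeable as long as they honour the interfaces.
events = [{"student": "s1"}, {"student": "s1"}, {"student": "s2"}]
analysis = SimpleAnalyticsEngine().analyse(events)
print(SimpleInterventionEngine().intervene(analysis))  # ['Contact s1', 'Contact s2']
```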
Carnegie Mellon University’s (CMU)
Open Learning Initiative (OLI)8 is a
platform with numerous open courses
publicly available for institutions to
use in a so-called “blended learning”
mode (a mix of face-to-face and online
instruction) (Lovett, Meyer and Thille,
2008). The initiative includes a Learning Dashboard, which is an
advanced learning analytics support
system built upon the cutting-edge
research in the learning sciences.
Learning Dashboard processes
data about student activities and
assessments and generates reports
about students’ standing for course
instructors. The use of CMU OLI and its learning analytics was validated through a series of randomised control trials across six public institutions.
The results of the randomised control
trials revealed that the students in
the blended learning format are not
disadvantaged “for this mode of
instruction in terms of pass rates,
final exam scores, and performance
on a standardised assessment of
statistical literacy” (Bowen, Chingos,
Lack and Nygren, 2012). Moreover,
in the discussion of the randomised
control trials results, it was concluded
that the adoption of blended models
of instruction “in large introductor y
courses have the potential to
significantly reduce instructor
compensation costs in the long run”.
1 https://eadvisor.asu.edu
2 http://www.desire2learn.com/products/degree-compass
3 http://www.blackboard.com/platforms/analytics/overview.aspx
4 http://www.desire2learn.com/products/insights
5 http://www.civitaslearning.com
6 http://www.knewton.com
4.0 EXAMPLES AND IMPACT OF
LEARNING ANALYTICS
The examples described in the
previous section show a clear
advantage of using data to tackle
well-known issues in the context of
post-secondary education. In the
context of K-12, the problem remains
the same: can data be used to drive
decision-making about the quality of
the learning experience? The same
issues of collecting and processing
data to increase the knowledge of the
problems are present in primary and
high schools. In these environments
the challenge still remains to create
the right environment at different
levels within institutions to increase
awareness of the advantages of using
data, and then foster the skills to
analyse such data and transform
it into actionable insight.
At the government levels, current tests
are being used as a mechanism to
promote this sustained improvement
(e.g. the National Assessment Program – Literacy and Numeracy, NAPLAN, in Australia). However, learning analytics
postulates that data should be used
to provide decision support with an
element of immediacy. In other words, rather than deploying an assessment mechanism, processing the results and obtaining actionable insights over a period of months, this procedure should aim to be almost immediate.
This use of data poses a true paradigm
shift for stakeholders – namely, school
management, teachers, parents and
students.
Moody and Dede (2008) classified the use of data, according to the actions derived from it, into data for accountability, improvement and reflection. Currently, various assessment instruments, mostly introduced by government institutions, use data for accountability. School performance is now being equated with the results of various tests (e.g. NAPLAN). However, it is difficult for stakeholders to derive a clear vision of actionable items from the results of these tests, and therefore their impact is not clear.
Using data for improvement places
the emphasis on specic changes
that need to be measurable. In this
type of scenario, data is analysed and
immediately translated into actions
that are deployed in learning scenarios
seeking to improve their effectiveness.
Hamilton et al. (2009) have suggested the types of queries that administrators and teachers can answer with the help of data.
Administrators may use data to detect
the areas with the greatest need
for improvement and the resources
that can be allocated to them most
effectively. They can also study how
the curriculum is addressing the
boundary conditions imposed by
governments. Data can also be used to gain insight into the effectiveness of different teachers and the most appropriate professional development opportunities that need to be provided.
Teachers, on the other hand, may use
data to gain insight into how students
are learning and their main strengths
and weaknesses, based on fine-grained evidence such as the level of proficiency in computer-based assessments, engagement with activities, etc.
Additionally, teaching staff may use
data to monitor changes introduced
into activities. Data may provide a
robust framework to perform rigorous
comparisons of instruction methods
and detect the most adequate ones
for different student profiles. However, these potential scenarios are already posing some difficulties, as described in case studies in the context of physical education (Dauenhauer, 2014). Other recent studies
document different levels of adoption
of data-driven decision-making and the need to address the tensions created by the adoption of these techniques (Hubbard, Datnow and Pruyn, 2014). Additionally, when facing
comprehensive adoption in a school,
there are usually issues related to
the availability of data and the skills
required from the staff (Rainey, 2013).
The suggestion is to promote the vision of the teacher as a researcher who has the skills and interest to embrace a sustained improvement mindset.
A possible strategy to close this gap is to offer professional development opportunities that provide staff with the right tools and knowledge to manipulate data and build awareness of the benefits derived from its use (Staman, Visscher and Luyten, 2014).
The other two stakeholder groups, parents and students, together with teachers, may benefit from the third category of data use, in which the main focus is reflection. In the case of teachers, reflective activities can
be added systematically across the
year among teams to collaboratively
analyse the data and comment on
the possible explanations or actions
to be derived. A similar scenario is
feasible for the case of parents and
students. Introducing systematic reflection on the data collected and provided has the potential to give parents insight into factors that help them participate more actively in their child’s education (Bauer, 2015).
Learning analytics in the K-12 space is yet to fulfil its potential. Principals and management teams need to be aware of the possibilities of data-driven decision-making and deploy the appropriate technical and non-technical resources for it to be adopted at
an institutional level. They have to
embrace data to drive decisions
related to school performance,
teaching effectiveness and resource
allocation. This vision then needs to be articulated so that teachers adopt a
systematic use of data for immediate
feedback and potentially adjustments
in their day-to-day activities. Teachers
need to become data savvy to be able
to orchestrate how data should be
collected, analysed and turned into
actionable initiatives. Parents and students should be taken into account when articulating this vision so that they become accustomed to receiving and reflecting on the information provided, as well as exploring ways to increase their engagement in a learning experience.
7 http://github.com/edx/insights
8 http://oli.cmu.edu
5.0 THE CONTEXT FOR LEARNING
ANALYTICS AND FUTURE
EDUCATION MODELS
There is no doubt that the trend
towards delivering more personalised
learning experiences for students
of all ages has strengthened, both
nationally and globally, over the last
10-15 years. Charles Leadbeater, in his 2008 paper What’s Next? 21 Ideas for 21st Century Learning, articulated
a number of shifts in learning features
including, for example, the shift
away from institution-led learning
to anywhere, anytime learning as
well as the break from traditional
teacher-as-knowledge-custodian
to learner as agent and maker,
contributing to their own learning.
In its education thought leadership
paper series, Telstra has built on
the work of Leadbeater and others
to postulate the elements of best
practice that support personalised
learning and emerging 21st-Century
pedagogies. The series, commissioned
by Telstra’s Education Roundtable,
has been written in close consultation
with education stakeholders and
serves as an accurate record of
education issues and challenges
as well as benchmarking quality
examples of best practice in delivering
personalised learning experiences
in localised environments.
The challenge beyond the local was,
and remains: how do you scale up
service delivery for personalised
learning?
Telstra responded to the challenge
by coming up with a few ideas.
These were presented and discussed
with education customers, resulting
in an innovative service that is able
to deploy a personalised learning
ecosystem to the smallest school or
a university with campuses across
many locations, as well as to the world
of corporate learning. Moreover, the
model enables a globalised approach
to personalised learning.
The model is called Workspace X and
is a cloud-based personalised learning
ecosystem that liberates learners
from the four walls of the classroom,
provides access to quality learning
experiences and a wide range of state-
of-the-art digital tools that promote
21st-Century learning styles. It is also
an affordable and effective solution
for the vast numbers of disadvantaged
and remote students across the globe
who are still struggling to access
quality learning.
In a single sign-on, secure environment, learners can access enterprise-grade
education apps that cover the public
and private domains. They can also
access a range of collaboration
tools, digital learning content, video
streaming services and communicate
using their choice of social media
and networking tools. Workspace X
also integrates with your choice of
learning management system, student
information system and content
management system. Workspace X users truly begin to take agency over their learning while still benefiting from interactions and guidance
provided by quality teachers. The
ecosystem can be accessed from any
device, any time and also has plenty of
space for the specially-selected digital
content of your choice.
To round out the ecosystem and
bring personalised learning to life,
Workspace X also offers complete
integration with high-order learning
analytics. The learning analytics
package provides a dynamic
360-degree view of learning
progression displayed as an easy-
to-read dashboard. Now quality
mobile educational experiences are
brought to learners with the rigour of
benchmarked evidence of learning
progress and growth.
6.0 CONCLUSIONS
The possibilities to use data to improve
current educational models are here.
However, the distance from simple
data collection to effective data-based
design support is not small. Although
the number of institutions at different
levels embracing learning analytics is
increasing, widespread adoption still
requires an effort to change the
mentality towards the use of data,
facilitate data integration, target
analytics to the true needs of the
stakeholders, and make the process
robust from the point of view of
ethics and privacy. In this paper we have presented a brief overview of the state of the art and the main initiatives currently shaping the learning analytics landscape. A slow but constant shift towards a culture in which data analysis is used to support decision-making at multiple levels is under way. This shift has to translate into
an overall more effective learning
experience for students, parents,
teachers, managers and governments.
This report was written by
Abelardo Pardo, Shane Dawson,
Dragan Gaš ević and Susi Steigler-Peter s
7.0 REFERENCES
Arnold, K. E., and Pistilli, M. D. (2012). Course Signals at
Purdue: Using Learning Analytics to Increase Student
Success. In S. Buckingham Shum, D. Gašević, and R.
Ferguson (Eds.), International Conference on Learning
Analytics and Knowledge (pp. 267–270). ACM Press.
Baer, L., and Campbell, J. P. (2012). From Metrics to Analytics,
Reporting to Action: Analytics’ Role in Changing the Learning
Environment (pp. 53–65). EDUCAUSE. Retrieved from
http://net.educause.edu/ir/library/pdf/pub7203.pdf
Baker, R. S. J. D., and Yacef, K. (2009). The State of
Educational Data Mining in 2009: A Review and Future
Visions. Journal of Educational Data Mining, 1(1), 3–17.
Bauer, S. (2015). Parents’ Perceptions of School
Accountability: A Case Study of an Urban School District.
San Diego State University.
Bowen, W. G., Chingos, M. M., Lack, K. A., and Nygren, T. I. (2012). Interactive Learning Online at Public Universities: Evidence from Randomized Trials. Ithaka S+R. Retrieved from http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf
Dauenhauer, B. D. (2014). Data-driven Decision Making
in Physical Education: A Case Study. The University of
Texas at Austin. Retrieved from http://repositories.lib.
utexas.edu/bitstream/handle/2152/24745/DAUENHAUER-
DISSERTATION-2014.pdf?sequence=1
Desire2Learn. (2012). Desire2Learn Client Success Story:
Austin Peay State University. Retrieved from http://content.
brightspace.com/wp-content/uploads/Desire2Learn_
Success_Story-Degree-Compass-APSU.pdf
Duval, E. (2011). Attention please!: Learning Analytics
for Visualization and Recommendation. In International
Conference on Learning Analytics and Knowledge (pp. 9–17). ACM Press.
Retrieved from http://dl.acm.org/citation.cfm?id=2090118
Ferguson, R. (2012). Learning Analytics: Drivers,
Developments and Challenges. International Journal
of Technology Enhanced Learning, 4(5/6), 304–317.
http://doi.org/10.1504/IJTEL.2012.051816
Hamilton, L., Halverson, R., Jackson, S. S., Mandinach,
E. B., Supovitz, J. A., and Wayman, J. C. (2009). Using Student
Achievement Data to Support Instructional Decision Making
(No. NCEE 2009-4067) (p. 76). Washington, D.C.: National
Center for Education Evaluation and Regional Assistance,
Institute of Education Sciences, U.S. Department
of Education.
Hattie, J., and Timperley, H. (2007). The Power of Feedback.
Review of Educational Research, 77(1), 81–112.
http://doi.org/10.3102/003465430298487
Hubbard, L., Datnow, A., and Pruyn, L. (2014). Multiple
Initiatives, Multiple Challenges: The Promise and Pitfalls
of Implementing Data. Studies in Educational Evaluation,
42(2014), 54–62.
Jarret, J. (2013). Bigfoot, Goldilocks, and Moonshots:
A Report from the Frontiers of Personalised Learning.
EDUCAUSE Review, 48(2).
Kaufman, T. E., Graham, C. R., Picciano, A. G., Popham,
J. A., and Wiley, D. (2014). Data-Driven Decision Making in
the K-12 Classroom. In J. M. Spector, M. D. Merrill, J. Elen,
and M. J. Bishop (Eds.), Handbook of Research on
Educational Communications and Technology
(p. 337–). Springer.
Leadbeater, C. (2008). What’s Next? 21 Ideas for 21st
Century Learning.
Long, P. D., Siemens, G., Conole, G., and Gašević, D. (Eds.). (2011). Proceedings of the First International Conference on Learning Analytics and Knowledge. ACM Press.
Lovett, M., Meyer, O., and Thille, C. (2008). The Open Learning
Initiative: Measuring the Effectiveness of the OLI Statistics
Course in Accelerating Student Learning. Journal of
Interactive Media in Education, 2008(1).
Macfadyen, L. P., and Dawson, S. (2010). Mining LMS data to
develop an “early warning system” for educators: A Proof of
Concept. Computers and Education, 54(2), 588–599. http://
doi.org/10.1016/j.compedu.2009.09.008
Mandinach, E. B. (2012). A Perfect Time for Data Use: Using
Data-Driven Decision Making to Inform Practice. Educational
Psychologist, 47(2), 71–85.
Mandinach, E. B., and Jackson, S. S. (2012). Transforming
Teaching and Learning Through Data-Driven Decision
Making. Corwin.
Moody, L., and Dede, C. A. (2008). Models of Data-Based
Decision Making: A Case Study of the Milwaukee Public
Schools. In E. B. Mandinach and M. Honey (Eds.),
Data-Driven School Improvement. Teachers College,
Columbia University.
Pistilli, M. D., Arnold, K. E., and Bethune, M. (2012).
Signals: Using Academic Analytics to Promote Student
Success. EDUCAUSE Review Online. Retrieved from
http://www.educause.edu/ero/article/signals-using-
academic-analytics-promote-student-success
Rainey, L. R. (2013). What Are You Driving At?: How School
Leaders Use Data When Making School Level Decisions
About Instructional Improvement (PhD Dissertation).
University of Washington.
Romero, C., and Ventura, S. (2010). Educational Data Mining: A Review of the State of the Art. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40(6), 601–618.
Schweer, M. and Steigler-Peters, S. (2011). Personalised Learning: Meeting the Australian Education Challenge.
Siemens, G., Gašević, D., Haythornthwaite, C., Dawson,
S., Buckingham Shum, S., Ferguson, R., and Baker, R. S.
J. D. (2011). Open Learning Analytics: An Integrated and
Modularized Platform (White Paper). Society for Learning
Analytics Research. Retrieved from http://solaresearch.org/
openlearninganalytics.pdf
Siemens, G., and Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5), 31–40.
Staman, L., Visscher, A. J., and Luyten, H. (2014). The Effects
of Professional Development on the Attitudes, Knowledge
and Skills for Data-Driven Decision Making. Studies in
Educational Evaluation, 42(September 2014), 79–90.
Steigler-Peters, S. (2014). mEducation: Mobility Enabling
Personalised Learning.
Tanes, Z., Arnold, K. E., King, A. S., and Remnet, M. A. (2011).
Using Signals for Appropriate Feedback: Perceptions and
Practices. Computers and Education, 57(4), 2414–2422.
http://doi.org/10.1016/j.compedu.2011.05.016
Upbin, B. (2012, February 22). Knewton is Building the
World’s Smartest Tutor. Forbes. Retrieved from
http://www.forbes.com/sites/bruceupbin/2012/02/22/
knewton-is-building-the-worlds-smartest-tutor/
Wasson, L., and Steigler-Peters, S. (2012). Quality Teaching
for Personalised Learning: Leveraging Technology for
Exceptional Results.
Wright, M., McKay, T., Hershock, C., Miller, K., and Tritz, J.
(2014). Better Than Expected: Using Learning Analytics
to Promote Student Success in Gateway Science.
Change: The Magazine of Higher Learning, 46(1), 28–34.