Monitoring, Awareness and Reflection in Blended
Technology Enhanced Learning: a Systematic Review
María Jesús Rodríguez-Triana*
REACT Group, École Polytechnique Fédérale de Lausanne
Station 9, 1015 Lausanne, Switzerland
Email: maria.rodrigueztriana@epfl.ch
*Corresponding author
Luis P. Prieto
CHILI Group, École Polytechnique Fédérale de Lausanne
Station 20, 1015 Lausanne, Switzerland
Email: luis.prieto@epfl.ch
Andrii Vozniuk
REACT Group, École Polytechnique Fédérale de Lausanne
Station 9, 1015 Lausanne, Switzerland
Email: andrii.vozniuk@epfl.ch
Mina Shirvani Boroujeni
CHILI Group, École Polytechnique Fédérale de Lausanne
Station 20, 1015 Lausanne, Switzerland
Email: mina.shirvaniboroujeni@epfl.ch
Beat A. Schwendimann
CHILI Group, École Polytechnique Fédérale de Lausanne
Station 20, 1015 Lausanne, Switzerland
Email: beat.schwendimann@epfl.ch
Adrian Holzer
REACT Group, École Polytechnique Fédérale de Lausanne
Station 9, 1015 Lausanne, Switzerland
Email: adrian.holzer@epfl.ch
Denis Gillet
REACT Group, École Polytechnique Fédérale de Lausanne
Station 9, 1015 Lausanne, Switzerland
Email: denis.gillet@epfl.ch
Abstract: Education is experiencing a paradigm shift towards blended learning
models in technology-enhanced learning (TEL). Despite its potential benefits,
blended learning also entails additional complexity in terms of monitoring,
awareness and reflection, as learning happens across different spaces and
modalities. In recent years, literature on Learning Analytics (LA) and Educational
Data Mining (EDM) has gained momentum and started to address this issue. To
provide a clear picture of the current state of research on the topic and to outline
open research gaps, this paper presents a systematic literature review of the state
of the art of research in LA and EDM on monitoring, awareness and reflection
in blended TEL scenarios. The search included six main academic databases in
TEL, enriched with the proceedings of the workshop on ‘Awareness and
Reflection in TEL’ (ARTEL), resulting in 1089 papers, out of which 40 papers
were included in the final analysis.
Keywords: monitoring, awareness, reflection, blended learning, learning
analytics, educational data mining
Reference to this paper should be made as follows: xxxx (xxxx) ‘xxxx’, xxxx,
Vol. x, No. x, pp.xxx–xxx.
Biographical notes:
María Jesús Rodríguez-Triana is a post-doctoral researcher in the Coordination
and Interaction Systems Group (REACT) at École Polytechnique Fédérale de
Lausanne (EPFL). Her research lines address blended and collaborative learning,
classroom orchestration, learning design, learning analytics, and distributed
learning environments. Currently, she is investigating the synergies between
learning design and learning analytics when using pedagogical approaches such
as inquiry based learning and computer-supported collaborative learning.
Luis P. Prieto is a Marie Curie postdoctoral fellow at the CHILI group in the
Swiss Federal Institute of Technology in Lausanne (EPFL). During his Ph.D. at
the University of Valladolid (Spain), he proposed technologies and conceptual
tools to support teachers in enacting blended collaborative learning activities.
Aside from investigating the use of paper-based learning technologies in schools,
at EPFL he is currently developing models of classroom orchestration by teachers
in face-to-face classrooms, using wearable sensor data (to be used for teachers’
everyday professional reflection).
Andrii Vozniuk is a PhD student in Computer Science in the Coordination and
Interaction Systems group (REACT) at the Swiss Federal Institute of Technology
in Lausanne (EPFL). His research and software engineering activities focus
on interaction systems and visual analytics in the educational and knowledge
management domains. In the course of his PhD studies he has developed a number of
tools visualising students’ activities in order to support awareness and reflection
in blended inquiry learning sessions.
Mina Shirvani Boroujeni is a PhD student in Computer Science in the Computer
Human Interaction for Learning and Instruction group (CHILI) at the Swiss
Federal Institute of Technology in Lausanne (EPFL). Her research focus is on
learning analytics, social networks analysis and learning dashboards to support
awareness and reflection in blended learning settings.
Beat A. Schwendimann is a post-doctoral researcher in the CHILI group at the
Swiss Federal Institute of Technology in Lausanne (EPFL). He conducted his
Ph.D. research as a Fulbright scholar at the University of California, Berkeley
exploring how collaborative knowledge visualization activities, embedded in a
technology-enhanced science learning environment, can foster a more coherent
understanding of biology. At EPFL, he is the coordinator of the Leading House
DUAL-T that develops and implements innovative learning technologies that
support vocational students by bridging the skill gap between what they learn in
school and in their workplaces.
Adrian Holzer is a research associate at École Polytechnique Fédérale de Lausanne
(EPFL) in the Coordination & Interaction Systems Group (REACT). He obtained
his PhD in Information Systems at the University of Lausanne, Switzerland. His
research and teaching activities focus on mobile social media interaction in the
learning and humanitarian contexts.
Denis Gillet received his Ph.D. degree in Information Systems from the Swiss
Federal Institute of Technology in Lausanne (EPFL) in 1995. He is currently
Associate Professor at the EPFL School of Engineering, where he leads the
REACT multi-disciplinary research group. Dr. Gillet is Associate Editor of the
IEEE Transactions on Learning Technologies (TLT) and of the International
Journal of Technology Enhanced Learning. His current research interests include
Technology Enhanced Learning (TEL), Human Computer Interaction (HCI),
Human Devices Interaction (HDI) and Optimal Coordination of Complex and
Distributed Systems.
1 Introduction
The 2013 Horizon Report identifies the shift of educational paradigms towards blended
learning models as one of the key trends in education (The New Media Consortium, 2013).
Currently, most learning activities occur in some form of blended learning (Oliver & Trigwell,
2005), combining different technologies, locations, pedagogical approaches, learning theories,
objectives, interaction types and delivery modes.
In parallel, the usage of technology-enhanced learning (TEL) environments, and
the consequent possibility of tracking students’ activity, have prompted the emergence of
research areas such as Educational Data Mining (EDM) and Learning Analytics (LA). Both
research fields have contributed extensively to addressing one of the grand challenge problems
in TEL: making use and sense of data to improve teaching and learning (Sutherland et
al., 2012).
Blended learning poses additional complexity for tracking and interpreting students’
work. For example, in blended learning, face-to-face activities are usually interleaved with
on-line tasks. This combination results in actions happening outside the technological
environments, which often go unnoticed (Ruiz et al., 2013). Additionally, these obstacles
grow when the technological context is heterogeneous and decentralised (Sclater, 2008;
Ferguson, 2012), as happens in distributed learning environments (DLEs) (MacNeill
& Kraan, 2010), or whenever monitoring data are not generated automatically through
technological means, but rather provided ad hoc by the participants (Vavoula & Sharples,
2009; Rodríguez-Triana et al., 2013). Thus, to gain a comprehensive understanding of
learners’ activities, different dimensions of blended learning need to be captured, processed,
and presented.
Learning Analytics and Educational Data Mining may help to make sense of the data
collected in blended learning scenarios from multiple perspectives. Among others, there is
a growing community of researchers interested in the problems of awareness and reflection,
as illustrated by the successive editions of the workshops on ‘Awareness and Reflection in
Technology Enhanced Learning’ (ARTEL). Thus, in this paper, we focus on how these two
fields have contributed to monitoring learners’ actions and analysing data in order to support
awareness and reflection in blended TEL (hereinafter referred to as blended learning for
simplicity’s sake).
The purpose of this paper is to provide a systematic analysis of the state of research on
monitoring, awareness and reflection in blended learning. Concretely, our research questions
are: (1) in which learning contexts have monitoring, awareness or reflection been studied?
(2) what kind of research problems around monitoring, awareness and reflection have been
investigated? (3) what solutions have been provided for those problems? (4) what is the
maturity of these solutions in terms of evaluation? (5) what are the open issues and future
lines of work in this field?
To answer these questions, our systematic literature review has covered six main
academic databases in blended learning (ACM, AISEL, IEEE Xplore, SpringerLink,
Science Direct and Wiley), the proceedings of the ARTEL Workshop (editions 2011-2014),
and grey literature obtained from Google Scholar. Out of the initial collection of 1089
papers, 40 were included in the in-depth analysis.
The paper is structured as follows: Section 2 defines the key concepts and presents
related work. Section 3 describes the research methodology. Section 4 presents the results of
the literature review. Finally, Section 5 discusses the main findings of the in-depth analysis,
their implications, and gathers the main open issues and future research lines in the area.
2 Monitoring, Awareness and Reflection in Blended Learning
This section provides a definition of the main concepts used in this review. First, we clarify
our understanding of blended learning and the equivalent terms used to refer to it.
Then, we focus on monitoring, awareness and reflection, highlighting the aspects that
differentiate them. Finally, we present related reviews carried out in the area.
2.1 Blended, Mixed and Hybrid learning
Over the past few years, blended learning, also referred to as ‘mixed learning’ or ‘hybrid
learning’, has been widely adopted by institutions of all types (Diaz & Brown, 2010).
Even though blended learning has become somewhat of a buzzword in TEL, there is still
quite a bit of ambiguity about what the term refers to (Oliver & Trigwell, 2005). There
are some general definitions, such as the one given by the Joint Information Systems
Committee (JISC), “the inclusion of multiple approaches to teaching and learning within
a programme” (Busuttil-Reynaud & Winkley, 2006), or the one provided by So
and Brush (2008), “any combination of learning delivery methods”, which leave the door
open to many kinds of learning. Pérez-Sanagustín (2011) proposed a wider definition, using
‘blended’ in a broad sense: a blend of spaces, a blend of activity types (formal and non-
formal), and a blend of technologies to integrate activities.
However, most commonly, blended learning refers to the combination of face-to-face and
technology-supported learning activities (Koper, 2005; Graham, 2005), some performed
synchronously and others asynchronously (Diaz & Brown, 2010). Garrison and Kanuka
(2004) defined blended learning as “the thoughtful integration of classroom face-to-face
learning experiences with online learning experiences”. In summary, blended learning does
not refer to a single model but comes in many different flavors and styles (Picciano, 2014).
The term ‘blended’ refers to a mixture rather than a simple assembly of components.
‘Blending’ can exist along a spectrum of several axes, for example from conventional face-
to-face to fully online, from minimal use of technology to technology infused, and from
formal to informal settings.
2.2 Monitoring, Awareness, and Reflection
In this review, we distinguish the terms monitoring, awareness, and reflection (see Figure
1). Monitoring can be described as tracking learners’ activities and outcomes. Learners
can monitor themselves (self-monitoring) or be monitored by another person,
usually a teacher or an administrator. Monitoring can be activity-centred (process) or
outcome-centred (product) (Florian-Gaviria et al., 2013), and can take place in
real time or post hoc. Monitoring learners’ performance aims to make trends, patterns, or
changes available to stakeholders. Monitoring is a prerequisite for awareness and reflection.
While monitoring focuses on learners’ actions and outcomes, awareness concerns
inferring the current state of either the learner’s understanding or the learning artefacts.
Awareness can be seen as a subsequent step from monitoring. For learners, awareness refers
to the metacognitive process of being aware of one’s own state of understanding and progress
(self-awareness); for teachers, it refers to being aware of the state of their students and
classes, which Dillenbourg et al. (2011) called ‘state awareness’. Awareness represents
an important part of CSCW and CSCL research (Ganoe et al., 2003; Gross, 2013; Phielix
et al., 2010).
Figure 1 Monitoring, awareness and reflection
Reflection builds on awareness. There are various definitions for reflection, but it
is generally agreed that reflection requires awareness of one’s experiences and critical
thinking to examine presented information, ponder experiences, question their validity,
and draw critical conclusions (Hoyrup & Elkjær, 2006). Depending on the emphasis on
theory or practice, definitions vary from philosophical articulations as in Dewey (1997),
formulations in theoretical frameworks, such as the ‘reflection-in action’ and ‘reflection-
on-action’ constructs developed by Schön (1983), to the use of reflection in the experiential
learning cycle by Kolb (2014). Research in CSCL and education (e.g., Ackermann,
1996; Davis, 2003; De Jong, 2010) has argued that reflection is important for learning.
Reflection can be self-reflection (by the learner) or ‘state reflection’ about the learner’s
state of understanding by others (such as teachers or administrators). There is a widely-
documented argument that self-reflection enhances learning and practice, since the learner
is involved in processes that explore experiences as a means of deepening understanding
(Boud et al., 2013; Linn & Eylon, 2011). Self-reflection enables learners to gain insights
from their experiences that can foster further learning. Both self- and state reflection lead to
decision making that influences further learning activities: for example, learners can identify
activities needed to improve their understanding (self-regulation), or a teacher can design
activities to support learners’ particular needs.
2.3 Related reviews
There are numerous reviews on blended learning, elearning, mobile learning, or learning
dashboards. In blended learning research, existing reviews focused on different aspects
of blended learning. For example, Garrison and Kanuka (2004) reviewed blended learning
from an organizational viewpoint. They concluded that it is inevitable that higher education
will adopt blended learning approaches and that this will have a transformative effect on
students’ learning experience. Similarly, Picciano (2014) reviewed the benefits and concerns
of big data and learning analytics in blended learning environments. The goal of that
review was to help higher education administrators make informed decisions when
investing in learning technologies. Other reviews focused more on technical aspects, such
as Verbert et al. (2013; 2014), who compared design features of selected learning analytics
dashboards. Conde et al. (2015) reviewed selected learning analytics tools that monitor
students’ activities in blended learning contexts supported by Moodle. Yengin et al. (2010)
reviewed research on teacher monitoring tools in learning management systems to introduce
their model of teachers’ roles in elearning environments. Corrin and de Barba (2014)
explored students’ interpretation of feedback delivered through dashboards, finding that
dashboards can support awareness, reflection, and performance. Lucke and Rensing (2014)
used the term ‘pervasive education’ to refer to mobile, ubiquitous, pervasive, contextualised
and seamless technologies for education. Their review noted that pervasive technologies
can support context awareness and identified best practice solutions for certain educational
settings.
Overall, several existing reviews focused on organizational aspects of blended learning
or on technical aspects of technology-enhanced learning. However, none of the existing
reviews addressed the cross-section of blended learning and monitoring, awareness, and
reflection. As blended learning is becoming the common model in a wide variety of
institutions, an important question arises: how can students’ activities and outcomes in such
blended learning models be monitored to promote awareness, reflection and decision
making? Our paper contributes a systematic review of the state of research on blended
learning with a focus on monitoring, awareness and reflection.
3 Methodology
As posed in Section 1, the purpose of this paper is to systematically review existing
monitoring, awareness and reflection proposals in the areas of Educational Data Mining and
Learning Analytics that address blended learning. To carry out this research, we followed
the guidelines for a systematic literature review proposed by Kitchenham and Charters
(2007). Even though these guidelines were originally envisioned for software engineering,
they are based on existing guidelines for systematic reviews in other disciplines such as
medical and social sciences, and are applicable to other domains.
When conducting the review, we selected six main electronic databases in Technology
Enhanced Learning: ACM Digital Library1, AISEL2, IEEE Xplore3, SpringerLink4, Science
Direct5 and Wiley6. Apart from the aforementioned databases, we included the proceedings
of the Workshops on Awareness and Reflection in Technology Enhanced Learning (ARTEL)
available at the time of the review (20117, 20128, 20139, 201410), due to their relevance to the
research questions. In addition, Google Scholar11 was added in order to detect potentially
relevant grey literature.
To perform the search, we broke down the question into the delivery method (Blended,
Mixed or Hybrid learning), the purpose (awareness, reflection or monitoring) and the
research field where the proposal was framed (Learning Analytics or Educational Data
Mining). The resulting search string was: (“Blended Learning” OR “Mixed Learning” OR
“Hybrid Learning”) AND (Awareness OR Reflection OR Monitoring) AND (“Learning
Analytics” OR “Data Mining”). Using this query, we searched the title, abstract and
keywords (whenever possible, or the closest that each database query engine allowed, since
all of them differed). A total of 989 papers were obtained by running the query in each
database. Additionally, the top 100 papers from Google Scholar (out of a total of 4240) were
added. It is noteworthy that the literature search was conducted on June 3rd, 2015; papers
published after that date may not be covered.
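To make the query construction concrete, the following minimal Python sketch (our own illustration; the helper name or_group is hypothetical and not part of any reviewing tool) composes the search string from the three facets described above:

delivery = ['"Blended Learning"', '"Mixed Learning"', '"Hybrid Learning"']
purpose = ['Awareness', 'Reflection', 'Monitoring']
field = ['"Learning Analytics"', '"Data Mining"']

def or_group(terms):
    # Join the alternative terms of one facet into a parenthesised OR clause
    return '(' + ' OR '.join(terms) + ')'

# AND-combine the three facets into the final boolean search string
query = ' AND '.join(or_group(facet) for facet in (delivery, purpose, field))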
After performing the searches, each candidate paper passed through three stages
until its eventual selection. First, we assessed the title and read the abstract, looking for
papers that used the keywords within the context of this review; we excluded all papers
unrelated to educational contexts, that did not involve (or potentially involve)
technological support, or that did not entail user intervention (e.g., those focused on machine
awareness). 106 candidates passed the first stage, out of which six were duplicates. Second,
we retrieved the remaining papers (98 out of 100 were available), read each one fully and
critically appraised it, discarding those that were out of scope, lacked credibility, or were of
very low quality. Then, a data extraction form was filled out for each selected paper in order to
gather data to address the review goals12. Third, we removed preliminary versions of works
already being analysed (unless they described different aspects). In the first two stages of
the review, the papers were randomly distributed among the six researchers in charge of the
review, ensuring that each paper was reviewed by at least two people. Moreover, conflicting
views or unclear papers were discussed by the whole team of reviewers to reach consensus.
Finally, 40 papers passed all the aforementioned filters and were taken into account in the
review presented in Section 4. The list of papers taken into consideration, as well as relevant
information about the review process, is presented in Table 1.
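As a schematic summary, the selection funnel can be expressed as a pipeline of filters. The sketch below is our own simplified illustration: the predicates stand in for the manual screening criteria, which were applied by human reviewers rather than by code.

def screen(papers, keep):
    # Generic screening stage: keep only the papers satisfying a predicate
    return [p for p in papers if keep(p)]

def deduplicate(papers):
    # Drop duplicate records, e.g., by normalised title
    seen, unique = set(), []
    for p in papers:
        key = p['title'].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique

# Stage 1: title/abstract screening (1089 -> 106), then duplicate removal (-> 100)
# Stage 2: full-text appraisal of the 98 retrievable papers
# Stage 3: removal of preliminary versions, yielding the final 40 papers
# selected = screen(deduplicate(screen(candidates, stage1_relevant)), stage2_in_scope)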
Table 1 Overview of the reviewed papers in terms of type of blended learning (CF. Computer-mediated & F2F, DF. Distance & F2F, FN. Formal & Not formal,
-. Not specified), educational level (P. Primary, S. Secondary, or U. University Education, -. Not specified), Technological context (D. Desktop, M.
Mobile, T. Tabletop, or W. Web application, V. VLE, D*. DLE, -. Not specified), Aspects covered (A. Awareness, M. Monitoring, R. Reflection),
Target users (I. Institutions, R. Researchers, S. Students, T. Teachers), Type of proposal (T. Theoretical, A. Architecture, I. Indicators, V.
Visualization, D. Data Analysis, S. System), Data sources (D. Digital or P. Physical traces, U. Users’ feedback, A. Learning artefacts, I. Institutional
database, E. External APIs), Types of indicators (L. Learner, A. Action, C. Content, R. Result, C*. Context, or S. Social-related), and Type of
Evaluation (I. Informal, C. Controlled/Usability studies, AN. Authentic dataset analyses, AU. Evaluation in authentic setting, -. No evaluation).
Reference Blended learning Educational level Tech. context Aspect Target users Proposal type Data sources Indicators Evaluation
Cacciamani et al. (2012) CF - D* A, R, M S S D A, C -
Calvo & Ellis (2010) DF U D, W A, R, M S I, S A C AU
Carceller et al. (2013) DF, CF U V, D A, M S, T, I S D A, R AN
Chetlur et al. (2014) - U D* A, R, M T A, I, V, S D A AU
Cocea & Weibelzahl (2009) CF S M A, M T I, DA, S D A AN
Corrin & de Barba (2014) DF U W A, R S I, V D, I A, R C
Daley et al. (2014) CF S W R S I,V,S D, U A, R AU
Fidalgo-Blanco et al. (2015) - U D* A, M T S D, A A AN
Florian-Gaviria et al. (2013) DF, CF U V R T S I R AU
García et al. (2012) DF, CF U D, W, V A, R, M T A, I, V, S D, A A -
Giannakos et al. (2014) DF U W A, R T A, I, V, DA, S D,U,A A, C, R AU
Hecking et al. (2014) DF U V A T DA D A AN
Jyothi et al. (2012) CF U - A, M T I, V, DA, S D,A A, C, S -
Kotsiantis et al. (2013) CF U V A, M T DA D A, C* AN
Martin et al. (2008) CF U V A, R, M T DA, S D L, A C
Martinez-Maldonado et al. (2015a) CF U D, T A, M T A, I, V, S D,P A, C*, S AU
Martinez-Maldonado et al. (2015b) CF U D, T A, R, M T T, S D, P A, S AU
Melero et al. (2015) CF U V A, R, M S, T V, S D,P A, R, C* AU
Miller et al. (2015) CF P, S V A T I, DA, S D,P A, R AN
Mödritscher et al. (2013) DF U V A T I, DA D L, A, R AN
Mohar et al. (2012) CF U V R T DA D A AN
Ozturk et al. (2014) DF U V R S, T DA A S AN
Palomo-Duarte et al. (2014) DF - - A, R, M T, R A, I, S D,A A,C,R -
Park & Jo (2015) DF U W A, R, M S, T I, V, S D A C, AU
Phillips et al. (2011) DF S, U V A, M S S D, U A AU
Ram et al. (2011) FN - D* A, M S, T A, S D, A C, S I
Rayón et al. (2014) CF S V A, M S, T A, I D A, C, R, C* -
Richards & DeVries (2011) DF U V A, R, M T T, I, S U C -
Rodriguez Groba et al. (2014) CF U PLE A, M T A, S D, I, A A, C, C* AU
Rodríguez-Triana et al. (2015) DF U D* A, R, M T T, A, S D, U, A, E A, C, C*, S AU
Rodríguez-Triana et al. (2013) CF U D* A, M T T, S D, A, E A, C, C*, S I
Ruiz et al. (2013) DF, CF U - A, M S, T A, V, S P, D, U A, C*, S -
Santos et al. (2013) CF U V A, R S S U, E A AU
Scheffel et al. (2012) DF, FN U M A, M T DA D A I
Sharma & Mavani (2011) DF U V A, R, M T, I T, S D, U, I S, R AN
Shum & Crick (2012) DF U V A, M S DA, S D, U, A, I A -
Tempelaar et al. (2013) DF - D* A, R, M S T, I D, U, A, I L, A I
Tibola et al. (2012) DF U V A, R, M S, T A D, I A, L, R, A, C -
Vozniuk et al. (2015) CF S D* A, R, M S, T A, I, V, S D A,C,R AU
Yen (2011) CF U V A, R T DA D A AN
4 Results
This section presents the results of the review, organised along the research questions of the
paper: the learning contexts investigated, the problems addressed, the solutions proposed,
and their evaluation.
4.1 Learning context
As outlined in Section 2, the definition of blended learning is not clear in the literature. Thus,
in order to better understand the learning contexts addressed in the different papers, we
analysed how the authors interpreted blended learning. As shown in Figure 2, the majority
of papers considered blended learning as the combination of either face-to-face and distance
learning (21), or face-to-face and computer-mediated interaction (20), or both of them.
Only a few cases (Ram et al., 2011; Scheffel et al., 2012) presented blended learning as the
mixture of formal and non-formal learning.
Figure 2 Definition of blended learning used in the reviewed papers
Although people constantly learn anywhere and anytime, 95% of the reviewed papers
were devoted to formal learning, and only one paper focused explicitly on non-formal settings
(Ram et al., 2011). Regarding formal education, university settings gathered the attention of
most of the research works (77.5%), followed by secondary education (15%), while primary
education was only targeted by Miller et al. (2015).
Another aspect in our analysis was the pedagogical approach. Based on the descriptions
of the learning activities presented in the papers, we clustered the papers
into different groups. Although many papers specified neither the pedagogical
approach nor the activities (42.5%), it appears significant that 42.5% of the papers dealt with
computer-supported collaborative learning (CSCL). The remaining papers covered different
approaches, such as mobile and location-based learning (7.5%), flipped classroom (5%),
test-based learning (5%), inquiry-based learning (5%), problem-based learning (2.5%),
game-based learning (2.5%) or self-regulated learning (2.5%).
Our analysis of the description of learning activities presented in the papers revealed
that 70% were devoted to long-term activities (i.e., courses spanning one or more academic
years), 17.5% to medium-term (multiple sessions) and only two cases (5%) referred to
single sessions.
From the technological perspective, web technologies constituted the main
technological context. Virtual learning environments (VLEs) were used in 47.5% of the
papers (especially Moodle, which appeared on 16 different occasions); web applications
appeared in 15% of the papers, and in 20% of the works both VLEs and web applications
were combined in a distributed learning environment (DLE). Additionally, in a few cases,
the technological context was supported by desktop (12.5%), mobile (5%) or tabletop (5%)
applications. Figure 3 provides an overview of the platforms used in the technological
contexts, as well as their frequency of appearance in the papers. Also noteworthy is the
number of learning platforms that made up the learning context, since it increases the difficulty
of data gathering and integration. Since the heterogeneity of the platforms could be more
challenging than the number of instances of the same type, we analysed how many
learning platforms were part of the learning activities (see Figure 4). In 57.5% of the papers
the learning activity involved only one platform, while in 37.5% two or more tools were used.
This is an indicator of how technologically distributed the learning activities are.
Figure 3 Learning technologies used in the reviewed papers
4.2 Problem
In order to understand the types of problems being investigated in the reviewed papers, we
analysed which of the aspects (monitoring, awareness and reflection) were covered by the
papers, what particular problems they were addressing, and who the target users of the
proposals were.
Figure 4 Number of different platforms that made up the learning context of the reviewed papers
Based on the definitions provided in Section 2.2, the majority of the reviewed papers (37
papers) were devoted to awareness, and considerably large proportions of them focused
on monitoring (30 papers) and reflection (24 papers). Figure 5 presents an overview of
the papers’ distribution among the three aspects. It is noteworthy that the papers generally
addressed more than one aspect; for instance, 20 papers were dedicated to awareness and
reflection, and 16 papers covered all three aspects.
Figure 5 Aspects being covered in the reviewed papers
For each aspect, we analysed the description of the problems being addressed by the papers
and classified them into the 15 categories displayed in Figure 6. Each category label consists
of two parts. The first part refers to the aspect of the problem, which can be monitoring,
awareness or reflection. The second part refers to the facet of the learning scenario targeted
by the solution: actions (actions performed by the learner(s)), scenario (the course plan,
design or script), interaction (learners’ interaction, collaboration or forum discussion),
resources (course materials or assignments), or performance (assessment or evaluation
outcomes). The other categories are subject (the learning topic), user-content (the artefacts
created by the learner(s), such as submitted assignments), and user-defined (cases where the
user, i.e. the teacher, decides which activities or indicators she wants to monitor, be aware
of or reflect on, as in Rodríguez-Triana et al. (2013, 2015)). As shown in Figure 6,
monitoring and awareness of users’ actions are the two most common problems targeted in
the reviewed papers (26 and 25 papers, respectively). A considerable number of papers
(15 and 13 papers) also focused on awareness of users’ performance and awareness of
resources (such as usage patterns), whereas monitoring of and reflection on these two topics
was not a common concern. However, course scenario and user interactions were almost
evenly taken into account for monitoring, awareness and reflection.
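To illustrate this coding scheme, the two-part labels can be viewed as aspect-facet pairs. The following sketch (our own, with illustrative names) enumerates the possible combinations, of which only the 15 shown in Figure 6 were actually observed in the corpus:

from itertools import product

aspects = ['monitoring', 'awareness', 'reflection']
facets = ['actions', 'scenario', 'interaction', 'resources',
          'performance', 'subject', 'user-content', 'user-defined']

# Cartesian product of aspects and solution facets; only the combinations
# observed in the reviewed papers form the 15 categories of Figure 6
labels = ['{}-{}'.format(a, f) for a, f in product(aspects, facets)]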
Figure 6 Type of problems addressed by the reviewed papers
The reviewed proposals targeted four different user categories: teachers, students,
institutions and researchers. As depicted in Figure 7, teachers (32 papers) and students (17
papers) were clearly the main target users of the proposals; three papers also mentioned
institutions or researchers as their target users.
Figure 7 Target users of the proposals in the reviewed papers
4.3 Solution
We have analysed the following aspects of the solutions presented in the reviewed papers:
the type of proposal, the data sources used by the proposal, the platforms the data come from,
and the indicators presented in the solution. We discuss each of these aspects below.
In terms of the type of proposal, most of the papers (29 out of 40) presented an
implemented system. In addition, six papers presented a data analysis without implementing
a new system, two papers presented an architecture that was still to be implemented, and
three papers presented indicators that were not implemented in a system. Out of the 29
implemented proposals, only six presented a theoretical model underlying the implemented
system and 10 presented the system architecture. Figure 8 shows how the different types of
proposals are distributed within the papers.
Figure 8 The main types of proposals identified in the papers: implemented system (29),
indicators (17), data analysis (13), architecture (12), visualisation (11) and theoretical
model (6). Overlapping areas contain papers with multiple types of proposals
Digital user activity was the dominant data source (mentioned by 35 papers). The
next most popular data sources were the learning artifacts generated or used by the users
(13 papers), followed by information obtained directly from the users (10 papers). Institutional
database records (7 papers), physical user activity (5 papers) and external APIs (3 papers)
were rarely represented in the papers. Figure 9 schematically shows the data sources
mentioned in the papers.
The single most popular platform for obtaining the data was Moodle, mentioned by 12
papers. Three papers specified a forum as their data source, without naming a specific
forum platform. Blackboard was used in two papers. GLUE!-CAS and external APIs were
mentioned in two papers each. The remaining 39 platforms were mentioned only once,
showing the diversity of the learning tools the data come from.
During our analysis of the papers, we categorised the indicators into six groups
depending on the information they present (see Figure 10); some of the indicators belonged
to several groups. Learner-related indicators present information describing the learner(s)
(e.g., prior education, competences, university entrance grade).
Figure 9 The main types of data sources mentioned in the papers: digital activity (35),
learning artefacts (13), information asked from users (10), institutional database (7),
physical activity (5) and external APIs (3). Overlapping areas contain papers mentioning
multiple data sources
Action-related indicators present information about the actions performed by the learner(s), usually in an aggregated
form (e.g., number of page visits, number of file downloads). Content-related indicators
provide information about the content that the learner(s) interacted with or produced (e.g.,
sentiment of the forum messages, topics covered in the report). Result-related indicators
give information about the outcome of the learners’ activity (e.g., average grade in a group).
Context-related indicators provide information about the context, where the learning took
place (e.g., geographical location of learners, weather conditions during the activity). Social-
related indicators show how learners interact with others while learning (e.g. a graph
showing communication direction in a forum). Notably, all the reviewed papers specified
the indicators. Most of them (36) presented action-related indicators, followed by result-
related indicators mentioned in 15 papers, content-related (14 papers) and social-related
(11 papers). Context-related indicators were mentioned in 9 papers and learner-related
indicators in 5 papers.
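The six indicator groups can be summarised as a small taxonomy. The sketch below (our own encoding, reusing the examples given above; the dictionary structure and function are illustrative) makes the classification explicit:

# Indicator taxonomy derived from the review; the example indicators are
# those mentioned in the text above
INDICATOR_GROUPS = {
    'learner': ['prior education', 'competences', 'university entrance grade'],
    'action': ['number of page visits', 'number of file downloads'],
    'content': ['sentiment of forum messages', 'topics covered in a report'],
    'result': ['average grade in a group'],
    'context': ['geographical location of learners', 'weather during the activity'],
    'social': ['communication-direction graph of a forum'],
}

def classify(indicator, groups=INDICATOR_GROUPS):
    # Return every group listing the given indicator; note that an
    # indicator may belong to several groups at once
    return [g for g, examples in groups.items() if indicator in examples]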
In order to get an overview of the mapping between target problems and proposed
solutions in the reviewed papers, we analysed the connection between the aspects (monitoring,
awareness, reflection) and the indicators, as represented in Figure 11. Clearly, action-related
indicators are the main ones used to address monitoring, awareness and reflection needs, and
content-based indicators are the second most dominant type for this purpose. In addition,
social-related and context-related indicators are mainly used for monitoring and awareness
purposes and less often for reflection, whereas result-based indicators are more dominant
for addressing awareness and reflection needs.
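The mapping of Figure 11 is essentially a cross-tabulation of aspects against indicator types. A minimal sketch of how such a table can be computed from the coded papers (the record structure is hypothetical, not our actual coding form) is:

from collections import Counter

# Each coded paper carries the aspects and indicator types assigned to it;
# the two records below are placeholders for the 40 coded papers
papers = [
    {'aspects': {'monitoring', 'awareness'}, 'indicators': {'action', 'social'}},
    {'aspects': {'reflection'}, 'indicators': {'result'}},
]

crosstab = Counter()
for paper in papers:
    for aspect in paper['aspects']:
        for indicator in paper['indicators']:
            crosstab[(aspect, indicator)] += 1
# e.g., crosstab[('awareness', 'action')] counts the papers pairing that
# aspect with action-related indicators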
4.4 Evaluation
As noted at the beginning of the previous subsection, the analysed papers portray a
wide variety of contributions, from general theoretical proposals to concrete system
implementations. To assess the maturity of such proposals and, by extension, of this area of
research, we have analysed the presence, scale and main characteristics of the proposals’
empirical evaluations.
Figure 10 The main types of indicators mentioned in the papers: action-related (34),
result-related (13), content-related (13), social-related (9), context-related (8) and
learner-related (4). Overlapping areas contain papers mentioning multiple indicators
Figure 11 The mapping between aspects and indicator types in the reviewed papers
Out of the 40 analysed papers, only nine (22.5%) had no evaluation whatsoever.
Among the remaining papers, we distinguished a set (11 papers), mainly proposing models,
indicators, visualizations or analytic methods, which applied their proposed model/analysis
to a dataset gathered from authentic educational contexts (e.g., the log files generated
during a course). The scale of these authentic dataset analyses varied greatly: certain studies
analysed data representing a few dozen students, while others used massive multi-course
ensembles (the extreme case being Carceller et al. (2013), with n = 12,901 students).
The median size of these datasets, however, was 252 students. Interestingly, only one
of these studies (Miller et al., 2015) also gathered data from the teachers (n = 9).
Regarding the maturity of the system implementation proposals to support awareness
and reflection (29 of the 40 analysed proposals), only 14 of them (48.3%) have been applied
and evaluated in authentic conditions (i.e., during an actual course). Another five of these
systems (often, proposals about analytic processes) used data gathered in authentic settings,
but only performed post-hoc analyses, without actually feeding the results of those analyses
back to stakeholders. Among the rest, two studies performed controlled or
usability evaluations (in the case of Park & Jo (2015), to complement an authentic setting
evaluation), and two other studies only reported evaluations informally. The scale and
duration of the system evaluations performed in authentic settings were relatively uniform:
13 out of the 14 authentic setting evaluation studies gathered data from students (ranging
from 11 to 300 students, with a mean of 93 and a standard deviation of 78.5); on the other
hand, 10 of these 14 studies involved teachers in the evaluation (normally between one and
four of them, with the exception of Florian-Gaviria et al. (2013), which involved 20 teachers).
All of the authentic setting evaluations were performed over one to three courses, and the
length of the interventions ranged from one week to a full semester (per course).
Although these surface-level figures regarding the evaluation of systems to support
awareness and reflection are not unusual for the field of TEL, more interesting results
are obtained by looking at the constructs targeted by these evaluations (see Figure 12).
Evaluations of the analysed system proposals targeted a wide variety of constructs, but
general constructs such as usability, usefulness or user satisfaction were by far the most
common. It is remarkable that only a handful of them actually tried to evaluate the impact on
awareness or reflection specifically, and that only three of the studies targeted the benefits of
using the proposal in terms of student learning and achievement. Furthermore, the only study
that compared the proposal with an alternative or control group (Park & Jo, 2015)
did not find statistically significant learning effects when using the system for enhanced
awareness and reflection.
Figure 12 Frequency of appearance of constructs among the 14 system implementations evaluated
in authentic settings
5 Discussion
The results from the analysis of the 40 publications on monitoring, awareness and reflection
in blended learning enable us to draw several conclusions about the state of this research
area as covered by Learning Analytics and Educational Data Mining. A first insight is
that the field would benefit from revisiting the meaning of blended learning in light of
current educational practice. Based on our review, authors use the term ‘blended
learning’ for a multi-dimensional composite setting that can include face-to-face and
technology-supported learning, formal and informal activities, and co-present
and distance activities, which may be performed synchronously and/or asynchronously.
However, considering the current trends in TEL, blended learning could be understood in a
broader sense, covering a blend of spaces, activity types, and technologies, as proposed
by Pérez-Sanagustín (2011).
Despite the current trends towards self-directed, student-centric and lifelong learning,
the reviewed papers focused mainly on formal learning (95%), where teachers were the
main consumers of the data analyses (80%), leaving students aside. Thus, it seems that the
emphasis has been on enhancing teaching more than learning. Nevertheless, none of the
reviewed papers paid attention to teaching practice itself; all of them dealt with the analysis of
learning. More research is probably needed in these two directions: supporting students in
monitoring, awareness and reflection activities; and providing feedback to teachers about
their own practice, a field that is starting to be explored by the Teaching Analytics
community (Prieto et al., 2015).
Besides, the majority of the proposals have been applied in university settings (77.5%). This
could be explained by the slower adoption of ICTs in primary and secondary schools. Taking
into account the effort and investment made at multiple levels (schools, policy makers,
governments) to promote the integration of ICTs in these educational settings, we envision
that providing awareness and reflection support in primary and secondary education will
become necessary very soon. Indeed, such support could contribute to reducing the current
uncertainty introduced by the usage of technologies, which frequently discourages teachers
and students from using them.
Several studies have identified an incremental adoption of VLEs, PLEs and Web 2.0
tools over the last few years (Hughes, 2009; Smith & Borreson Caruso, 2010). This aligns
with our finding that the main technological contexts used for awareness and reflection also
targeted these settings (72%). Within the reviewed papers, Moodle was the most popular
platform for obtaining the data. Another VLE, Blackboard, was also often used as a data
source. In the papers, 39 different learning tools were mentioned only once, which highlights
the high heterogeneity of the learning platforms currently in use. Although more than half
of the papers targeted a single tool/platform, 37.5% had to gather and integrate data coming
from different platforms. Architectures such as Graasp and GLUE! have been
used to build DLEs and thereby centralise the data coming from the different
data sources. However, the usage of this kind of architecture is not yet widespread. Thus,
teachers and students are often forced to combine multiple tools without a more
generic solution that supports data collection and integration for subsequent analyses.
Another conclusion that emerges from the analysis of the technological context relates
to mobile technologies. Despite current trends towards mobile and wearable technologies
and approaches such as ‘bring your own device’, few works in the review addressed
mobile settings. We foresee a shift towards using mobile apps as data sources for
monitoring, awareness and reflection in the near future.
Most of the reviewed papers presented an implemented solution, showing the researchers’
clear motivation to put their proposals into practice. At the same time, only a
few of the papers elaborated on the theoretical model underlying their solution. The lack
of proper theoretical foundations when designing and building learning analytics tools is
a known general issue in the field, highlighted recently, among others, by Gašević et al.
(2015).
The two main categories of problems addressed by the papers are related to user
actions and learning resources. This does not necessarily reflect the importance of these two
problems for the field, but could be related to the historic availability of logs (representing
user interactions) and content (for learning resources) in digital learning environments.
Indeed, the analysis of the data sources mentioned in the papers (see Figure 9) demonstrates
that the majority of the papers worked with digital activity and learning artifact data.
This ‘historic availability’ hypothesis could also explain why action-related indicators
are the most popular ones for presenting information. Although learner-, context-
and social-related indicators could be more relevant than user actions for reflection purposes,
they are underrepresented in the analysed papers. This could also be linked to the current
unavailability of data describing the learners (for instance, their age or prior education) and
their context (placement in class, physical location when learning outside of school),
or to challenges in accessing such data when they are available. These challenges include
the fragmentation of the data (since records about the students are often located in multiple
institutional databases with different stakeholders) and the privacy or ethical concerns that
emerge when working with such personally identifiable information (Pardo & Siemens, 2014;
Drachsler et al., 2015).
One of the most popular definitions of blended learning is the combination of
face-to-face and technology-mediated interactions. While 21 papers explicitly involve face-
to-face and computer-mediated interaction, only 8 of them use data sources that may provide
evidence about the face-to-face part, e.g., integrating user activity registered via physical
sensors or gathering ad-hoc feedback from the participants involved in the learning scenario
by means of interviews, questionnaires, think-aloud sessions, etc. Building a new generation
of monitoring, awareness and reflection tools for blended learning requires also focusing
on capturing the interactions happening in the physical world. This is a promising direction
for future research, especially taking into account the recent increase in the affordability and
pervasiveness of sensors and the emergence of the Internet of Things. Improvements in the
quality of algorithms are now starting to enable the capture of physical interactions, such as
visual object tracking (Raca & Dillenbourg, 2013) or automatic speech recognition (Worsley
& Blikstein, 2015), facilitating multimodal learning analytics as discussed by Blikstein
(2013).
On the other hand, our analysis of the platforms involved shows that the technology-
mediated part of blended learning is becoming increasingly distributed and is often no longer
confined to a single institution-supplied platform. In order to obtain a complete picture of the
learning processes, it is then necessary to aggregate data across the platforms where learning
happens. Given the large number of distinct learning tools mentioned in the reviewed papers,
such data aggregation is particularly challenging and requires the development of common
standards for learning data representation. Fortunately, this issue has been addressed in
recent years by a number of efforts aiming to unify the representation of learners’ interaction
data, for instance Vozniuk et al. (2013), Kitto et al. (2015) and Santos et al. (2015).
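To give a flavour of the direction such efforts take, specifications in this space (e.g., the Experience API used by the toolkit of Kitto et al. (2015)) represent each learner interaction as an actor-verb-object statement. The record below is an illustrative sketch with made-up identifiers, not a conformant client implementation:

# Illustrative xAPI-style statement describing one learner interaction;
# all identifiers and URIs are fictitious examples
statement = {
    'actor': {'name': 'A. Student', 'mbox': 'mailto:student@example.org'},
    'verb': {'id': 'http://adlnet.gov/expapi/verbs/completed',
             'display': {'en-US': 'completed'}},
    'object': {'id': 'http://example.org/courses/blended-101/quiz-3',
               'definition': {'name': {'en-US': 'Quiz 3'}}},
    'timestamp': '2015-06-03T10:15:00Z',
}
# Statements emitted by different platforms (VLE, mobile app, sensor gateway)
# can then be pooled in a single store and queried uniformly for analytics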
Finally, from our analysis of the evaluation of proposals to support awareness and
reflection in blended TEL, we see that there is a general concern in this sub-area of LA and
EDM to use authentic data, first as a way to validate which models and analyses may provide
useful insights for stakeholders in real-world cases, but also as the only way of evaluating
proposals in terms of ecological validity. However, a major gap found in most of the
system evaluations analysed is their focus on general technical and system implementation
constructs (like usability and usefulness), something typical of early-stage efforts. Few
studies actually look at how (or how much) awareness and/or reflection are improved or,
more importantly, at the effects of such enhancements on student learning. We also
found that longer-term, longitudinal studies of the usage of these proposals in everyday
educational practice (e.g., beyond one semester) were lacking; however, we recognise that
more rigorous inquiry (even if short-term) into the effects of the proposals on awareness,
reflection and learning is needed first.
Overall, our analyses paint the picture of a relatively young field, with a large
proportion of divergent explorations (e.g., proposals that are never implemented, or system
implementations that only get cursory evaluations), but in which little accumulation of
applicable knowledge takes place. Moreover, a large majority of the proposals build
upon general educational lore about the benefits of awareness and reflection, but remain
reluctant to gather evidence about the benefits (for awareness, for reflection, and ultimately
for learning) of their concrete approaches in longitudinal and authentic setting evaluations.
We suggest that this explicit evaluation of benefits in terms of awareness, reflection and
learning is essential for the accumulation of knowledge in the area, which could eventually
lead to technology (and intervention) design principles for the effective support
of awareness and reflection that address the complexity of blended learning. This should be
another of the main directions of future research in the support for awareness and reflection
in blended technology-enhanced learning.
Acknowledgements
This research was supported by the European Union in the context of the Go-Lab project
(Grant Agreement no. 317601) under the Information and Communication Technologies
(ICT) theme of the 7th Framework Programme for R&D (FP7), a Marie Curie Fellowship
within the 7th European Community Framework Programme (MIOCTI, FP7-PEOPLE-
2012-IEF project no. 327384), and DUAL-T, a leading house funded by the Swiss State
Secretariat for Education, Research and Innovation (SERI).
References
Ackermann, E. (1996), ‘Perspective-taking and object construction: two keys to learning’,
Constructionism in practice: designing, thinking, and learning in a digital world,
Lawrence Erlbaum, Mahwah, NJ pp. 25–35.
Blikstein, P. (2013), Multimodal learning analytics, in ‘Proceedings of the Third
International Conference on Learning Analytics and Knowledge’, LAK ’13, ACM, New
York, NY, USA, pp. 102–106.
Boud, D., Keogh, R. & Walker, D. (2013), Reflection: Turning experience into learning,
Routledge.
Busuttil-Reynaud, G. & Winkley, J. (2006), JISC e-assessment glossary, Technical report,
Joint Information Systems Committee (JISC), Bristol.
Cacciamani, S., Cesareni, D., Martini, F., Ferrini, T. & Fujita, N. (2012), ‘Influence of
Participation, Facilitator Styles, and Metacognitive Reflection on Knowledge Building in
Online University Courses’, Computers & Education 58(3), 874–884.
Calvo, R. A. & Ellis, R. A. (2010), ‘Students’ Conceptions of Tutor and Automated Feedback
in Professional Writing’, Journal of Engineering Education 99(4), 427–438.
Carceller, C., Dawson, S. & Lockyer, L. (2013), ‘Improving Academic Outcomes: Does
Participating in Online Discussion Forums Payoff?’, International Journal of Technology
Enhanced Learning. 5(2), 117–132.
Chetlur, M., Tamhane, A., Reddy, V. K., Sengupta, B., Jain, M., Sukjunnimit, P. & Wagh,
R. (2014), EduPaL: Enabling Blended Learning in Resource Constrained Environments,
in ‘Proceedings of the Fifth ACM Symposium on Computing for Development’, ACM
DEV-5 ’14, ACM, New York, NY, USA, pp. 73–82.
Cocea, M. & Weibelzahl, S. (2009), ‘Log File Analysis for Disengagement Detection in
e-Learning Environments’, User Modeling and User-Adapted Interaction 19(4), 341–385.
Conde, M. A., Hérnandez-García, A., García-Peñalvo, F. J. & Séin-Echaluce, M. L.
(2015), Exploring Student Interactions: Learning Analytics Tools for Student Tracking,
in ‘Learning and Collaboration Technologies’, Springer, pp. 50–61.
Corrin, L. & de Barba, P. (2014), Exploring students’ interpretation of feedback delivered
through learning analytics dashboards, in ‘Proceedings of the ascilite 2014 conference’.
Daley, S. G., Hillaire, G. & Sutherland, L. M. (2014), ‘Beyond performance data: Improving
student help seeking by collecting and displaying influential data in an online middle-
school science curriculum’, British Journal of Educational Technology.
Davis, E. A. (2003), ‘Prompting middle school science students for productive reflection:
Generic and directed prompts’, The Journal of the Learning Sciences 12(1), 91–142.
De Jong, T. (2010), ‘Cognitive load theory, educational research, and instructional design:
some food for thought’, Instructional Science 38(2), 105–134.
Dewey, J. (1997), How we think, Courier Corporation.
Diaz, V. & Brown, M. (2010), ‘Blended Learning: a report on the ELI focus session’,
EDUCAUSE Learning Initiative.
Dillenbourg, P., Zufferey, G., Alavi, H., Jermann, P., Do-Lenh, S., Bonnard, Q., Cuendet, S.
& Kaplan, F. (2011), ‘Classroom orchestration: The third circle of usability’, CSCL2011
Proceedings 1, 510–517.
Drachsler, H., Hoel, T., Scheffel, M., Kismihók, G., Berg, A., Ferguson, R., Chen, W.,
Cooper, A. & Manderveld, J. (2015), Ethical and privacy issues in the application of
learning analytics, in ‘Proceedings of the Fifth International Conference on Learning
Analytics And Knowledge’, ACM, pp. 390–391.
Ferguson, R. (2012), ‘Learning analytics: drivers, developments and challenges’,
International Journal of Technology Enhanced Learning.
Fidalgo-Blanco, A., Sein-Echaluce, M. L., García-Peñalvo, F. J. & Conde, M. A. (2015),
‘Using Learning Analytics to improve teamwork assessment’, Computers in Human
Behavior 47, 149–156.
Florian-Gaviria, B., Glahn, C. & Fabregat Gesa, R. (2013), ‘A Software Suite for Efficient
Use of the European Qualifications Framework in Online and Blended Courses’, IEEE
Transactions on Learning Technologies 6(3), 283–296.
Ganoe, C. H., Somervell, J. P., Neale, D. C., Isenhour, P. L., Carroll, J. M., Rosson,
M. B. & McCrickard, D. S. (2003), Classroom bridge: using collaborative public and
desktop timelines to support activity awareness, in ‘Proceedings of the 16th annual ACM
symposium on User interface software and technology’, ACM, pp. 21–30.
García, R. M. C., Pardo, A., Kloos, C. D., Niemann, K., Scheffel, M. & Wolpers, M. (2012),
‘Peeking into the Black Box: Visualising Learning Activities’, International Journal of
Technology Enhanced Learning 4(1/2), 99–120.
Garrison, D. R. & Kanuka, H. (2004), ‘Blended learning: Uncovering its transformative
potential in higher education’, The internet and higher education 7(2), 95–105.
Gašević, D., Dawson, S. & Siemens, G. (2015), ‘Let’s not forget: Learning analytics are
about learning’, TechTrends 59(1), 64–71.
Giannakos, M. N., Chorianopoulos, K. & Chrisochoides, N. (2014), Collecting and making
sense of video learning analytics, in ‘Frontiers in Education Conference (FIE), 2014
IEEE’, IEEE, pp. 1–7.
Graham, C. R. (2005), Blended learning systems: Definition, current trends, and future
directions, in C. J. Bonk & C. R. Graham, eds, ‘Handbook of blended learning: global
perspectives, local designs’, Pfeiffer, San Francisco, CA, pp. 3–21.
Gross, T. (2013), ‘Supporting effortless coordination: 25 years of awareness research’,
Computer Supported Cooperative Work (CSCW) 22(4-6), 425–474.
Hecking, T., Ziebarth, S. & Hoppe, H. U. (2014), Analysis of Dynamic Resource Access
Patterns in a Blended Learning Course, in ‘Proceedings of the Fourth International
Conference on Learning Analytics And Knowledge’, LAK ’14, ACM, New York, NY,
USA, pp. 173–182.
Hoyrup, S. & Elkjær, B. (2006), ‘Reflection: Taking it beyond the individual’, Productive
reflection at work pp. 29–42.
Hughes, A. (2009), Higher Education in a Web 2.0 world, Technical Report March, Joint
Information Systems Committee (JISC).
Jyothi, S., McAvinia, C. & Keating, J. (2012), ‘A Visualisation Tool to Aid Exploration of
Students’ Interactions in Asynchronous Online Communication’, Computers & Education
58(1), 30–42.
Kitchenham, B. & Charters, S. (2007), Guidelines for performing systematic literature
reviews in software engineering, Technical report, Keele University (UK).
Kitto, K., Cross, S., Waters, Z. & Lupton, M. (2015), Learning analytics beyond the lms: The
connected learning analytics toolkit, in ‘Proceedings of the Fifth International Conference
on Learning Analytics And Knowledge’, LAK ’15, ACM, New York, NY, USA, pp. 11–
15.
Kolb, D. A. (2014), Experiential learning: Experience as the source of learning and
development, Pearson Education.
Koper, R. (2005), An introduction to Learning Design, in R. Koper & C. Tattersall, eds,
‘Learning Design: A Handbook on Modelling and Delivering Networked Education and
Training’, Springer Berlin Heidelberg, pp. 3–20.
Kotsiantis, S., Tselios, N., Filippidi, A. & Komis, V. (2013), ‘Using Learning Analytics
to Identify Successful Learners in a Blended Learning Course’, International Journal of
Technology Enhanced Learning 5(2), 133–150.
Linn, M. C. & Eylon, B.-S. (2011), Science learning and instruction: Taking advantage of
technology to promote knowledge integration, Routledge.
Lucke, U. & Rensing, C. (2014), ‘A survey on pervasive education’, Pervasive and Mobile
Computing 14, 3–16.
MacNeill, S. & Kraan, W. (2010), Distributed Learning Environments: A briefing paper,
JISC Centre for Educational Technology and Interoperability Standards (CETIS).
Martin, M., Alvarez, A., Fernandez-Castro, I. & Urretavizcaya, M. (2008), Generating
Teacher Adapted Suggestions for Improving Distance Educational Systems with SIgMa,
in ‘Eighth IEEE International Conference on Advanced Learning Technologies, 2008.
ICALT ’08’, pp. 449–453.
Martinez-Maldonado, R., Clayphan, A. & Kay, J. (2015a), ‘Deploying and Visualising
Teacher’s Scripts of Small Group Activities in a Multi-surface Classroom Ecology: a
Study in-the-wild’, Computer Supported Cooperative Work (CSCW) 24(2-3), 177–221.
Martinez-Maldonado, R., Yacef, K. & Kay, J. (2015b), ‘TSCL: A Conceptual Model to
Inform Understanding of Collaborative Learning Processes at Interactive Tabletops’,
International Journal of Human-Computer Studies.
Melero, J., Hernández-Leo, D., Sun, J., Santos, P. & Blat, J. (2015), ‘How was the activity?
A visualization support for a case of location-based learning design’, British Journal of
Educational Technology 46(2), 317–329.
Miller, W. L., Baker, R. S., Labrum, M. J., Petsche, K., Liu, Y.-H. & Wagner, A. Z.
(2015), Automated Detection of Proactive Remediation by Teachers in Reasoning Mind
Classrooms, in ‘Proceedings of the Fifth International Conference on Learning Analytics
And Knowledge’, LAK ’15, ACM, New York, NY, USA, pp. 290–294.
Mohar, T., Sraka, D. & Kaučič, B. (2012), ‘Analyzing blended based learning systems’.
Mödritscher, F., Andergassen, M. & Neumann, G. (2013), Dependencies Between E-
Learning Usage Patterns and Learning Results, in ‘Proceedings of the 13th International
Conference on Knowledge Management and Knowledge Technologies’, i-Know ’13,
ACM, New York, NY, USA, pp. 24:1–24:8.
Oliver, M. & Trigwell, K. (2005), ‘Can ‘blended learning’ be redeemed?’, E-Learning and
Digital Media 2(1), 17–26.
Ozturk, H. T., Deryakulu, D., Ozcinar, H. & Atal, D. (2014), Advancing learning analytics in
online learning environments through the method of sequential analysis, in ‘Multimedia
Computing and Systems (ICMCS), 2014 International Conference on’, IEEE, pp. 512–
516.
Palomo-Duarte, M., Berns, A., Dodero, J. M. & Cejas, A. (2014), Foreign Language
Learning Using a Gamificated APP to Support Peer-assessment, in ‘Proceedings of
the Second International Conference on Technological Ecosystems for Enhancing
Multiculturality’, TEEM ’14, ACM, New York, NY, USA, pp. 381–386.
Pardo, A. & Siemens, G. (2014), ‘Ethical and privacy principles for learning analytics’,
British Journal of Educational Technology 45(3), 438–450.
Park, Y. & Jo, I.-H. (2015), ‘Development of the Learning Analytics Dashboard to Support
Students’ Learning Performance’, Journal of Universal Computer Science 21(1), 110–
133.
Pérez-Sanagustín, M. (2011), Operationalization of collaborative blended learning scripts:
a model, computational mechanisms and experiments, PhD thesis, Universitat Pompeu
Fabra.
Phielix, C., Prins, F. J. & Kirschner, P. A. (2010), ‘Awareness of group performance in a
CSCL-environment: Effects of peer feedback and reflection’, Computers in Human Behavior
26(2), 151–161.
Phillips, R., Maor, D., Cumming-Potvin, W., Roberts, P., Herrington, J., Preston, G., Moore,
E. & Perry, L. (2011), ‘Learning analytics and study behaviour: A pilot study’.
Picciano, A. G. (2014), ‘Big Data and Learning Analytics in Blended Learning
Environments: Benefits and Concerns’, International Journal of Artificial Intelligence
and Interactive Multimedia 2(7), 35–43.
Prieto, L. P., Sharma, K. & Dillenbourg, P. (2015), Studying Teacher Orchestration Load
in Technology-Enhanced Classrooms: A Mixed-Method Approach and Case Study, in
‘Proceedings of the 10th European Conference on Technology Enhanced Learning: Design
for Teaching and Learning in a Networked World’, Vol. 9307 of LNCS, Springer
International, Toledo, Spain, pp. 268–281.
Raca, M. & Dillenbourg, P. (2013), System for Assessing Classroom Attention, in
‘Proceedings of the 3rd International Learning Analytics & Knowledge Conference’.
Ram, A., Ai, H., Ram, P. & Sahay, S. (2011), Open Social Learning Communities,
in ‘Proceedings of the International Conference on Web Intelligence, Mining and
Semantics’, WIMS ’11, ACM, New York, NY, USA, pp. 2:1–2:6.
Rayón, A., Guenaga, M. & Núñez, A. (2014), Integrating and Visualizing Learner and
Social Data to Elicit Higher-order Indicators in SCALA Dashboard, in ‘Proceedings of the
14th International Conference on Knowledge Technologies and Data-driven Business’,
i-KNOW ’14, ACM, New York, NY, USA, pp. 28:1–28:4.
Richards, G. & DeVries, I. (2011), Revisiting Formative Evaluation: Dynamic Monitoring
for the Improvement of Learning Activity Design and Delivery, in ‘Proceedings of the 1st
International Conference on Learning Analytics and Knowledge’, LAK ’11, ACM, New
York, NY, USA, pp. 157–162.
Rodriguez Groba, A., Vazquez Barreiros, B., Lama, M., Gewerc, A. & Mucientes, M. (2014),
Using a learning analytics tool for evaluation in self-regulated learning, in ‘Frontiers in
Education Conference (FIE), 2014 IEEE’, IEEE, pp. 1–8.
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I. & Dimitriadis, Y. (2013),
‘Towards a Script-aware Monitoring Process of Computer-supported Collaborative
Learning Scenarios’, International Journal of Technology Enhanced Learning 5(2), 151–
167.
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I. & Dimitriadis, Y.
(2015), ‘Scripting and monitoring meet each other: Aligning learning analytics and
learning design to support teachers in orchestrating CSCL situations’, British Journal of
Educational Technology 46(2), 330–343.
Ruiz, S., Urretavizcaya, M. & Fernandez-Castro, I. (2013), Monitoring F2F interactions
through attendance control, in ‘2013 IEEE Frontiers in Education Conference’, pp. 226–
232.
Santos, J. L., Verbert, K., Govaerts, S. & Duval, E. (2013), Addressing Learner Issues
with StepUp!: An Evaluation, in ‘Proceedings of the Third International Conference on
Learning Analytics and Knowledge’, LAK ’13, ACM, New York, NY, USA, pp. 14–22.
Santos, J. L., Verbert, K., Klerkx, J., Duval, E., Charleer, S. & Ternier, S. (2015), ‘Tracking
data in open learning environments’, Journal of Universal Computer Science 21(7), 976–
996.
Scheffel, M., Niemann, K., Leony, D., Pardo, A., Schmitz, H.-C., Wolpers, M. &
Delgado Kloos, C. (2012), Key Action Extraction for Learning Analytics, in ‘Proceedings
of the 7th European Conference on Technology Enhanced Learning’, EC-TEL’12,
Springer-Verlag, Berlin, Heidelberg, pp. 320–333.
Schön, D. A. (1983), The reflective practitioner: How professionals think in action, Vol.
5126, Basic Books.
Sclater, N. (2008), ‘Web 2.0, personal learning environments, and the future of learning
management systems’, EDUCAUSE Research Bulletin.
Sharma, M. & Mavani, M. (2011), Development of Predictive Model in Education System:
Using Naïve Bayes Classifier, in ‘Proceedings of the International Conference &
Workshop on Emerging Trends in Technology’, ICWET ’11, ACM, New York, NY, USA,
pp. 185–186.
Shum, S. B. & Crick, R. D. (2012), Learning Dispositions and Transferable Competencies:
Pedagogy, Modelling and Learning Analytics, in ‘Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge’, LAK ’12, ACM, New York, NY,
USA, pp. 92–101.
Smith, S. D. & Borreson Caruso, J. (2010), ‘The ECAR study of undergraduate students
and information technology’, EDUCAUSE Center for Applied Research (October), 1–13.
So, H.-J. & Brush, T. A. (2008), ‘Student perceptions of collaborative learning, social
presence and satisfaction in a blended learning environment: Relationships and critical
factors’, Computers & Education 51(1), 318–336.
Sutherland, R., Eagle, S. & Joubert, M. (2012), A vision and strategy for Technology
Enhanced Learning, Report from the STELLAR Network of Excellence.
Tempelaar, D. T., Heck, A., Cuypers, H., van der Kooij, H. & van de Vrie, E. (2013),
Formative assessment and learning analytics, in ‘Proceedings of the Third International
Conference on Learning Analytics and Knowledge’, ACM, pp. 205–209.
The New Media Consortium (2013), NMC Horizon Project Preview: 2013 Higher
Education Edition, Technical report, The New Media Consortium.
Tibola, L., Schaf, F. & Pereira, C. (2012), Engineering educational cockpit: Visualization by
integration of heterogeneous environments, in ‘2012 6th IEEE International Conference
on e-Learning in Industrial Electronics (ICELIE)’, pp. 18–23.
Vavoula, G. N. & Sharples, M. (2009), ‘Meeting the challenges in evaluating mobile
learning: a 3-level evaluation framework’, International Journal of Mobile and Blended
Learning 1(2), 54–75.
Verbert, K., Duval, E., Klerkx, J., Govaerts, S. & Santos, J. L. (2013), ‘Learning analytics
dashboard applications’, American Behavioral Scientist 57(10), 1500–1509.
Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G. & Klerkx, J.
(2014), ‘Learning Dashboards: An Overview and Future Research Opportunities’, Personal
and Ubiquitous Computing 18(6), 1499–1514.
Vozniuk, A., Govaerts, S. & Gillet, D. (2013), Towards portable learning analytics
dashboards, in ‘International Conference on Advanced Learning Technologies’.
Vozniuk, A., Rodríguez-Triana, M. J., Holzer, A., Govaerts, S., Sandoz, D. & Gillet, D.
(2015), Contextual Learning Analytics Apps to Create Awareness in Blended Inquiry
Learning, in ‘14th International Conference on Information Technology Based Higher
Education and Training’.
Worsley, M. & Blikstein, P. (2015), Leveraging multimodal learning analytics to
differentiate student learning strategies, in ‘Proceedings of the Fifth International
Conference on Learning Analytics And Knowledge’, LAK ’15, ACM, New York, NY,
USA, pp. 360–367.
Yen, C.-H. (2011), An Analytic Study and Modeling of Online Asynchronous Instruction
via the Notion of Interaction Profiles, in ‘Proceedings of the 4th International Conference
on Hybrid Learning’, ICHL’11, Springer-Verlag, Berlin, Heidelberg, pp. 325–335.
Yengin, I., Karahoca, D., Karahoca, A. & Yücel, A. (2010), ‘Roles of teachers in e-learning:
How to engage students & how to get free e-learning and the future’, Procedia - Social
and Behavioral Sciences 2(2), 5775–5787.
Notes
1. ACM: dl.acm.org (last visit: 29/10/2015)
2. AISeL: http://aisel.aisnet.org/ (last visit: 29/10/2015)
3. IEEE: http://ieeexplore.ieee.org (last visit: 29/10/2015)
4. SpringerLink: http://link.springer.com (last visit: 29/10/2015)
5. Science Direct: http://www.sciencedirect.com (last visit: 29/10/2015)
6. Wiley: http://onlinelibrary.wiley.com (last visit: 29/10/2015)
7. ARTEL 2011: http://ceur-ws.org/Vol-790/ (last visit: 29/10/2015)
8. ARTEL 2012: http://ceur-ws.org/Vol-931/ (last visit: 29/10/2015)
9. ARTEL 2013: http://ceur-ws.org/Vol-1103/ (last visit: 29/10/2015)
10. ARTEL 2014: http://ceur-ws.org/Vol-1238/ (last visit: 29/10/2015)
11. Google Scholar: https://scholar.google.ch (last visit: 29/10/2015)
12. The review form: http://goo.gl/forms/bYYIZvq2Ym