Companion Proceedings 10th International Conference on Learning Analytics & Knowledge (LAK20)
Creative Commons License, Attribution - NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
Unpacking the Bi-directional relationship between learning
analytics and learning design in blended learning environments
Rogers Kaliisa
Department of Education, University of Oslo, Norway
ABSTRACT: This paper suggests a technology-supported teacher-led approach that includes
leveraging learning analytics (LA) to support data-informed learning design (LD) decisions in
blended learning environments. The context of this study is three to five blended Bachelor's
courses using the Canvas learning management system (LMS) at two public universities in
Norway. This is a design-based research study, employing quantitative ethnography
approaches. Data will be collected from multiple sources, i.e., course analytics, discussion
forums, interviews, teachers' LD representations, and in-class observations. The analysis will
be conducted using social and epistemic network analysis, automated discourse analysis,
inferential statistics and inductive thematic analysis. This PhD project is anticipated to
contribute towards an empirically based theoretical discussion about the potential
affordances of LA to transform LD from a craft into a sounder and more evidence-based field
of research and practice.
Keywords: Learning analytics, learning design, Canvas, design-based research
Learning design (LD), which in this study encompasses “tasks, assessments, learning environments,
and resources needed to promote effective interactions between teachers and students and students
and students to support learning” (Goodyear & Yang, 2009, p.168), plays an important role in creating
an effective learning environment. An LD illustrates the learning objective of a unit of study and is
thus useful to teachers and learning designers in supporting them to document their practice
(Agostinho, 2011) and improve student learning (Mor, Ferguson, & Wasson, 2015). The common
features of all LDs include identifying the key actors (teachers and students), the representations and
expectations of each stakeholder (teaching and learning tasks), the resources needed, and the
schedule of activities (Lockyer, Heathcote, & Dawson, 2013). Nonetheless, although LD has the
potential to make pedagogical intentions explicit, it does not always follow an iterative process,
which is the hallmark of design, nor does it take into account how students engage with the current
course at a fine-grained level of analysis. It also fails to capture how much learning takes place
during and after the learning process as specified in the design (Lockyer et al., 2013). Consequently,
teachers and learning designers rely on summative assessments (coarse-grained analysis) such as
end-of-term examinations, course evaluations/surveys, in-class observations, and their previous
experience to retrospectively make decisions regarding how best to teach their subjects to the next
cohort of students (Persico & Pozzi, 2015). However, with such an approach, little support is given to
current students, as changes within the course are only possible and relevant for the next cohort of
students (Persico & Pozzi, 2015). Moreover, such methods are prone to bias and therefore yield
less objective results (Rienties, Cross, & Zdrahal, 2017). One way to address this challenge
is to use more objective and proactive methods that evaluate students' learning in real time and
enable teachers to make timely, informed educational decisions.
Recently, the increasing adoption of educational technologies, e.g., learning management systems
(LMS), online learning approaches such as MOOCs, and content-based learning environments, has
led to a greater quantity of analyzable learning data and given birth to the field of learning analytics
(LA) (Siemens & Long, 2011). These kinds of data, if suitably collected and analyzed, offer more
objectivity to the design process by providing immediate feedback and proactive evaluation of
students’ learning (Persico & Pozzi, 2015). It has been argued that this provides a good base for
teachers to make timely, informed educational decisions about redesigning and improving a course
and to gain valuable insights into how students react to different learning designs (Nguyen, Rienties,
Toetenel, Ferguson, & Whitelock, 2017). The rich and fine-grained data about students’ learning
behaviours provide teachers with important insights into how students react to different designs,
thus allowing educators to make personalized interventions (Rienties et al., 2017). In light of this,
within the learning analytics and knowledge community (LAK), there is an increasing interest in
exploring the dynamics between LA and LD (Lockyer et al., 2013; Rienties et al., 2017).
The interplay between LA and LD has gained considerable interest over the past few years. For
example, Rienties et al. (2017) evaluated the weekly LD data of 2,111 learners in four language
studies classes and found that the teachers’ course design explained 55% of the variance in weekly
online engagement. In another study, Rienties and Toetenel (2016) linked 151 modules taught at
the Open University (OU), in which 111,256 students were enrolled, and found that LD was a strong
predictor of student satisfaction. A similar approach was taken by Nguyen et al. (2017), who studied
74 modules to examine the impact of assessment design on students’ engagement, focusing on
fine-grained weekly LD data. Their study indicated that the course workload for other activities
diminished after assessment activities were introduced. Moreover, Haya, Daems, Malzahn,
Castellanos, and Hoppe (2015) demonstrated the value of an approach that combines social
networks and content analysis to support LD decisions by providing indicators that support teachers
in their assessment of their LDs.
In another example, McKenney and Mor (2015) argued that the retrospective analysis of LA
can support pedagogy-driven data collection and analysis, which could, in turn, offer insight into
learning and teaching practices. Meanwhile, Michos, Hernández-Leo, and Albó (2018) more
recently explored the connection between LD and data-informed reflection in school
environments. Findings from this study indicate that LA was useful in connecting pedagogical
intentions and collective reflective practices in school environments.
Recent research has begun to synthesize the corpus of existing research that explores the
connection between LA and LD. For instance, Mangaroska and Giannakos (2018) reviewed 43
empirical studies on LA for LD; they depicted ongoing design patterns and detected learning
phenomena (i.e. moments of learning or misconception) arising from the connection between LA
and LD. Moreover, to aid LA-LD alignment, other research has focused on providing tools and
conceptual frameworks to inform the connection between LA and LD (e.g., Bakharia et al., 2016;
Hernández-Leo, Martinez-Maldonado, Pardo, Muñoz-Cristóbal, & Rodríguez-Triana, 2019; Lockyer
et al., 2013; Persico & Pozzi, 2015) within online and physical learning settings.
However, despite the increasing interest in exploring the dynamics between LA and LD, my
literature review shows that the number of empirical studies on the subject is still limited.
In particular, there is a dearth of evidence to explain how LA is deployed iteratively by instructors
to reflect and make informed decisions on their own course designs and to tailor individualized
student support. Also, there is little research on how individual students' fine-grained virtual learning environment (VLE)
engagement at the activity level can facilitate the customization of LD. For example, even though
research at the Open University (UK) has linked large data sets with students’ VLE behaviour, the
large log files analyzed make it hard to integrate fine-grained data. Similarly, existing LA and LD
studies have not explicitly considered combining digital traces and content-based data (i.e.,
discussion posts) as a valuable resource in redesigning courses, with inferences only based on trace
data (Rienties et al., 2017; Rienties & Toetenel, 2016). Moreover, most current LA and LD studies
have been tested only in a distance-learning setting at one particular, non-traditional university,
mainly through advanced statistical approaches. Even less research (see
Michos, Hernández-Leo, & Albó, 2018) has sought information about teachers' experiences with
aligning LA and LD based on their generated outputs. The apparent scarcity of studies that use
content data and teachers’ experiences to acquire a holistic understanding of the connection
between LA and LD seems contrary to the documented evidence of utilizing different datasets to
offer comprehensive insights and practical comments to support informed future course
improvements (Mangaroska & Giannakos, 2018). Therefore, with the motivation to address these
research problems, this proposal sketches a study in a traditional blended/face-to-face learning
environment using a design-based approach to allow a closer connection of LA with interventionist
types of educational research.
The aim of this doctoral study is threefold. Firstly, to understand the current teacher practice of LD
and the perceived potential of LA to support LD decisions across different disciplines at two large
public universities in Norway. Secondly, to explore existing LA models and frameworks and
assess their potential to support data-informed LD decisions. Lastly, by building on previous research (Rienties
& Toetenel, 2016), my dissertation aims to explore how LA can support data-informed learning
design decisions by teachers and how these affect students’ learning experiences and performance.
The overall research question is: To what extent does a technology-supported teacher-led approach
that includes the use of LA help higher education teachers to make data-informed learning design
decisions? This question will be investigated through the following specific research questions:
1. What is the current teacher practice of LD, and what is the state of awareness, acceptance, needs
and beliefs about applying LA to support learning design decisions?
2. What are the features and relevance of existing LA frameworks in helping teachers to
overcome the challenges of LA adoption in their everyday practice?
3. What are the opportunities of LA in terms of generating relevant insights about students'
online learning processes which teachers can use to make timely and informed pedagogical decisions?
4. How can teachers refine, change or adapt the course design while the experience is being
delivered, using detailed data and representations captured by LA tools and techniques?
The central methodological framework guiding this research project is design-based research (DBR).
Thus, the study will involve multiple iterations with the aim of understanding possible ways of
improving teachers’ LD practice through the use of LA. This study will employ quantitative
ethnography (QE) approaches such as epistemic network analysis, which will be used to analyze
students’ online discussions and construct models of student learning that are visualized as network
graphs, and mathematical representations of students’ patterns of connections (Shaffer, 2017). The
primary sources of data are course analytics data (e.g., activity metrics, and discussion forum posts)
collected through the Canvas LMS and representations of teachers’ learning designs as visualized on
the Canvas LMS. This will be followed by the collection of qualitative data (i.e. interviews with
students and teachers) to investigate the implicit meanings/micro-processes and patterns from
quantitative analytics data (e.g., why students access certain sites). The student and course weekly
statistics will be the unit of analysis to examine student learning behaviour, thus promoting a ‘grain
size’ approach in this study. The study will take place at two large public universities in Norway.
Both universities offer courses through a traditional face-to-face approach, supplemented by the
Canvas web-based learning management system. To ensure cross-
disciplinary representation, the intervention courses will be selected from across the social sciences,
arts and science disciplines. Later, results will be compared and aggregated into a body of
knowledge to understand the effect of different learning designs on students’ learning experiences
and performance. It is hoped that this will lead to the identification of good practices in each of the
cases and contribute to a community of inquiry (Mor, Ferguson, & Wasson, 2015) at the two
universities. While the findings from these two cases may not be generalizable to other contexts, I
expect to generate relevant lessons that can be extrapolated with caution elsewhere. To aid the
analysis and interpretation of empirical findings, this study will be grounded in a pragmatic, socio-
cultural perspective (Knight et al., 2014). Ethical clearance will be obtained by following the
guidelines of the Norwegian Centre for Research Data (NSD) and the General Data Protection Regulation (GDPR).
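To make the epistemic network analysis step concrete, here is a minimal sketch in Python; the codes ("claim", "evidence", "theory"), the window size, and the data shape are illustrative assumptions, not the project's actual coding scheme or any ENA toolkit's API:

```python
from collections import Counter
from itertools import combinations

def epistemic_network(coded_posts, window=2):
    """Build a weighted co-occurrence network over discourse codes:
    two codes are connected when they appear within the same moving
    window of posts, and edge counts are normalized into weights.
    `coded_posts` is a list of sets of codes, one set per post."""
    edges = Counter()
    for i in range(len(coded_posts)):
        recent = coded_posts[max(0, i - window + 1): i + 1]
        codes = set().union(*recent)
        for a, b in combinations(sorted(codes), 2):
            edges[(a, b)] += 1
    total = sum(edges.values())
    return {pair: n / total for pair, n in edges.items()} if total else {}

# Hypothetical codes assigned to four consecutive discussion posts
posts = [{"claim"}, {"claim", "evidence"}, {"theory"}, {"evidence"}]
network = epistemic_network(posts, window=2)
```

Edge weights of this kind are what the network graphs and mathematical representations of students' patterns of connections summarize for each student or group.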
During the first half of my Ph.D., I have conducted four studies in response to the first three
research questions. Study 1: Current state of LA and LD use: To establish a theoretical basis for my
PhD project, I conducted a qualitative investigation with 16 teachers at two large public universities
in Norway. The main objective was to understand teachers’ current practice of LD, their awareness
and perceptions about LA, and whether they perceive the connection between LA and LD useful in
their everyday practice. Overall, teachers were positive about LA but were also critical of the
relevance of LA outputs and feared an increase in their workload. The findings also revealed that
teachers mainly rely on student evaluations, summative feedback and personal experiences to make
LD decisions. These findings are in harmony with the core aim of my PhD project which seeks to
leverage LA to support teachers with data-informed LD decisions. The main contribution of Study 1
is the proposed Bi-directional LA-LD conceptual framework, which considers the synergistic
relationship between LA and LD (Paper under review).
Study 2: Review of LA frameworks: This paper presents the results of a review of 18
frameworks of relevance to teacher adoption of learning analytics (LA), and discusses how these
frameworks have tried to address prominent challenges in the adoption of LA through the lens of
relevant literature on the conceptualization of LA adoption. The results show that researchers have
made significant advances in developing appropriate frameworks and tools to conceptualize LA
adoption at the practitioner level. It was also revealed that LA frameworks have considerably
advanced in connecting LA and learning theory. However, the analysis also showed a shortage of
explicit guidelines on the required competencies for LA adoption, and strategies to improve inter-
stakeholder communication. Moreover, the review highlights the need to empirically validate,
elaborate and put into use the most promising existing frameworks (Paper under review).
Study 3: Exploring social learning analytics (SLA) to inform learning and teaching decisions: This
study explored how SLA can be used as a proxy by teachers to understand students' learning
processes and to support them in making informed pedagogical decisions. The findings revealed
that SLA provides insight and an overview of the students' cognitive and social learning processes in
online learning environments. This exploratory study contributes to an improved conceptual
understanding of SLA and details some of the methodological implications of an LA approach to
enhance teaching and learning decisions in online and blended learning environments (Kaliisa,
Mørch, & Kluge, 2019).
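As an illustration of the social network analysis used for SLA of this kind, the sketch below computes normalized degree centrality from forum reply pairs; the student names and reply data are hypothetical, and in practice a dedicated library (e.g., NetworkX) or tool would be used instead:

```python
from collections import defaultdict

def degree_centrality(replies):
    """Normalized degree centrality over an undirected forum reply
    network; `replies` is a list of (author, replied_to) pairs.
    Each student's score is their number of distinct discussion
    partners divided by the maximum possible (n - 1)."""
    neighbors = defaultdict(set)
    for author, target in replies:
        if author != target:
            neighbors[author].add(target)
            neighbors[target].add(author)
    n = len(neighbors)
    if n < 2:
        return {}
    return {student: len(links) / (n - 1) for student, links in neighbors.items()}

# Hypothetical reply pairs from one discussion thread
replies = [("ana", "ben"), ("cai", "ben"), ("ana", "cai"), ("dee", "ana")]
centrality = degree_centrality(replies)
```

A ranking of such scores gives the teacher a quick proxy for which students sit at the centre or the periphery of an online discussion.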
Study 4: Combining checkpoint and process learning analytics to support learning design: This
study explored the potential of LA to inform LD and how LA outputs are experienced by teachers in a
blended learning context. Findings showed that valuable connections between LA and LD require a
detailed analysis of students’ checkpoint (i.e., online logins; see Appendix Fig. 1) and process analytics
(i.e., online content and interaction dynamics) to find meaningful learning behaviour patterns that
can be presented to the teachers to support design adjustments. Moreover, teachers found LA
visualizations valuable to understand students’ online learning processes but also argued for the
timely sharing of LA visualizations in a simplified interpretable format. The results of this study will
be used as input for the next steps (i.e. developing and testing an LA-LD prototype) in authentic
learning environments (Paper under review).
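A minimal sketch of the checkpoint side of such an analysis, assuming login events arrive as simple (student, date) pairs; the identifiers and dates are hypothetical:

```python
from collections import Counter
from datetime import date

def weekly_checkpoints(login_events, course_start):
    """Aggregate raw login events into per-student counts per course
    week (checkpoint analytics); `login_events` is a list of
    (student_id, login_date) pairs."""
    counts = Counter()
    for student, day in login_events:
        week = (day - course_start).days // 7 + 1  # course weeks start at 1
        counts[(student, week)] += 1
    return counts

# Hypothetical login log for a course starting Monday 6 Jan 2020
start = date(2020, 1, 6)
events = [("s1", date(2020, 1, 6)), ("s1", date(2020, 1, 8)),
          ("s2", date(2020, 1, 14)), ("s1", date(2020, 1, 15))]
weekly = weekly_checkpoints(events, start)
```

Tables like this are the kind of simple, interpretable summary the teachers in Study 4 asked for, before the finer-grained process analytics are layered on top.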
Study 5: Developing an LA prototype for LD with run-time application: This phase will involve the
development of a research prototype to support the alignment between LA and LD. This will later be
applied in real practice to assess the extent to which the detailed data captured by the prototype
can support teachers with making real-time adjustments to the LD during the run of the course. The
design of the tool will be guided by the insights from Studies 1-3 and the recommendations
provided by teachers in Study 4 (i.e., providing teachers with simple LA visualizations that hide
unnecessary complexity while remaining open to interpretation). This phase will respond to the main PhD
research question: How can teachers refine, change or adapt the course design while the experience
is being delivered using detailed data and representations captured by LA tools and techniques?
Study 6: Evaluation of the LA-LD prototype: Lastly, an evaluation will be conducted to assess the
extent to which the prototype supported teachers with LD decisions. The results from this study will
guide future iterations, improvement of the prototype into a learning analytics-learning design tool,
and refinement of the Bi-directional LA-LD conceptual framework proposed in Study 1, which
considers the bi-directional relationship between LD and LA methods. In other words, the data
captured may not only affect LD adaptation but also the type of data to collect and how it is to be
structured (i.e. data capturing, sense-making etc.). This means that valuable recommendations for
teachers and researchers might be generated.
The expected contribution of this PhD project is threefold: (i) conceptual (i.e., developing an
empirically grounded LA-LD tool and conceptual framework); (ii) empirical (contributing towards an
empirically based theoretical discussion about the potential of LA to inform LD decisions in
authentic learning environments, together with guidelines for practitioners who develop curricula,
technology developers and LA researchers); and (iii) methodological (using DBR, quantitative
ethnography, and theoretically grounded computational tools). This is an important contribution,
since rigorous qualitative and design-based research is required to yield actionable insights, explain
the identified patterns, and spell out explicitly how LA approaches can be used in the different
phases of design-based research.
REFERENCES
Agostinho, S. (2011). The use of a visual learning design representation to support the design process of
teaching in higher education. Australasian Journal of Educational Technology, 27(6).
Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gašević, D., Mulder, R., & Lockyer, L. (2016, April). A
conceptual framework linking learning design with learning analytics. In Proceedings of the Sixth
International Conference on Learning Analytics & Knowledge (pp. 329-338). ACM.
Goodyear, P., & Yang, D. F. (2009). Patterns and pattern languages in educational design. In Handbook of
research on learning design and learning objects: Issues, applications, and technologies (pp. 167-187).
IGI Global.
Haya, P. A., Daems, O., Malzahn, N., Castellanos, J., & Hoppe, H. U. (2015). Analyzing content and patterns of
interaction for improving the learning design of networked learning environments. British Journal of
Educational Technology, 46(2), 300-316. doi:10.1111/bjet.12264
Hernández-Leo, D., Martinez-Maldonado, R., Pardo, A., Muñoz-Cristóbal, J. A., & Rodríguez-Triana, M. J. (2019).
Analytics for learning design: A layered framework and tools. British Journal of Educational
Technology, 50(1), 139-152.
Kaliisa, R., Mørch, A. I., & Kluge, A. (2019). Exploring social learning analytics to support teaching and learning
decisions in online learning environments. Paper presented at the European Conference on
Technology Enhanced Learning.
Knight, S., Shum, S. B., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets
analytics in the middle space. Journal of Learning Analytics, 1(2), 23-47.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with
learning design. American Behavioral Scientist, 57(10), 1439-1459.
Mangaroska, K., & Giannakos, M. N. (2018). Learning analytics for learning design: A systematic literature
review of analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies.
doi: 10.1109/TLT.2018.2868673
McKenney, S., & Mor, Y. (2015). Supporting teachers in data-informed educational design. British Journal of
Educational Technology, 46(2), 265-279.
Michos, K., Hernández-Leo, D., & Albó, L. (2018). Teacher-led inquiry in technology-supported school
communities. British Journal of Educational Technology, 49(6), 1077-1095.
Mor, Y., Ferguson, R., & Wasson, B. (2015). Learning design, teacher inquiry into student learning and learning
analytics: A call for action. British Journal of Educational Technology, 46(2), 221-229.
Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining the designs of computer-
based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in
Human Behavior, 76, 703-714.
Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher
inquiry. British Journal of Educational Technology, 46(2), 230-248.
Rienties, B., Cross, S., & Zdrahal, Z. (2017). Implementing a Learning Analytics Intervention and Evaluation
Framework: what works? In Big Data and Learning Analytics in Higher Education (pp. 147-166).
Springer International Publishing
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and
performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60.
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE
review, 46(5), 30.
Learning Analytics is an emerging research field and design discipline that occupies the “middle space” between the learning sciences/educational research and the use of computational techniques to capture and analyze data (Suthers & Verbert, 2013). We propose that the literature examining the triadic relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching), and assessment provide critical considerations for bounding this middle space. We provide examples to illustrate the ways in which the understandings of particular analytics are informed by this triad. As a detailed worked example of how one might design analytics to scaffold a specific form of higher order learning, we focus on the construct of epistemic beliefs: beliefs about the nature of knowledge. We argue that analytics grounded in a pragmatic, socio-cultural perspective are well placed to explore this construct using discourse-centric technologies. The examples provided throughout this paper, through emphasizing the consideration of intentional design issues in the middle space, underscore the “interpretative flexibility” (Hamilton & Feenberg, 2005) of new technologies, including analytics.