Educ. Sci. 2019, 9, 91; doi:10.3390/educsci9020091 www.mdpi.com/journal/education
Review
A Conversation between Learning Design and
Classroom Observations: A Systematic
Literature Review
Maka Eradze *, María Jesús Rodríguez-Triana and Mart Laanpere
School of Digital Technologies, Tallinn University, Tallinn 10120, Estonia; maka@tlu.ee
* Correspondence: maka@tlu.ee
Received: 22 March 2019; Accepted: 23 April 2019; Published: 26 April 2019
Abstract: Learning Design, as a field of research, provides practitioners with guidelines towards
more effective teaching and learning. In parallel, observational methods (manual or automated)
have been used in the classroom to reflect on and refine teaching and learning, often in combination
with other data sources (such as surveys and interviews). Despite the fact that both Learning Design
and classroom observation aim to support teaching and learning practices (a priori and a posteriori,
respectively), they are often not aligned. To better understand the potential synergies between these
two strategies, this paper reports on a systematic literature review based on 24 works that connect
learning design and classroom observations. The review analyses the purposes of the studies, the
stakeholders involved, the methodological aspects of the studies, and how design and observations
are connected. This review reveals the need for computer-interpretable documented designs; the
lack of reported systematic approaches and technological support to connect the (multimodal)
observations with the corresponding learning designs; and, the predominance of human-mediated
observations of the physical space, whose applicability and scalability are limited by the human
resources available. The adoption of ICT tools to support the design process would contribute to
extracting the context of the observations and the pedagogical framework for the analysis.
Moreover, extending the traditional manual observations with Multimodal Learning Analytics
techniques would not only reduce the observation burden but also support systematic data
collection, integration, and analysis, especially in semi-structured and structured studies.
Keywords: learning design; multimodal learning analytics; classroom observations;
evidence-based practice
1. Introduction
Learning Design or Design for Learning [1], as a field of educational research and practice, aims
to improve the effectiveness of learning, e.g., helping teachers to create and make explicit their own
designs [2]. A similar term, “learning design”, is also used to refer either to the creative process of
designing a learning activity or to the artefact resulting from such a process [3]. Despite this emphasis
on the creation of learning designs, there is a lack of frameworks to evaluate the implementation of
the designs in the classroom [4]. Moreover, in order to evaluate the implementation of learning
design, there is a need for evidence coming from those digital or physical spaces where teaching and
learning processes take place [5].
Observations (or observational methods) have been traditionally used by researchers and
practitioners to support awareness and reflection [6,7]. Especially in educational contexts that occupy,
fully or partially, physical spaces, observations offer an insight not easily available through other
data sources (e.g., surveys, interviews, or teacher and student journals). Indeed, since human
observations are limited to what the eye can see and are constrained by the human resources available,
automated observations can provide a complementary view and a lower-effort solution, especially
when the learning scenario is supported by technology [8,9]. Thus, the integration of manual and
automated observations with other data sources [10] offers a more complete and triangulated picture
of the teaching and learning processes [11].
Interestingly, while both learning design (hereafter, LD) and classroom observation (CO) pursue
the support of teaching and learning practices, they are often not aligned. To better understand why
and how LD and CO have been connected in the existing literature, this paper reports on the results
obtained from a systematic literature review. More precisely, this paper explores the nature of the
observations, how researchers establish the relationship between LD and CO, and how this link
is implemented in practice. The lessons learnt from the literature review then led us to identify open
issues and future directions to be addressed by the research community.
Out of 2793 papers obtained from different well-known databases in the area of technology-
enhanced learning, 24 articles were finally considered for the review. In the following sections, we
introduce related works that motivated this study, describe the research methodology followed
during the review process, and finally, discuss the results obtained in relation to the research
questions that guided the study.
2. Supporting Teaching Practice through Learning Design and Classroom Observation
While Learning Design refers to the field of educational research and practice, different
connotations are linked to the term ‘learning design’ (without capitals) [5,12–14]. According to some
authors, learning design (LD) can be seen as a product or an artefact that describes the sequence of
teaching and learning activities [5,15–17], including the actors’ roles, activities, and environments as
well as the relations between them [18]. At the same time, learning design is also referred to as the
process of designing a learning activity and/or creating the artefacts that describe the learning
activity [1,13,19]. In this paper, we will reflect not only on the artefact but also on the process of designing
for learning, trying to clarify which of them is connected with classroom observations, and how.
While designing for learning, practitioners develop hypotheses about the teaching and learning
process [20]. The collection of evidence during the enactment to test these hypotheses contributes to
orchestration tasks (e.g., by detecting deviations from the teacher's expectations that may require
regulation), to teacher professional development (leading to a better understanding and
refinement of the teaching and learning practices) [16,21], and to decision making at the
institutional level (e.g., in order to measure the impact of designs and react upon them) [22].
However, the support available for teachers for design evaluation is still low [4] and, as Ertmer et al.
note, scarce research is devoted to evaluating the designs [23].
In a parallel effort to support teaching and learning, classroom observation (CO) contributes to
refining and reflecting on those practices. CO is a “non-judgmental description of classroom events
that can be analysed and given interpretation” [24]. Through observations, we can gather data on
individual behaviours, interactions, or the educational setting both in physical and digital
spaces [8,25] using multiple machine- and human-driven data collection techniques (such as surveys,
interviews, activity tracking, teaching and learning content repositories, or classroom and wearable
sensors). Indeed, Multimodal Learning Analytics (MMLA) solutions can be seen as "modern"
observational approaches suitable for physical and digital spaces [26], used, for instance, to infer the
classroom climate [27], to observe technology-enhanced learning [28], or to examine the differences
between human- and machine-generated data for the design of LA systems [29].
According to the literature on observational methods, the design of the observation should be aligned with
the planned activities [30], which, in the case of classroom observations, are described in the
learning design. Moreover, observers must be aware of the context where the teaching and learning
processes take place, including, among others, the subjects and objects involved. Again, this need for
context awareness can be satisfied with the details provided in the LD artefacts [31]. Finally, going
one step further, the context and the design decisions may guide the analysis of the
observations [6,32].
Another main aspect of the observations is the protocol guiding the data collection.
Unstructured protocols provide observers with full expressivity to describe what they see, at the
risk of producing large volumes of unstructured data that are difficult and time-consuming to
interpret [33]. By contrast, structured observations are less expressive but more amenable to
automation with context-aware technological means; they reduce the observation effort
and tend to be more accurate in systematic data gathering [34], which allows for more efficient
data processing [35] and eases the integration with other sources in multimodal datasets, thus
enabling data triangulation [36].
From the (automatic) data gathering and analysis perspective, LD artefacts have been used in
the area of LA to contextualise the analysis [37,38] and LD processes to customise such solutions [39].
Symmetrically, both the field of Learning Design and its practitioners benefit from this
symbiosis [5]: for example, new theories can be derived by analysing the design process or by assessing
the impact of the artefacts on learning. Thus, classroom observations (beyond the mere data gathering and
analysis technique) could profit from similar synergies with LD processes and artefacts, as some
authors have already pointed out [23].
3. Research Questions and Methodology
In order to better understand how learning design and classroom observation have been
connected in the existing literature, we carried out a systematic literature review [40] to answer the
following research questions:
RQ1: What is the nature of the observations (e.g., stakeholders, unit of analysis, observation types,
when the coding is done, research design, complementary sources for data triangulation,
limitations of observations and technological support)?
RQ2: What are the purposes of the studies connecting learning design and classroom observations?
RQ3: What is the relationship between learning design and classroom observations established at the
methodological, practical and technical levels?
RQ4: What are the important open issues and future lines of work?
While the first three research questions are descriptive, mapping the existing reality based on
research and theoretical works, the last research question is prescriptive: by identifying the gaps in
the literature based on the corresponding limitations and research results, we offer future research
directions.
To answer these research questions, we selected six main academic databases in Technology
Enhanced Learning: IEEE Xplore, Scopus, AISEL, Wiley, ACM, and ScienceDirect. Additionally, Google
Scholar (top 100 papers out of 15500 hits) was added in order to detect “grey literature” not indexed
in most common literature databases but potentially relevant to assess the state of a research field.
After taking into account alternative spellings, the resulting query was: ("classroom
observation*" OR "lesson observation*" OR "observational method*") AND ("learning design" OR
"design for learning" OR "lesson plan" OR "instructional design" OR scripting). Aside from this, the
first part of the query covers the different possible uses of the term "observation", whereas the
second part reflects the established differences in the use of the related concepts "learning design"
and "design for learning" [19], as discussed in the previous section. "Instructional design", which has
a different origin but is sometimes used interchangeably [3], and "scripting" [36] were also included
as widely used terms.
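For illustration purposes only, the query above could be applied programmatically to bibliographic records along the lines of the following sketch; the record structure and field names are hypothetical and do not correspond to the export format of any of the databases listed above.

```python
import re

# The two AND-ed clause groups of the review query; the trailing wildcard (*)
# of the original query strings is rendered here as \w*.
OBSERVATION_TERMS = re.compile(
    r"classroom observation\w*|lesson observation\w*|observational method\w*",
    re.IGNORECASE,
)
DESIGN_TERMS = re.compile(
    r"learning design|design for learning|lesson plan|instructional design|scripting",
    re.IGNORECASE,
)

def matches_query(record: dict) -> bool:
    """True if title, abstract, and keywords jointly satisfy both clause groups."""
    text = " ".join([
        record.get("title", ""),
        record.get("abstract", ""),
        " ".join(record.get("keywords", [])),
    ])
    return bool(OBSERVATION_TERMS.search(text) and DESIGN_TERMS.search(text))

# Hypothetical record:
paper = {
    "title": "Aligning lesson plans with classroom observations",
    "abstract": "A study connecting learning design and observational methods.",
    "keywords": ["learning design", "classroom observation"],
}
print(matches_query(paper))  # True
```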
The query was run on 15 March 2018. To select the suitable papers, we followed the PRISMA
statement [41], a guideline and process used for rigorous systematic literature reviews. Although
several papers contained these keywords in the body of the paper, we narrowed the search down to
title, abstract, and keywords, aiming for those papers where these terms could have a more significant
role in the contribution. Therefore, whenever the search engine allowed it, the query was applied
to title, abstract, and keywords, obtaining a total of 2793 items from the different databases. After the
duplicates were removed, we ended up with 2392 papers. Then, to apply the same criteria to all
papers, we conducted a manual secondary title/abstract/keyword filtering, obtaining 81 publications.
Finally, abstracts and full papers were reviewed, excluding those that were not relevant for our
research purpose (i.e., no direct link between LD and observations; 43 papers) or not accessible (the
paper could neither be found on the internet nor provided by the authors; 14 papers). In the end,
24 papers were selected for in-depth analysis.
The analysis of the articles was guided by the research questions listed previously. According to
the content analysis method [42], we applied inductive reasoning followed by iterative deductive
analysis. While the codes in some categories were predefined, others emerged during the analysis
(e.g., when identifying complementary data-sources, or when eliciting the influence that LD has on
CO and vice versa). As a result, the articles were fully read and (re)coded through three iterations.
Figure 1 provides an overview of the codification scheme, showing the categories, the relations and
whether the codes were predefined (using normal font) or emerged during the process (in italics).
Figure 1. Topics analysed during the paper review and corresponding categories. In bold, central
concepts of the analysis. Normal font indicates predefined codes derived deductively. In italics,
emerging codes derived inductively.
It should be noted that both predefined and emerging codes were agreed on between the
researchers. Although a single researcher did the coding, the second author was involved in
ambiguous cases. In most of the cases, the content analysis only required identifying the topics and
categories depicted in Figure 1. However, in some cases it was necessary to infer categories such
as the unit of analysis, which had to be identified based on the research methodology information
(further details are provided in the following section).
4. Results and Findings
Table 1 shows the main results of our analysis, including the codification assigned per article.
While the main goal was to identify empirical works, we also included theoretical papers in the
analysis since they could provide relevant input for the research questions. More concretely, out of
the 24 papers, we identified 3 without empirical evidence: Adams et al. [43] provide guidelines for a
classroom observation scale, and the other two, by Eradze and Laanpere [45] and Eradze et al. [46],
reflect on the connections between classroom observations and learning analytics. This section
summarises the findings of the systematic review, organised along our four research questions.
Table 1. Overview of the reviewed papers.

| Reference | Data subjects | Data objects | Unit of analysis | Observation type | Coding time | Complementary sources | Design of the study | Aim of the study/paper | LD guides CO | CO informs LD | Limitations of the observations |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Adams et al., 2012 [43] | H | T, S | A | S | RT | I, DA | QT | TPD | D | A | NA |
| Anderson, 2015 [44] | H | NA | E | NA | NA | DA | NA | TPD | D, A | A | NA |
| Eradze & Laanpere, 2017 [45] | H, A | T, S | E | SS, S, U | RT | -- | QT, QL | TPD, O, CI, CL, U | D, A | P, A | T, C |
| Eradze et al., 2017 [46] | H, A | T, S | E | SS, S, U | RT | -- | QT, QL | TPD, O, CI, CL, U | D, A | P, A | NA |
| Freedman et al., 2012 [47] | H | T, S | NA | SS | AP | I | QL | O, U | A | P | NA |
| Ghazali et al., 2010 [48] | H | T, S | A, E | U | AP | I, DA | QL | TPD, U | D | F | NA |
| Hernández et al., 2015 [49] | H | T, S | A, E | SS | RT | DA, A | QL | O | D, A | A | NA |
| Jacobs et al., 2008 [50] | H | T, S | L | S | RT | S, DA | QT | TPD | D | P | SS |
| Jacobson et al., 1991 [51] | H | T | NA | S | RT | I | QT | TPD | D | P | NA |
| Kermen, 2015 [52] | H | T, S | A | U | AP | I, DA | QL | O, U, TPD | D | P | NA |
| Molla & Lee, 2012 [53] | H | T, S | E, A | S | RT | I, DA | QT, QL | CI, CL | D, A | A | NA |
| Nichols, 2007 [54] | H | T, S | E | S | RT | I, DA, A | QT | U, TPD | D, A | P | NA |
| Phaikhumnam & Yuenyong, 2018 [55] | H | T, S | NA | U | AP | DA, A | QT, QL | O, TPD | D, A | A | NA |
| Procter, 2004 [56] | H | T, S | A | S | RT | I | QT, QL | O | D | P | SS |
| Ratnaningsih, 2017 [57] | H | T, S | A | SS | AP | I, DA | QL | CL | A | P | NA |
| Rozario et al., 2016 [58] | H | T, S | A, E | SS | AP | I, DA | QL | O | D | P | NA |
| Simwa & Modiba, 2015 [59] | H | T, S | E | SS | AP | I, DA | QL | CL, TPD | D, A | P | NA |
| Sibanda, 2010 [60] | H | T, S | E | U | AP | I, DA, S | QT, QL | U | A | A | T |
| Suherdi, 2017 [61] | H | T, S | A | SS | AP | I, DA | QT, QL | CL, U | D, A | P | S |
| Solomon, 1971 [62] | H | T, S | E | SS | AP | NA | QL | CL, O | D, A | F | T |
| Suppa, 2015 [63] | H | T, S | NA | S | RT | I, DA, A, S | QT | O | D | P | T, SS |
| Vantassel-Baska et al., 2003 [64] | H | T, S | E | S | RT | NA | QT | TPD, O | A | P | NA |
| Versfeld, 1998 [65] | H | T, S | E | SS | RT | I, DA | QT, QL | U, CL, TPD | A | P | NA |
| Zhang, 2016 [66] | H | T, S | E, D, A | U | AP | DA | QL | U | D | F | NA |

Notes: RQ1: Data subjects/data objects (H = external human observer, A = automated observer, T = teacher, S = student, NA = not available); Unit of analysis (A = activity, E = event/interaction, D = discourse, L = lesson, NA = not available); Observation type (S = structured, SS = semi-structured, U = unstructured, NA = not available); Coding time (RT = real time, AP = a posteriori, NA = not available); Complementary sources (S = survey, I = interview, DA = document analysis, A = assessment, NA = not available); Design of the study (QT = quantitative, QL = qualitative, NA = not available); RQ2: Aims of the study/paper (O = orchestration, CI = compare different implementations, CL = compare lesson plan and lesson enactment, U = understand the impact of the LD, TPD = support teacher professional development); RQ3: LD guides CO (D = LD guides the observation design and data collection, A = LD guides the data analysis); CO informs LD (A = recommendations to improve the design artefact, P = recommendations to improve the design process, F = support the theories of the field); RQ4: Limitations (T = time constraints, S = space constraints, SS = sample size, C = complexity).
4.1. RQ1—What is the Nature of the Observations?
The distribution of observation roles among data subjects and objects was clear and explicit in
every paper. In all cases, external human observers were in charge of the data collection and coding
(in two cases in combination with automated LA solutions, although only as a proposal [45,46]),
with both teachers and students as the common data objects (22 papers).
Although the definition of the unit of analysis is an important methodological decision in
observational studies and research in general [35,67–69], we only found an explicit reference to it in one
paper [66]. Nevertheless, looking at the description of the research methodology, we can infer that
most of the studies focused on events (directed at interaction and behavioural analysis) (14) and
activities (10).
While either structured or semi-structured observations (10 and 10 respectively) were the most
common observation types, unstructured observations were also mentioned (7). Interestingly, just
two papers conceived the option of combining the three different observational protocols [45,46].
Going one step further and looking at how the observation took place, there was an equal distribution
between real-time and a posteriori cases, but all of them followed traditional data collection (i.e., by
a human). The prevalence of a posteriori observational data collection could be closely related
to the limited resources and effort often available to carry out manual observations.
A variety of research designs were followed in the studies: 9 papers reported qualitative
methods, 6 quantitative and 8 mixed methods. Most of them combined observations with additional
data sources, including documents (16), interviews (15), assessment data (4), and surveys (3). In a
majority of cases, aside from observations, there were at least two other sources of data used (16
cases).
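To illustrate how such frequencies follow from the codification in Table 1, the sketch below tallies the study-design codes for a handful of transcribed rows; only a small, abbreviated subset of the table is included here for brevity.

```python
from collections import Counter

# A subset of Table 1 codes (reference -> study-design codes);
# QT = quantitative, QL = qualitative. Studies coded with both count as mixed.
study_designs = {
    "Adams et al., 2012": {"QT"},
    "Eradze et al., 2017": {"QT", "QL"},
    "Freedman et al., 2012": {"QL"},
    "Jacobs et al., 2008": {"QT"},
    "Molla & Lee, 2012": {"QT", "QL"},
}

counts = Counter(
    "mixed" if codes == {"QT", "QL"}
    else "quantitative" if codes == {"QT"}
    else "qualitative"
    for codes in study_designs.values()
)
print(counts)  # Counter({'quantitative': 2, 'mixed': 2, 'qualitative': 1})
```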
Most of the papers use (or consider using) additional data sources that, like the observations
themselves, were not produced automatically. This fact illustrates how demanding the integration
of (often multimodal) data can be. While MMLA solutions could be applied in a variety
of studies, quantitative and mixed-method studies that enriched event observations with additional
data sources (see, e.g., [53,54,60,65]) are potential candidates to benefit from MMLA solutions that
aid not only the systematic data gathering but also the integration and analysis of multiple data
sources.
Regarding the learning designs, the majority of papers included the artefact as a data source
where they applied document analysis to extract the design decisions. Moreover, in several studies—
e.g., [44,58,59]—the learning design was not available and was inferred a posteriori, with indirect
observations. These two situations illustrate one of the main limitations for the alignment with
learning design: LDs are not always explicit or, if they are documented, they come in different forms
(e.g., texts, graphical representations, or tables) and levels of detail [70,71]. Apart from
being time-consuming, inferring or interpreting the design decisions is error-prone and can influence
the contextualization. This problem, also mentioned by the LA community when attempting to
combine LD and LA [32,72], shows the still low adoption of digital solutions that support the LD
process (see, for example, the Integrated Learning Design Environment, ILDE: http://ilde.upf.edu) and
highlights the need for a framework on how to capture and systematise learning design data.
4.2. RQ2—What are the Purposes of Studies Connecting LD and CO?
According to the papers, the main reasons identified in the studies were: To support teacher
professional development (13), classroom orchestration (11), and reflection, e.g., understanding the
impact of the learning design (10) or comparing the design and its implementation (8). Moreover, in
many cases (13), the authors report connecting LD and CO for two or more purposes at the same
time. Therefore, linking LD and CO can be useful to cater to several research aims and teacher needs.
The fact that this synergy is mostly used to support teacher professional development can also be
explained by the wide use of classroom observations in teacher professional development and
teacher training.
4.3. RQ3—What is the Relationship between Learning Design and Classroom Observations Established at
the Methodological, Practical and Technical Level?
One of the aims of our study was to identify the theoretical contributions connecting
CO with LD. Only three papers aimed at contributing to the link between learning design and classroom
observations. Solomon, in 1971, was a pioneer in bringing together learning design and classroom
observations. In his paper [62], the author suggested a process and a model for connecting CO and
LD in order to compare planned learning activities with their actual implementation in the classroom.
In his approach, data were collected and analysed based on the LD, attending to specific foci of interest.
The approach also looks at previous lessons to obtain indicators of behavioural change and aligns them
with the input (the strategies in the lesson plan), coding student and teacher actions and learning events
by identifying the actors (according to the objectives in the lesson plan) and the output (the competencies
gained in the end). The approach also places importance on the awareness and reflection opportunities
that such observations offer, not only for teachers but also for students. Later on, Eradze et al. proposed a model
and a process for lesson observation, which were framed by the learning design. The output of the
observation is a collection of the statements represented in a computational format (xAPI) so that
they can be interpreted and analysed by learning analytics solutions [45,46]. In these papers, the
authors argue that the learning design not only guides the data gathering but also contextualises the
data analysis, contributing to a better understanding of the results.
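To make the idea of computationally represented observations more concrete, the following is a minimal sketch of what an xAPI-style observation statement could look like. All verb and activity IRIs, the context extension key, and the lesson identifier are hypothetical placeholders, not the vocabulary actually used in [45,46].

```python
import json

# A hand-written, illustrative xAPI-style statement for one observed event.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Teacher A",
        "account": {"homePage": "http://example.org", "name": "teacher-a"},
    },
    "verb": {
        "id": "http://example.org/verbs/explained",  # placeholder IRI
        "display": {"en-US": "explained"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/activities/lesson-42/task-2",  # placeholder IRI
        "definition": {
            "name": {"en-US": "Whole-class explanation (task 2 of the lesson plan)"}
        },
    },
    # Linking the statement to its learning design provides the context
    # needed to interpret the observation during analysis.
    "context": {
        "extensions": {
            "http://example.org/extensions/learning-design": "http://example.org/designs/lesson-42"
        }
    },
    "timestamp": "2018-03-15T10:02:30+00:00",
}

print(json.dumps(statement, indent=2))
```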
At the practical level, the relation established between learning design and classroom
observation was mainly one of guidance, at different degrees: the authors reported either observing
aspects related to the learning design (eight papers), using it to interpret the results of the observational
analysis (six papers) or, from the beginning, letting the learning design guide the whole observation cycle
(i.e., design, data gathering, and analysis) (10 papers). How is CO reflected in LD and in Learning Design
as a practice? In 15 cases, the final result of the synergy was recommendations for teaching and
learning practice (design for learning), in eight cases the observations aimed at informing the
LD, and three papers used CO to contribute to theory or to the field of LD in general. In other
words, while many papers used the learning design artefact, the observations contributed to informing
the (re)design process.
Additionally, from the technical perspective, it should be noted that none of the papers reported
having used specific tools to create the learning design or to support the observational design, the data
collection, or the analysis process. Nevertheless, one paper [45] presented a tool that uses the
learning design to support observers in the codification and contextualization of interaction data. The
fact that most of the papers have extracted the LD using document analysis indicates low adoption
of LD models and design tools by researchers and practitioners. Thus, there is a need for solutions
that enable users to create or import the designs that guide the contextualization of the data collection
and analysis.
4.4. RQ4—What are the Important Open Issues and Future Lines of Work?
Although most of the papers did not report limitations in connecting LD and CO (18 papers),
those that did referred to problems associated with the observation itself, such as time constraints
(difficulties annotating/coding in the time available) [62,63], space constraints (observer
mobility) [61], and sample size [50,56,63].
Furthermore, as a result of the paper analysis, we have identified different issues to be addressed
by the research community to enable the connection between LD and CO, and achieve it in more
efficient ways, namely:
4.4.1. Dependence on the Existence of Learning Design
Dependence on the LD as an artefact is one of the issues for the implementation of such a
synergy: while in this paper we assume that the learning design is available, in practice, this is not
always the case. Often, the lesson plan remains in the head of the practitioner without being
registered or formalised [32,72]. Therefore, for those cases, it would be necessary to rely on bottom-
up solutions whose goal is to infer the lesson structure from the data gathered in the learning
environment [73]. However, solutions of this type are still scarce and prototypical.
4.4.2. Compatibility with Learning Design Tools
The studies reviewed here did not report using any LD or CO tool. However, to aid the
connection between learning design and classroom observations, it is necessary to have access to a
digital representation of the artefact. Tools such as WebCollage (https://analys.gsic.uva.es/webcollage),
LePlanner (https://beta.leplanner.net) or the ILDE (Integrated Learning Design Environment,
https://ilde.upf.edu) guide users through the design process. To facilitate compatibility, it would be
recommendable to use tools that rely on widespread standards (e.g., IMS-LD – a specification that
enables modelling of learning processes) instead of proprietary formats. From the observational side,
tools such as KoboToolbox (http://analys.kobotoolbox.org), FieldNotes (http://fieldnotesapp.info), Ethos
(https://beta2.ethosapp.com), Followthehashtag (http://analys.followthehashtag.com), Storify
(https://storify.com), VideoAnt (https://ant.umn.edu), and LessonNote (http://lessonnote.com) have
been designed to support observers during the data collection. Also, in this case, for compatibility
reasons, it would be preferable to use tools that allow users to export their observations following
standards already accepted by the community (e.g., xAPI).
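As an illustration of why a digital representation of the design matters for this connection, the sketch below models a lesson plan as plain data that an observation tool could import to pre-populate its coding context. The structure is a deliberately simplified assumption, far more minimal than IMS-LD, and all names and values are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal machine-readable lesson plan carrying the context an
# observation tool needs (actors, activities, timing); not an IMS-LD document.
@dataclass
class Activity:
    name: str
    minutes: int
    roles: list[str] = field(default_factory=lambda: ["teacher", "students"])

@dataclass
class LessonDesign:
    title: str
    objectives: list[str]
    activities: list[Activity]

design = LessonDesign(
    title="Force and motion",
    objectives=["Explain Newton's second law with everyday examples"],
    activities=[
        Activity("Introduction and recap", 10),
        Activity("Group experiment", 25),
        Activity("Whole-class discussion", 10),
    ],
)

# An observation tool importing such a structure could offer the observer
# one coding context (expected actors and activities) per design phase.
for i, act in enumerate(design.activities, start=1):
    print(f"Phase {i}: {act.name} ({act.minutes} min, roles: {', '.join(act.roles)})")
```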
4.4.3. Workload and Multimodal Data Gathering
As we have seen in the reviewed papers, observation processes often require the participation
of ad-hoc observers. To alleviate the time and effort that observations entail, technological means
could be put in place, enabling teachers to gather data by themselves [74]. For example, (multimodal)
learning analytics solutions that monitor user activity and behaviour [26,73,75,76] could be used to
automate part of the data collection or to gather complementary information about what is happening
in the digital and the physical space. It is also worth noting that the inclusion of new data sources
may contribute not only to improving the quality of the analysis (by triangulating the evidence), but also
to obtaining a more realistic interpretation of the teaching and learning processes under study.
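As a minimal sketch of what such technological support could look like, the snippet below aligns manually coded observations with automatically logged digital events by timestamp. The data, column names, and the 30-second matching window are illustrative assumptions, not a reference to any tool from the reviewed studies.

```python
import pandas as pd

# Illustrative data: manual observation codes and automatically logged digital
# events, both timestamped. Values and column names are hypothetical.
observations = pd.DataFrame({
    "timestamp": pd.to_datetime(["2018-03-15 10:00:05", "2018-03-15 10:04:50"]),
    "code": ["teacher_explains", "group_work"],
})
digital_log = pd.DataFrame({
    "timestamp": pd.to_datetime(["2018-03-15 10:00:00", "2018-03-15 10:04:45"]),
    "event": ["slide_opened", "quiz_started"],
})

# merge_asof pairs each observation with the closest preceding digital event,
# provided it happened no more than 30 seconds earlier.
merged = pd.merge_asof(
    observations.sort_values("timestamp"),
    digital_log.sort_values("timestamp"),
    on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("30s"),
)
print(merged)
```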
4.4.4. Underlying Infrastructure
To the best of our knowledge, there is no tool or ecosystem that enables the whole connection
between LD and CO (i.e., creation of the learning design, observational design, data gathering,
integration, and analysis). From the reviewed literature, only one tool, Observata [45,46], could fit this
purpose. However, this tool was still under design and therefore not evaluated by the time this
review took place.
5. Conclusions
This paper reports a systematic literature review on the connection between learning design and
classroom observation, where 24 papers were the subject of analysis. These papers illustrate the
added value that the alignment between these two areas may bring, including but not limited to
teacher professional development, orchestration, institutional decision-making and educational
research in general. To cater to the needs for evidence-based teaching and learning practices, this
review contextualises classroom observations within modern data collection approaches and
practices.
Despite the reported benefits, the main findings from the papers lead us to conclude that in order
to make use of the synergies of linking LD and CO, technological infrastructure plays a crucial role.
Starting with the learning design: this information is often not explicit, and formalising it implies adding
extra tasks for the practitioners. Similarly, ad-hoc observers are in charge of data collection and
analysis. Taking into account that the unit of analysis in most cases is the event (interaction-driven) or
the activity, the workload that the observations entail might not be compatible with teaching at the
same time and, therefore, requires external support. Nevertheless, despite using multiple data sources
in their research, none of the papers reported automatic data gathering or the use of MMLA solutions
for the analysis. Thus, to enable inquiry processes where teachers and researchers can manage the
whole study, we suggest that MMLA solutions could contribute to reducing the burden by inferring
the lesson plan and by automatically gathering parts of the observation.
Moreover, to operationalise the connection between designs and observations, it will be necessary
to promote the usage of standards both in LD and in CO solutions, so that we can increase the
compatibility between platforms. This strategy could contribute to the creation of technological
ecosystems that support all the steps necessary to connect the design and the observations.
Additionally, there is a need for methodological frameworks and tools that guide the data gathering
and integration, so that the learning design is taken into consideration not only to frame the data
analysis but also to inform the observational design. Furthermore, this paper mainly illustrates the
benefits that LD and CO synergies may bring to researchers focusing on educational research, but
more development would be needed for teacher adoption and teaching practice.
Finally, coming back to the research methodology of this paper, our study presents a number of
limitations: first, restricting the search to the title, abstract, or keywords may have caused the
exclusion of valuable contributions; and second, the lack of explicit descriptions of
the LD and CO processes/artefacts in the papers may have caused deviations in the codifications.
Nevertheless, the analysis of the collected papers still illustrates the synergies and challenges of this
promising tandem of learning design and classroom observation.
Author contributions: conceptualization, methodology, data curation, formal analysis, investigation,
visualization, writing—original draft, M.E.; Supervision, formal analysis, visualization, writing—
review and editing, M.J.R.-T.; Supervision, writing—review and editing, M.L.
Funding: This research was funded by the European Union’s Horizon 2020 Research and Innovation Programme
under Grant Agreement No. 731685 (Project CEITER).
Acknowledgements: The authors thank Jairo Rodríguez-Medina for inspiration and helpful advice.
Conflicts of Interest: The authors declare no conflict of interest. Ethics approval was not required. All available
data is provided in the paper.
References
1. Goodyear, P.; Dimitriadis, Y. In medias res: Reframing design for learning. Res. Learn. Technol. 2013, 21,
doi:10.3402/rlt.v21i0.19909.
2. Laurillard, D. Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology;
Routledge: New York, NY, USA, 2013; ISBN 1136448209.
3. Mor, Y.; Craft, B.; Maina, M. Introduction—Learning Design: Definitions, Current Issues and Grand
Challenges. In The Art & Science of Learning Design; Sense Publishers: Rotterdam, The Netherlands, 2015;
pp. 9–26.
4. Hernández-Leo, D.; Rodriguez Triana, M.J.; Inventado, P.S.; Mor, Y. Preface: Connecting Learning Design
and Learning Analytics. Interact. Des. Archit. J. 2017, 33, 3–8.
5. Hernández-Leo, D.; Martinez-Maldonado, R.; Pardo, A.; Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J.
Analytics for learning design: A layered framework and tools. Br. J. Educ. Technol. 2019, 50, 139–152.
6. Wragg, T. An Introduction to Classroom Observation (Classic Edition); Routledge: Abingdon, UK, 2013; ISBN
1136597786.
7. Cohen, L.; Manion, L.; Morrison, K. Research Methods in Education; Routledge: Abingdon, UK, 2013; ISBN
113572203X.
8. Hartmann, D.P.; Wood, D.D. Observational Methods. In International Handbook of Behavior Modification and
Therapy: Second Edition; Bellack, A.S., Hersen, M., Kazdin, A.E., Eds.; Springer US: Boston, MA, USA, 1990;
pp. 107–138, ISBN 978-1-4613-0523-1.
9. Blikstein, P. Multimodal learning analytics. In Proceedings of the Third International Conference on
Learning Analytics and Knowledge (LAK ’13), Leuven, Belgium, 8–13 April 2013; ACM Press: New York,
NY, USA, 2013; p. 102.
10. Eradze, M.; Rodríguez-Triana, M.J.; Laanpere, M. How to Aggregate Lesson Observation Data into
Learning Analytics Datasets? In Proceedings of the 6th Multimodal Learning Analytics Workshop
(MMLA), Vancouver, BC, Canada, 14 March 2017; CEUR: Aachen, Germany, 2017; Volume 1828, pp. 1–8.
11. Martínez, A.; Dimitriadis, Y.; Rubia, B.; Gómez, E.; De la Fuente, P. Combining qualitative evaluation and
social network analysis for the study of classroom social interactions. Comput. Educ. 2003, 41, 353–368.
12. Cameron, L. How learning design can illuminate teaching practice. In Proceedings of The Future of
Learning Design Conference, Wollongong, Australia, 10 December 2009; Faculty of Education, University
of Wollongong: Wollongong, NSW, Australia, 2009.
13. Dobozy, E. Typologies of Learning Design and the introduction of a “LD-Type 2” case example. eLearn. Pap.
2011, 27, 1–11.
14. Law, N.; Li, L.; Herrera, L.F.; Chan, A.; Pong, T.-C. A pattern language based learning design studio for an
analytics informed inter-professional design community. Interac. Des. Archit. 2017, 33, 92–112.
15. Conole, G. Designing for Learning in an Open World; Springer Science & Business Media: Berlin, Germany,
2012; Volume 4, ISBN 1441985166.
16. Mor, Y.; Craft, B. Learning design: Reflections upon the current landscape. Res. Learn. Technol. 2012, 20,
19196.
17. Dalziel, J. Reflections on the Art and Science of Learning Design and the Larnaca Declaration. In The Art
and Science of Learning Design; Sense Publishers: Rotterdam, The Netherlands, 2015; pp. 3–14, ISBN
9789463001038.
18. Jonassen, D.; Spector, M.J.; Driscoll, M.; Merrill, M.D.; Van Merrienboer, J. Handbook of Research on
Educational Communications and Technology: A Project of the Association for Educational Communications and
Technology; Routledge: Abingdon, UK, 2008; ISBN 9781135596910.
19. Muñoz-Cristóbal, J.A.; Hernández-Leo, D.; Carvalho, L.; Martinez-Maldonado, R.; Thompson, K.; Wardak,
D.; Goodyear, P. 4FAD: A framework for mapping the evolution of artefacts in the learning design process.
Australas. J. Educ. Technol. 2018, 34, 16–34.
20. Mor, Y.; Ferguson, R.; Wasson, B. Editorial: Learning design, teacher inquiry into student learning and
learning analytics: A call for action. Br. J. Educ. Technol. 2015, 46, 221–229.
21. Hennessy, S.; Bowker, A.; Dawes, M.; Deaney, R. Teacher-Led Professional Development Using a Multimedia
Resource to Stimulate Change in Mathematics Teaching; SensePublishers: Rotterdam, The Netherlands, 2014;
ISBN 9789462094345.
22. Rienties, B.; Toetenel, L. The Impact of Learning Design on Student Behaviour, Satisfaction and
Performance. Comput. Hum. Behav. 2016, 60, 333–341.
23. Ertmer, P.A.; Parisio, M.L.; Wardak, D. The Practice of Educational/Instructional Design. In Handbook of
Design in Educational Technology; Routledge: New York, NY, USA, 2013; pp. 5–19.
24. Moses, S. Language Teaching Awareness. J. Engl. Linguist. 2001, 29, 285–288.
25. Marshall, C.; Rossman, G.B. Designing Qualitative Research; Sage Publications: Rotterdam, The Netherlands,
2014; ISBN 1483324265.
26. Ochoa, X.; Worsley, M. Augmenting Learning Analytics with Multimodal Sensory Data. J. Learn. Anal. 2016,
3, 213–219.
27. James, A.; Kashyap, M.; Victoria Chua, Y.H.; Maszczyk, T.; Nunez, A.M.; Bull, R.; Dauwels, J. Inferring the
Climate in Classrooms from Audio and Video Recordings: A Machine Learning Approach. In Proceedings
of the 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE),
Wollongong, NSW, Australia, 4–7 December 2018; pp. 983–988.
28. Howard, S.K.; Yang, J.; Ma, J.; Ritz, C.; Zhao, J.; Wynne, K. Using Data Mining and Machine Learning
Approaches to Observe Technology-Enhanced Learning. In Proceedings of the 2018 IEEE International
Conference on Teaching, Assessment, and Learning for Engineering (TALE), Wollongong, NSW, Australia,
4–7 December 2018; pp. 788–793.
29. Cukurova, M.; Luckin, R.; Mavrikis, M.; Millán, E. Machine and Human Observable Differences in Groups’
Collaborative Problem-Solving Behaviours. In Proceedings of the Data Driven Approaches in Digital Education:
EC-TEL 2017; Lavoué, É., Drachsler, H., Verbert, K., Broisin, J., Pérez-Sanagustín, M., Eds.; Springer
International Publishing: Cham, Switzerland, 2017; Volume 10474, pp. 17–29.
30. Anguera, M.T.; Portell, M.; Chacón-Moscoso, S.; Sanduvete-Chaves, S. Indirect observation in everyday
contexts: Concepts and methodological guidelines within a mixed methods framework. Front. Psychol. 2018,
9, 13.
31. Rodríguez-Triana, M.J.; Vozniuk, A.; Holzer, A.; Gillet, D.; Prieto, L.P.; Boroujeni, M.S.; Schwendimann,
B.A. Monitoring, awareness and reflection in blended technology enhanced learning: A systematic review.
Int. J. Technol. Enhanc. Learn. 2017, 9, 126–150.
32. Lockyer, L.; Heathcote, E.; Dawson, S. Informing pedagogical action: Aligning learning analytics with
learning design. Am. Behav. Sci. 2013, 57, 1439–1459.
33. Gruba, P.; Cárdenas-Claros, M.S.; Suvorov, R.; Rick, K. Blended Language Program Evaluation; Palgrave
Macmillan: London, UK, 2016; ISBN 978-1-349-70304-3 978-1-137-51437-0.
34. Pardo, A.; Ellis, R.A.; Calvo, R.A. Combining observational and experiential data to inform the redesign of
learning activities. In Proceedings of the Fifth International Conference on Learning Analytics And
Knowledge (LAK ’15), Poughkeepsie, NY, USA, 16–20 March 2015; ACM: New York, NY, USA, 2015; pp.
305–309.
35. Bakeman, R.; Gottman, J.M. Observing Interaction; Cambridge University Press: Cambridge, UK, 1997; ISBN
9780511527685.
36. Rodríguez-Triana, M.J.; Martínez-Monés, A.; Asensio-Pérez, J.I.; Dimitriadis, Y. Scripting and monitoring
meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL
situations. Br. J. Educ. Technol. 2015, 46, 330–343.
37. Soller, A.; Martınez, A.; Jermann, P.; Muehlenbrock, M. From Mirroring to Guiding: A review of the state
of the art in interaction analysis. Int. J. Artif. Intell. Educ. 2005, 15, 261–290.
38. Corrin, L.; Lockyer, L.; Corrin, L.; Mulder, R.; Williams, D.; Dawson, S. A Conceptual Framework linking
Learning Design with Learning Analytics Learning Analytics. In Proceedings of the Sixth International
Conference on Learning Analytics & Knowledge, Edinburgh, UK, 25–29 April 2016; pp. 329–338.
39. Rodríguez-Triana, M.J.; Prieto, L.P.; Martínez-Monés, A.; Asensio-Pérez, J.I.; Dimitriadis, Y. The teacher in
the loop. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge
(LAK ’18), Sydney, NSW, Australia, 7–9 March 2018; ACM: New York, NY, USA, 2018; pp. 417–426.
40. Kitchenham, B. Procedures for Performing Systematic Reviews; Keele University: Keele, UK, 2004.
41. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.A.; Clarke, M.; Devereaux,
P.J.; Kleijnen, J.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of
studies that evaluate health care interventions: explanation and elaboration. J. Clin. Epidemiol. 2009, 62,
31–34.
42. Elo, S.; Kyngäs, H. The qualitative content analysis process. J. Adv. Nurs. 2008, 62, 107–115.
43. Adams, C.M.; Pierce, R.L. Differentiated Classroom Observation Scale—Short Form. In Proceedings of
the 22nd Annual Conference of the European Teacher Education Network, Coimbra,
Portugal, 19–21 April 2012; pp. 108–119.
44. Anderson, J. Affordance, learning opportunities, and the lesson plan pro forma. ELT J. 2015, 69, 228–238.
45. Eradze, M.; Laanpere, M. Lesson Observation Data in Learning Analytics Datasets: Observata; Springer: Berlin,
Germany, 2017; Volume 10474, ISBN 9783319666099.
46. Eradze, M.; Rodríguez-Triana, M.J.; Laanpere, M. Semantically Annotated Lesson Observation Data in
Learning Analytics Datasets: a Reference Model. Interact. Des. Archit. J. 2017, 33, 75–91.
47. Freedman, A.M.; Echt, K. V.; Cooper, H.L.F.; Miner, K.R.; Parker, R. Better Learning Through Instructional
Science: A Health Literacy Case Study in “How to Teach So Learners Can Learn”. Health Promot. Pract. 2012,
13, 648–656.
48. Ghazali, M.; Othman, A.R.; Alias, R.; Saleh, F. Development of teaching models for effective teaching of
number sense in the Malaysian primary schools. Procedia Soc. Behav. Sci. 2010, 8, 344–350.
49. Hernández, M.I.; Couso, D.; Pintó, R. Analyzing Students’ Learning Progressions Throughout a Teaching
Sequence on Acoustic Properties of Materials with a Model-Based Inquiry Approach. J. Sci. Educ. Technol.
2015, 24, 356–377.
50. Jacobs, C.L.; Martin, S.N.; Otieno, T.C. A science lesson plan analysis instrument for formative and
summative program evaluation of a teacher education program. Sci. Educ. 2008, 92, 1096–1126.
51. Jacobson, L.; Hafner, L.P. Using Interactive Videodisc Technology to Enhance Assessor Training. In
Proceedings of the Annual Conference of the International Personnel Management Association Assessment
Council, Chicago, IL, USA, 23–27 June 1991.
52. Kermen, I. Studying the Activity of Two French Chemistry Teachers to Infer their Pedagogical Content
Knowledge and their Pedagogical Knowledge. In Understanding Science Teachers’ Professional Knowledge
Growth; Grangeat, M., Ed.; SensePublishers: Rotterdam, The Netherlands, 2015; pp. 89–115, ISBN 978-94-
6300-313-1.
53. Molla, A.S.; Lee, Y.-J. How Much Variation Is Acceptable in Adapting a Curriculum? Nova Science Publishers:
Hauppauge, NY, USA, 2012; Volume 6; ISBN 978-1-60876-389-4.
54. Nichols, W.D.; Young, C.A.; Rickelman, R.J. Improving middle school professional development by
examining middle school teachers’ application of literacy strategies and instructional design. Read. Psychol.
2007, 28, 97–130.
55. Phaikhumnam, W.; Yuenyong, C. Improving the primary school science learning unit about force and
motion through lesson study. AIP Conf. Proc. 2018, 1923, 30037.
56. Yuval, L.; Procter, E.; Korabik, K.; Palmer, J. Evaluation Report on the Universal Instructional Design
Project at the University of Guelph. 2004. Available online:
https://opened.uoguelph.ca/instructor-resources/resources/uid-summaryfinalrep.pdf
(accessed on 24 April 2019).
57. Ratnaningsih, S. Scientific Approach of 2013 Curriculum: Teachers’ Implementation in English Language
Teaching. J. Engl. Educ. 2017, 6, 33–40.
58. Rozario, R.; Ortlieb, E.; Rennie, J. Interactivity and Mobile Technologies: An Activity Theory Perspective.
In Lecture Notes in Educational Technology; Springer: Berlin, Germany, 2016; pp. 63–82, ISBN 978-981-10-
0027-0; 978-981-10-0025-6.
59. Simwa, K.L.; Modiba, M. Interrogating the lesson plan in a pre-service methods course: Evidence from a
University in Kenya. Aust. J. Teach. Educ. 2015, 40, 12–34.
60. Sibanda, J. The nexus between direct reading instruction, reading theoretical perspectives, and pedagogical
practices of University of Swaziland Bachelor of Education students. RELC J. 2010, 41, 149–164.
61. Suherdi, I.S.N. and D. Scientific Approach: An English Learning-Teaching (ELT) Approach in the 2013
Curriculum. J. Engl. Educ. 2017, 5, 112–119.
62. Solomon, W.H. Participant Observation and Lesson Plan Analysis: Implications for Curriculum and
Instructional Research. 1971. Available online: https://files.eric.ed.gov/fulltext/ED049189.pdf (accessed on
24 April 2019).
63. Suppa, A. English Language Teachers’ Beliefs and Practices of Using Continuous Assessment: Preparatory Schools
in IIu Abba Bora Zone in Focus; Jimma University: Jimma, Ethiopia, 2015.
64. Vantassel-Baska, J.; Avery, L.; Struck, J.; Feng, A.; Bracken, B.; Drummond, D.; Stambaugh, T. The William
and Mary Classroom Observation Scales Revised. The College of William and Mary School of Education
Center for Gifted Education. 2003. Available online: https://education.wm.edu/centers/cfge/_documents/
research/athena/cosrform.pdf (accessed on 24 April 2019).
65. Versfeld, R. Investigating and Establishing Best Practices in the Teaching of English as a
Second Language in Under-Resourced and Multilingual Contexts. Teaching and Learning Resources
Centre University of Cape Town. Report. 1998. Available online:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.569.115&rep=rep1&type=pdf (accessed on 24
April 2019)
66. Zhang, Y. Multimodal teacher input and science learning in a middle school sheltered classroom. J. Res. Sci.
Teach. 2016, 53, 7–30.
67. Matusov, E. In search of “the appropriate” unit of analysis for sociocultural research. Cult. Psychol. 2007,
13, 307–333.
68. Rodríguez-Medina, J.; Rodríguez-Triana, M.J.; Eradze, M.; García-Sastre, S. Observational Scaffolding for
Learning Analytics: A Methodological Proposal. In Lecture Notes in Computer Science; Springer: Berlin,
Germany, 2018; Volume 11082, pp. 617–621.
69. Eradze, M.; Väljataga, T.; Laanpere, M. Observing the Use of E-Textbooks in the Classroom: Towards
“Offline” Learning Analytics. In Lecture Notes in Computer Science; Springer: Berlin, Germany, 2014; Volume
8699, pp. 254–263.
70. Falconer, I.; Finlay, J.; Fincher, S. Representing practice: practice models, patterns, bundles… Learn. Media
Technol. 2011, 36, 101–127.
71. Agostinho, S.; Harper, B.M.; Oliver, R.; Hedberg, J.; Wills, S. A Visual Learning Design Representation to
Facilitate Dissemination and Reuse of Innovative Pedagogical Strategies in University Teaching. In
Handbook of Visual Languages for Instructional Design: Theories and Practices; Information Science Reference:
Hershey, PA, USA, 2008; ISBN 9781599047294.
72. Mangaroska, K.; Giannakos, M. Learning Analytics for Learning Design: Towards Evidence-Driven
Decisions to Enhance Learning. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in
Artificial Intelligence and Lecture Notes in Bioinformatics); Lavoué, É., Drachsler, H., Verbert, K., Broisin, J.,
Pérez-Sanagustín, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; Volume 10474, pp.
428–433, ISBN 9783319666099.
73. Prieto, L.P.; Sharma, K.; Kidziński, Ł.; Rodríguez-Triana, M.J.; Dillenbourg, P. Multimodal teaching analytics:
Automated extraction of orchestration graphs from wearable sensor data. J. Comput. Assist. Learn. 2018, 34,
193–203.
74. Saar, M.; Prieto, L.P.; Rodríguez-Triana, M.J.; Kusmin, M. Personalized, teacher-driven in-action data
collection: technology design principles. In Proceedings of the 18th IEEE International Conference on
Advanced Learning Technologies (ICALT), Mumbai, India, 9–13 July 2018.
75. Merenti-Välimäki, H.L.; Lönnqvist, J.; Laininen, P. Present weather: Comparing human observations and
one type of automated sensor. Meteorol. Appl. 2001, 8, 491–496.
76. Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J.; Gallego-Lema, V.; Arribas-Cubero, H.F.; Asensio-Pérez, J.I.;
Martínez-Monés, A. Monitoring for Awareness and Reflection in Ubiquitous Learning Environments. Int.
J. Hum. Comput. Interact. 2018, 34, 146–165.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
... Comparison between Traditional Learning and Active Learning Traditional Learning, Its Advantages and Disadvantages and How It Affects Learning Space Design Traditional teaching refers to the teacher-centred teaching approach that places a focus on providing students with information in a one-way manner (Table 1). For the design of a traditional classroom, a teacher stand is usually placed in the front and long tables for students face the front stand, which forms an authoritative position which means that teachers can monitor and ensure their students' seats remain stationary and arranged in rows [38]. As pupils take a passive role in absorbing knowledge and information, this setup is discovered to be more ideal for the memorisation of facts and coursework with a theoretical focus. ...
... Traditional Learning Benefits of Traditional Learning Direct information from teacher [11,38] Timesaving (group discussion may waste time) [11,12] Allows more time for Q&A [11,12] Teaching conducted in an orderly manner [11,12] Understanding of the subject matter [11,12] ...
... Classroom Facilities Types of facilities required in classroom IT/AV provisions [12,38,45] Large monitors for presentation [12,38] ...
Article
Full-text available
Active learning has been increasingly important in tertiary education in recent years due to its powerfully favourable impact on students’ learning attitudes and efficacy. Indeed, the way that a classroom is set up has a direct impact on how well students learn and how well teachers teach. The continuous evaluation of students’ learning performance is essential for guiding future classroom renovations and creating a cutting-edge learning environment for both students and teachers. The aims of this paper were to provide a better understanding of the latest development trend of learning mode preference in tertiary education and to investigate any underlying similarities and differences in the perceptions between teachers and students. To support both teaching and learning, an empirical questionnaire survey was conducted among teachers and students in Hong Kong to assess the effectiveness of various active learning techniques and passive learning techniques adopted in tertiary education. Opinion-based data were collected on the perceived benefits and disadvantages of both learning techniques as well as the importance of various classroom design features. To determine the significance of the variations in opinions between teachers and students on the survey responses, descriptive statistical analyses using the mean score and Mann–Whitney U-test were carried out. The results of the Mann–Whitney U-test on the advantages of traditional learning showed that the following variables significantly varied: ‘direct information from the teacher’; ‘timesaving (group discussion may waste time)’ and ‘allow more time for Q&A’. These advantages were generally rated higher from the viewpoint of students rather than teachers. However, no significant difference was established concerning the limitations of traditional learning. The findings of this study can help teachers and instructors to understand how different teaching and learning methods affect students’ ability to learn effectively, which can ultimately help institutional policymakers to determine the necessary essential requirements for orchestrating classroom designs to create more conducive teaching and learning environments. The findings also aim to inform policymakers and educational institutions on the impact of pedagogical change on the fundamental design requirements for a flexible classroom environment supportive of students’ active learning, especially in tertiary education.
... On the contrary, while reducing expressivity, systematic (structured) observations allow for more efficient analysis and data processing [21]. Therefore, systematic observations are especially suitable to be combined with digital traces, enriching each other to understand learning processes and contexts with the help of multimodal learning analytics [22]. ...
... Considering the aforementioned information, based on the lessons learned from previous studies [12] [22], we have proposed the Context-aware Multimodal Learning Analytics Taxonomy (Fig. 1) [34]. The taxonomy classifies different research designs depending on how systematic the documentation of the learning design and the data collection have been: Ideal -Systematic documentation and data collection: In the most desirable case, the learning design (including actors, roles, resources, activities, timeline, and learn-ing objectives) is set up-front and documented in an authoring tool (e.g., LePlanner 1 or WebCollage 2 ). ...
... Automated and human-mediated data bring different semantics and meaning to the datasets. Each level of the taxonomy suits different types of research designs [22]; for example, highly structured observations based on predefined coding can contribute to confirmatory research and to the creation of a hypothesis space through the labelling of learning constructs within MMLA, as indicated by other researchers [32]. Overall, based on the feedback of the users in ideal, authentic, or limited scenarios of data collection and analysis, the benefit of contextualisation for data analysis and sense-making is evident. ...
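The snippets above name three scenario levels (ideal, authentic, limited) depending on how systematically the learning design is documented and the data are collected. As a rough illustration only, the following sketch encodes one possible reading of that classification; the two boolean dimensions and the decision rules are my assumption for illustration, not the authors' published definition of the taxonomy.

```python
# Minimal sketch of the scenario levels named in the snippets above
# ("ideal", "authentic", "limited"). The decision rules are an assumed
# simplification, not the authors' published definition.
from enum import Enum


class Scenario(Enum):
    IDEAL = "ideal"          # machine-readable LD + systematic data collection
    AUTHENTIC = "authentic"  # only one of the two dimensions is systematic
    LIMITED = "limited"      # neither dimension is systematic


def classify(ld_documented: bool, collection_systematic: bool) -> Scenario:
    """Classify a research design by how systematically the learning design
    was documented and the (multimodal) data were collected."""
    if ld_documented and collection_systematic:
        return Scenario.IDEAL
    if ld_documented or collection_systematic:
        return Scenario.AUTHENTIC
    return Scenario.LIMITED


# Example: LD authored in a tool (e.g., LePlanner), but ad-hoc observations
print(classify(ld_documented=True, collection_systematic=False))  # Scenario.AUTHENTIC
```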
Article
Full-text available
Educational processes take place in physical and digital places. To analyse educational processes, Learning Analytics (LA) enable data collection from the digital learning context. At the same time, to gain more insights, the LA data can be complemented with data coming from physical spaces, enabling Multimodal Learning Analytics (MMLA). To interpret these data, theoretical grounding or contextual information is needed. Learning designs (LDs) can be used for contextualisation; however, in authentic scenarios the availability of machine-readable LDs is scarce. We argue that Classroom Observations (COs), traditionally used to understand educational processes taking place in physical space, can provide the missing context and complement the data from co-located classrooms. This paper reports on a co-design case study from an authentic scenario that used CO to make sense of the digital traces. In this paper we posit that the development of MMLA approaches can benefit from co-design methodologies; through the involvement of the end-users (project managers) in the loop, we illustrate how these data sources can be systematically integrated and analysed to better understand the use of digital resources. Results indicate that CO can drive sense-making of LA data where a predefined LD is not available. Furthermore, CO can support layered contextualisation depending on research design, rigour, and systematic documentation and data-collection efforts. Also, co-designing the MMLA solution with the end-users proved to be a useful approach.
... Their research specifically examined space-related issues in higher education, including learning space planning, campus planning, and construction design. In a separate study, Eradze et al. (2019) conducted a systematic review aiming to understand the potential synergies between learning design (LD) and classroom observation (CO). Their findings suggested that the utilization of ICT tools to support the design process could enhance the extraction of contextual information from observations and facilitate the analysis within a pedagogical framework. ...
... The findings reveal several environments (physical and digital) and methods (teacher-led, student-centered, active learning, collaborative, and technology-enhanced approaches). These findings complement previous reviews (Eradze et al., 2019; Zainuddin et al., 2020; Martínez-Ramos et al., 2021; Sara et al., 2021; Suraini & Aziz, 2023) by mapping the learning environments and approaches in support of the design of future classrooms. The implications of this rapid review involve both theoretical and practical areas. ...
Chapter
Full-text available
The design of future classrooms has become a topic of great priority as technology and pedagogy continue to evolve. However, there is a lack of consensus on the different designs of future classrooms that can support diverse learning needs and styles. Therefore, this rapid review explores the various learning space designs and learning methods, aiming to provide insights into the development of effective learning environments that promote student engagement and success. The methodology section presents the search strategy, selection criteria, and data extraction and analysis. Key findings indicate various physical (e.g. library, laboratory) and digital learning environments (e.g. social media platforms), and several teaching and learning approaches, such as teacher-led, student-centered, collaborative, or technology-enhanced. This chapter presents key environments and methods, and their potential impact on future classroom design. The study offers a significant map for both scholars and practitioners to overview key elements in the design of future learning spaces and methods.
... (Baepler et al., 2014) / It contains elements that can motivate the student (Castro et al., 2021); Student participation is limited (Chan et al., 2023) / Student participation is encouraged (Yow, 2022); Individual learning is underrated (Sampson & Karagiannidis, 2002) / Individual learning is deemed important (Li, 2022); Attention-grabbing elements are limited (Alden Rivers et al., 2015) / The learning environment is rich and comprehensive (Vera et al., 2024); Critical thinking skills remain weak (Baepler et al., 2014) / Higher-order thinking skills are involved (Hu & Hwang, 2024); Learning is limited to a fixed classroom organisation (Eradze et al., 2019) / Authentic learning takes place (Lo Turco et al., 2019) ... have been integrated with virtual museums (Uslu, 2008). Figure 1 illustrates this integration in different areas. ...
Article
Full-text available
This study aims to reveal pre-service teachers’ experience in virtual museum design that they can use in social studies teaching, and their opinions on virtual museum applications. In line with this purpose, phenomenology design was used as one of the qualitative research approaches. Selected by the criterion sampling method, the study sample consisted of a total of 15 pre-service social studies teachers (9 female, 6 male) who were studying in year 4 at the Department of Social Studies Education of a State University in the 2021/22 academic year. During the 9-week virtual museum design process, virtual museums on “epidemics, women’s rights, population, environmental problems, climate, human rights, and migration” were designed through the Artsteps application. The study was executed in a dynamic manner in co-operation and interaction with pre-service teachers based on the principles of design, implementation and evaluation. A semi-structured interview form was used as a data collection tool to determine the opinions of pre-service teachers about virtual museums and the use of virtual museums in social studies teaching. The data was analysed by content analysis. The results revealed that the virtual museum design process positively affected the views of pre-service teachers and that virtual museums are very effective and applicable tools in social studies teaching. This study suggests that virtual museums be used in social studies courses since they offer rich content to achieve meaningful learning in social studies courses owing to easy accessibility, and that future studies focus on examining the effects of popularizing virtual museums designed with gamification and guided content.
... A second limitation of the study is that the research design did not include a pre- and post-test control group, which would have allowed safer conclusions. In addition, the teacher's systematic real-time silent observation was not based on a structured instrument that promotes class observation or on organized observational protocols (Eradze et al., 2019; van der Mars et al., 2018). ...
Article
Full-text available
This study reports the results obtained from the implementation of an educational program for primary education aimed at developing or enhancing students' learning motivation. The project's main activity was the publication of a school journal by 24 sixth-grade students at a public primary school in Greece and their teacher. The students worked in a communicative and collaborative environment and were encouraged to actively participate in all stages of writing and publishing the magazine. The study describes the pedagogical approach followed by the teacher and the main topics of the journal. The methods used to collect and analyse the qualitative and quantitative data of the research are also described. In relation to the quantitative data, the students' motivation index (SMI) was calculated before and after the educational project. The results of the quantitative analysis showed that, after the implementation of the program, the SMI had increased, and the results were statistically significant for both boys and girls. The qualitative data collected agreed with the quantitative results.
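The abstract reports a pre/post increase in the SMI but does not name the statistical test used. As a hedged illustration with entirely synthetic scores for 24 students, the sketch below shows two common paired-sample choices for such a design; neither is claimed to be the study's actual method.

```python
# Minimal sketch (synthetic data): pre/post comparison of a motivation index
# for 24 students. The abstract does not name the test used; a paired t-test
# and a Wilcoxon signed-rank test are shown as plausible choices only.
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

rng = np.random.default_rng(42)
pre_smi = rng.normal(3.2, 0.5, size=24)             # hypothetical pre-project SMI
post_smi = pre_smi + rng.normal(0.4, 0.3, size=24)  # hypothetical post-project gain

# Paired t-test assumes roughly normal differences; Wilcoxon relaxes that
# assumption, which matters with a small sample like N = 24.
t_stat, p_t = ttest_rel(post_smi, pre_smi)
w_stat, p_w = wilcoxon(post_smi, pre_smi)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
print(f"Wilcoxon:      W = {w_stat:.1f}, p = {p_w:.4f}")
```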
... Classroom observation allows us to identify pedagogical practices that contribute to student learning and support teamwork, progressively improving the relevant teaching practices. In other words, it aims to support teaching and learning practices in the instructional process (Eradze et al., 2019). It analyses the characteristics of the performance of teachers and their students in the real context in which the educational process takes place, avoiding the need to make inferences about what actually happens in classrooms (O'Leary, 2020). ...
Article
Full-text available
The need to increase educational quality has led public policymakers to create and implement strategies for improving teachers' skills. One such strategy, adapted in Chile, is the classroom accompaniment program, which has become a case of teacher professional development. The present study primarily seeks to understand public schoolteachers' perceptions of the classroom pedagogical accompaniment program (CPAP) and, at the same time, its effectiveness. This qualitative research is a case study framed within an interpretive paradigm, in which semi-structured interviews were conducted to collect data. A content analysis was conducted with 4 categories and 10 subcategories of perceptions attributed to the program and its effectiveness by 13 teachers (8 female and 5 male) belonging to four public educational establishments. The results show opposing perceptions of the existing accompaniment program. On one hand, some teachers rate it positively and consider it beneficial for themselves and their students, who also received adequate feedback. On the other hand, another group of teachers considered that the program made no positive contribution to their work performance, with impacts including greater reticence during the in-class observation process. Thus, the study concludes that the initial orientation and instruction regarding the role of the observing teacher are fundamental for the classroom accompaniment process to be effective, and that it can be a valuable tool for improving teacher performance.
... Otherwise, data use is often neglected (similarly to automatically collected data), even when the data have proven to be meaningful for the teacher and help to make informed decisions. The main constraints hindering classroom data collection are still shortage of time and intense workload (also indicated by Dillenbourg & Jermann, 2010;Prieto et al., 2020), therefore the tools have to be easy to set up and use, not require much additional effort (to use or analyse the data) and help to collect data that match the teacher's needs (e.g., classroom activity patterns and student feedback) (similarly voiced by Eradze et al., 2019). ...
Article
Full-text available
Research indicates that data-informed practice helps teachers change their teaching and promotes teacher professional development (TPD). Although educational data are often collected from digital spaces, in-action evidence from physical spaces is seldom gathered, providing an incomplete view of the classroom reality. Also, most learning analytics tools focus on learners and do not explicitly collect or analyse teaching data. To support teacher-led inquiries in TPD, the authors' Design-Based Research explores the feasibility and effects of teachers actively collecting, with the help of technology, data about their classroom practice, and the possible impact of such data on their own teaching. Based on an online survey (N = 94), prior research literature, and feedback from teachers (N = 11), the authors demonstrate the feasibility of such data collection and suggest design principles for classroom data-collection tools: besides usability and ease of use, they also detected interest in customisation, in triggering teacher interest, and in the inclusion of teaching data.
... Traditional teaching (TL) refers to a teacher-centred teaching mode concerned primarily with one-way delivery of information. In the design of a traditional classroom, a teacher's stand is usually placed at the front, and long student tables face it, giving the teacher an authoritative position from which to monitor and control the class; students' seats remain stationary and arranged in rows (Eradze et al., 2019). This layout is more suitable for memorising facts and for theory-based coursework, as students take the passive role of receiving knowledge and information. ...
Conference Paper
Full-text available
Active learning has played an important role in recent years, as it has a highly positive effect on improving students' learning attitudes and success. Indeed, classroom design is a determinative factor in the performance of learning and teaching. Continuous assessment of learning performance is crucial, as it can indicate the direction of future classroom renovation to provide an advanced learning and teaching environment for students and teachers. Hence, this paper aims to provide a better understanding of the latest trend in learning mode preference. The characteristics of, and reasons for choosing, active and traditional learning were examined from the students' perspectives using questionnaire surveys. Moreover, this paper focuses on the effects of the pedagogical transition on design factors and design criteria for a flexible classroom. The analysis of the responses collected from the students shows interesting findings useful for adopting the active learning approach and for classroom design. Most of the factors were rated very important by the students, which shows that the various factors should be considered by teachers and administrators in the delivery of teaching instruction and in classroom design, respectively. A different mix of traditional and active learning approaches might be suitable, depending on the subject's intended learning outcomes. A traditional approach facilitates more orderly learning, although most of the instruction comes from the teacher, while active learning is more collaborative and engaging for the students. Practical implications of the study are also discussed.
Article
Full-text available
Instructional design is considered as a component of the professional competences of vocational-training teachers, and the importance of this component in preparing future teachers for vocational education institutions is discussed. A brief analysis is provided of scholars' approaches to interpreting instructional design and of its role and place in the modern educational process, particularly where information and communication technologies are applied. The results of a survey of vocational-school teachers are presented regarding the application of instructional design as a promising direction that enables effective online and blended learning. It was found that the current state of preparation of future vocational-school teachers for applying instructional design in the educational process is ineffective. A theoretical substantiation is given of future vocational-school teachers' competence in instructional design, defined by the unity of cognitive and procedural components. The criteria for the formation of these components are the ability to build an effective educational process based on learning objectives, learning material, and the modern tools available in the information-educational environment, as well as the ability to ensure effective psychological-pedagogical interaction with learners. The potential of the course 'Pedagogical Mastery' and of pre-graduation practice for acquiring instructional design competence was identified and experimentally confirmed. During the experiment, in which master's-level students of the speciality 'Vocational Education (by specialisation)' participated, positive changes occurred in the formation of the cognitive and procedural components of the competence. This confirms the expediency of further using the opportunities of pedagogical mastery and pre-graduation practice for future vocational-school teachers to acquire instructional design competence.
Article
Full-text available
Research on instructional and learning design is 'booming' in Europe, although there has been a move from a focus on content and the way to present it in a formal educational context (i.e., instruction), to a focus on complex learning, learning environments including the workplace, and access to learner data available in these environments. We even see the term 'learning experience design' (Neelen and Kirschner 2020) being used to describe the field. Furthermore, there is an effort to empower teachers (and even students) as designers of learning (including environments and new pedagogies), and to support their reflection on their own practice as part of their professional development (Hansen and Wasson 2016; Luckin et al. 2016; Wasson et al. 2016). While instructional design is an often-heard term in the United States and refers to "translating principles of learning and instruction into plans for instructional materials, activities, information resources, and evaluation" (Smith and Ragan 1999), Europe tends to lean more towards learning design as the key to providing efficient, effective, and enjoyable learning experiences. This is not a switch from an instructivist to a constructivist view, nor from a teacher-centred to a student-centred paradigm. It is, rather, a different mind-set where the emphasis is on the goal (i.e., learning) rather than the approach (i.e., instruction). Designing learning opportunities in a technology-enhanced world builds on theories of human learning and cognition, opportunities provided by technology, and principles of instructional design. New technology both expands and challenges some instructional design principles by opening up new opportunities for distance collaboration, intelligent tutoring and support, seamless and ubiquitous learning and assessment technologies, and tools for thinking and thought. In this article, the authors give an account of their own and other research related to instructional and learning design, highlight related European research, and point to future research directions.
Article
Full-text available
The primary focus of the study is to investigate the practice of a teacher implementing the scientific approach in English learning-teaching in one junior high school in Bandung and to reveal the difficulties the teacher encountered in the process. In particular, this study portrays the occurrence of activities and the quality of the teaching process through a pedagogical microscope. This study employs a descriptive-qualitative research design. The data were procured from classroom observation, analysis of the teacher's lesson plan, and an interview, and were analysed with the Pedagogical Microscope instrument (Suherdi, 2009). The findings are as follows. First, all five stages of the scientific approach were fully executed across the four meetings delivering one material, or one Basic Competence (KD), even though the five stages were not always conducted in every meeting, which differed from the lesson plan. The teacher provided plenty of activities in each stage. The scientific approach implemented by the teacher could engage students in active learning activities and develop various student contributions. The ways the teacher led the active learning activities and the students' contributions varied depending on the stage. The scientific approach, as implemented, successfully fostered students' critical thinking and developed higher-order learning behaviour. Second, the difficulties encountered by the teacher during implementation were students with low English proficiency, time allotment, and the teacher's teaching management.
Article
The research is aimed at investigating teachers' implementation of the scientific approach in English Language Teaching in one state junior high school in Bandung Regency. In addition, this research discusses the conformity of the Scientific Approach implementation with the lesson plans based on the 2013 curriculum. This research employs a case-study qualitative research design. The data were obtained from classroom observation, analysis of the teachers' lesson plans, and interviews. The findings showed that the teachers implemented the scientific stages in English Language Teaching: they conducted observing, questioning, experimenting, associating, and communicating as a sequence of activities. Besides, the teachers could demonstrate student-centred learning strengthened by collaborative, cooperative, active, and meaningful learning. However, concerning the conformity of the implementation with the lesson plans, based on the indicators, learning objectives, learning materials, learning media, scientific stages, and Scientific Approach model (discovery learning, inquiry learning, problem-based learning, and project-based learning), the teachers still have to underline and mention the Scientific Approach model and state the learning objectives. Furthermore, the other components were presented well in both the teaching and the lesson plans.
Keywords: English language teaching, lesson plan, scientific approach, teaching practice, the 2013 curriculum