Unterrichtswissenschaft
https://doi.org/10.1007/s42010-019-00052-9
Orchestration tools to support the teacher during
student collaboration: a review
Anouschka van Leeuwen · Nikol Rummel
© The Author(s) 2019
Abstract Teachers play an important role during student collaboration by moni-
toring and stimulating interactions between students that are effective for learning.
In the present paper, we review existing research concerning orchestration tools de-
veloped for teachers that take data concerning collaborating students as input and
provide analyses or visualizations of the data for the benefit of more effective teacher
guidance of student collaboration. Studies were coded for their methodological de-
sign, the function of the orchestration tool (mirroring, alerting, or advising), the
type of information that was provided to the teacher (cognitive or social), and on
what level the influence of the tool was evaluated (teacher or student level). It was
found that most studies had a descriptive or exploratory design, with small sample
sizes. Most orchestration tools fulfilled a mirroring function. There was diversity in
the type of information provided. Most included studies focused on the influence
of the tool on the teacher, and those studies showed mixed findings on whether the
orchestration tool enhanced their practice. Recommendations for future research are
provided, and include the need for more systematic development and comparison of
the various characteristics of orchestration tools.
Keywords Collaborative learning · Teacher orchestration · Learning analytics
A. van Leeuwen ()
Department of Education, Utrecht University, Heidelberglaan 1, 3584CS Utrecht, The Netherlands
E-Mail: A.vanLeeuwen@uu.nl
N. Rummel
Pädagogische Psychologie, Institut für Erziehungswissenschaft, Ruhr-Universität Bochum,
Universitätsstraße 150, 44801 Bochum, Germany
1 Introduction
Collaborative learning (CL) denotes situations in which two or more students work
together on a shared goal (Dillenbourg 1999). Although collaborative problem solv-
ing has been shown to be effective for learning (e.g., Kyndt et al. 2014), adequate
support is necessary to ensure its effectiveness. Without proper support, problems
such as groups straying off-task or social conflict within a group are likely to occur.
The field of computer-supported collaborative learning
(CSCL) is based on the idea of facilitating or supporting CL by means of digital
tools (Stahl et al. 2006). CSCL environments can for example provide collabora-
tion scripts that aid students in interacting with each other, or provide explanation
prompts that help students solve the task at hand. As in the context of CSCL the col-
laboration occurs in a digital setting, students’ activities can often be automatically
captured by the system. Thus, a wealth of data becomes available that can be used to
study what activities predict the effectiveness of CL, and to use these insights to pro-
vide automated, real time support that is tailored to a group’s needs. Such automated
support systems have indeed been shown to be effective in contexts of individual
work (Gerard et al. 2015; VanLehn 2011). However, as Gerard et al. describe, these
studies mostly concern mathematical problems that have clear-cut correct answers,
in which case it is easier to apply if-then rules to provide automated guidance. In
contrast, in the context of collaboration, it is much more difficult to provide auto-
mated, adapted support (Walker et al. 2009). This difficulty arises because students
often work on open-ended problems that have multiple pathways of arriving at a so-
lution or even multiple correct solutions (Munneke et al. 2007). Furthermore, in the
case of collaboration, students need guidance not only on the cognitive level, but to
a large extent also in the social domain: the way students collaborate and interact
with each other needs to be monitored and supported (Kaendler et al.
2015). Although promising steps have been made concerning adaptive collaboration
support, it remains challenging to automatically determine the appropriate support
at any single time point because there are several levels on which assistance needs
to be delivered during student collaboration (Walker et al. 2009).
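To make this contrast concrete, consider the minimal sketch below. It is our own illustration, not code from any system discussed here; all state variables and thresholds are assumptions. A rule for a clear-cut answer can be derived directly from the task, whereas rules about the social plane of collaboration immediately require arbitrary cut-offs that stand in for pedagogical judgment.

```python
# Illustrative sketch only (not from any reviewed system): a minimal if-then
# rule engine for automated guidance. The state variables and thresholds are
# our own assumptions for the example.

from dataclasses import dataclass
from typing import Optional

@dataclass
class GroupState:
    answer: Optional[str]   # the group's current answer, if any
    idle_minutes: float     # time since the last contribution
    turn_balance: float     # 0 = one student dominates, 1 = fully equal turns

def automated_guidance(state: GroupState, correct_answer: str) -> Optional[str]:
    """Return an automated hint, or None if no rule fires."""
    # Tasks with a clear-cut correct answer support simple if-then rules.
    if state.answer is not None and state.answer != correct_answer:
        return "Your answer does not match the expected result; re-check your steps."
    # Social aspects of open-ended collaboration resist such rules: a cut-off
    # like 0.3 is an arbitrary stand-in for a teacher's judgment.
    if state.turn_balance < 0.3:
        return "Try to involve all group members in the discussion."
    if state.idle_minutes > 5:
        return "You seem stuck; would a hint about the problem goal help?"
    return None

# Example: a group with a wrong answer triggers the task-content rule.
print(automated_guidance(GroupState(answer="41", idle_minutes=1.0,
                                    turn_balance=0.8), correct_answer="42"))
```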
In the context of CL and CSCL, the teacher therefore plays a large role in
making sure the types of interactions between students occur that are effective
for learning (Van Leeuwen and Janssen 2019). As Swidan et al. (this issue) note,
teachers need a thorough conceptual understanding of the progress of the group
to provide adequate support for the group. While the importance of the teacher
during CL is stressed, it is also acknowledged that it is a demanding task for the
teacher to monitor multiple groups at the same time and to decide about the adequate
type of support at any given moment without disrupting the collaborative process
(Kaendler et al. 2015; Van Leeuwen and Janssen 2019). A possible way to aid
teachers in this task is by providing them with information about their collaborating
students. Besides using automatically collected data about collaborating students
for a direct form of automated support, these data can also be used as input for
teacher tools that aim to indirectly support students by informing the teacher of
their students’ activities (Rummel 2018). Thus, the assumption is that by enhancing
teachers’ understanding or diagnosis of the collaborative activities of their students,
teachers can provide more adequate support, and in turn enhance the effectiveness
of the collaboration. It is thus assumed that such tools aid the teacher to orchestrate
student collaboration, in the sense of coordinating the learning situation by managing
and guiding the activities within the collaborating groups (Prieto et al. 2011). In the
present paper, we focus on what we henceforth term orchestration tools developed
for teachers that take data concerning collaborating students as input and provide
analyses or visualizations of the data to teachers for the benefit of more effective
teacher guidance of student collaboration.
Orchestration tools thus build upon learning analytics, an area of research that is
summarized as the measurement, collection, analysis, and reporting of data about
learners and their contexts, for purposes of understanding and optimizing learning
and the environments in which it occurs (Lang et al. 2017). Recent reviews of the
field of learning analytics (Papamitsiou and Economides 2014), monitoring tools
(Rodríguez-Triana et al. 2017), and learning analytics dashboards (visual displays
that provide information for students or teachers, see Schwendimann et al. 2017) all
show that the teacher is increasingly being targeted as a potential user for which ana-
lytics about students could have large benefits. Sergis and Sampson (2017) reviewed
tools that support teacher inquiry in the classroom, that is, tools that allow teachers
to analyze the effectiveness of their own teaching, including analytics about learners
that help the teacher to reflect on and improve their own practice. These reviews all
point out that initial findings show the positive role learning analytics can play for
teachers. On the other hand, the reviews also show that the type of analytics that
are collected and displayed show large diversity, and that most empirical studies
evaluating the effects of such tools are of an exploratory nature. Thus, drawing firm
conclusions remains difficult.
Relevant to the present paper, these reviews do not specifically focus on the
context of (CS)CL and the hypothesized affordances or challenges of teacher or-
chestration tools in that context. Sharples (2013) describes the unique nature of the
task of the teacher during (CS)CL and how this task might be further complicated by
the addition of orchestration tools, because it is not only the interaction between the
teacher and the tool that is important, but also the interactions between the teacher
and the students, between the students and the CSCL platform, and the interactions
between the students themselves. As Sharples (2013) notes, it has to be carefully
considered what function the orchestration tool fulfills and what this means for the
teacher’s role in the classroom.
If we look at examples of studies that examined teacher orchestration tools fo-
cused on supporting (CS)CL, it becomes apparent that there is diversity in the types
of research that are carried out. First, the studies differ with respect to the function
that the orchestration tools fulfill. Some systems provide analytics about collabo-
rating students to the teacher, and leave all interpretation of the information up to
the teacher (i.e., mirroring, for example Looi and Song 2013). Other tools go a step
further and provide alerts (e.g., Casamayor et al. 2009) or even advice (e.g., Berland
et al. 2015) about how the teacher could act in a certain collaborative situation. Sec-
ond, studies that evaluate orchestration tools differ in terms of their design, ranging
from exploratory to experimental, and showing diversity in their sample sizes.
There is currently no review of the research available that would allow us to
even draw preliminary conclusions about what function an orchestration tool should
fulfill, and how strong the evidence for these claims is. Such a review could not only
provide a summary of the state of the research, but also a means to discuss what
aspects deserve more attention in future research and thus be valuable input to the
discussions surrounding teacher orchestration tools.
In the present paper, we aim to take a first step toward filling this gap by providing
an overview of studies that describe and evaluate teacher orchestration tools in the
context of student collaboration, henceforth abbreviated as TOSC tools. Thus, we
focus on studies that not only present an orchestration tool, but also investigate
teachers’ use of the tool and the resulting influence on the teacher or the effect on
the collaborating students, or both.
The remainder of this paper is structured as follows. In Sect. 2 (Method), we
outline how we identified the papers included in this review and what information we
extracted from each paper. In Sect. 3 (Results), we present and discuss a summary
of the characteristics of the included studies. In Sect. 4 (Discussion), we outline
directions for future research based on the current state of this area of research.
2 Method
The review was conducted in two steps: 1) find relevant studies by means of a search
query and a snowballing technique, and 2) extract information from the included
studies. Each of these steps is explained in more detail below.
2.1 Step 1: Find relevant studies
As outlined above, this review was performed in a very specific area of research.
We therefore had a set of strict inclusion criteria that studies had to meet to be in-
cluded in the review. To summarize, we were interested in studies with the following
characteristics:
Context: Studies had to be conducted in the context of synchronous collaborative
learning, because in these contexts teachers can have a direct impact on the inter-
action between students. Studies dealing with asynchronous collaboration, such as
discussion forums, were therefore out of the scope of this review.
Technology: Studies had to involve a TOSC tool, that is, an orchestration tool
that displays some form of analytics to the teachers for enabling them to support
collaborating students.
Focus: Studies needed to include the evaluation of the TOSC tool in terms of
reporting about a situation in which teachers actually made use of the tool. Note
that this means that studies that implemented the tool for teacher use, and reported
on student outcome measures instead of teacher measures, were also included.
2.1.1 Search query
In accordance with these criteria, a search query was composed for the search engine
WebOfScience (see Appendix). The search query consisted of three parts, namely
one part with keywords for collaborative learning, one part with keywords concern-
ing teacher presence, and the final part consisting of keywords for orchestration tools.
Whereas the keywords for the first two parts were quite straightforward, this was not
the case for keywords concerning orchestration tools. In this area of research, many
different terminologies are employed due to the area’s interdisciplinary nature. For
example, besides the more obvious terms “learning analytics”, “dashboards”, and
“orchestration tool”, other terms that are being used include “smart classroom”
(Mercier 2016) and “artificial intelligence techniques” (Slotta et al. 2013). To avoid
the excess of search results that we would obtain if we included these broad key-
words, we chose a very limited set of keywords. To make sure we would find as many
relevant papers as possible, we opted for an elaborate snowballing technique after
performing this initial conservative search query. The search query was entered into
WebOfScience in February 2019, and resulted in 113 articles that potentially could
be included in the review.
After reading the abstracts and method sections of these papers, 9 remained that
adhered to the inclusion criteria. The resulting set of papers is thus quite limited,
largely because in very few studies, an evaluation of the TOSC tool of any kind
was performed. In many of the search results a TOSC tool was described in terms
of its design and underlying analyses, but the report did not include the subsequent
step of teachers actually using it. The review by Schwendimann et al. (2017) shows
a similar finding: none of the 21 studies that they identified that targeted teachers
as users of analytics dashboards provided any evaluation. Similarly, a large number
of our initial WebOfScience search results concerned studies that analyzed student
behavior in some way, and indicated that the results might be useful for teachers,
but did not implement or evaluate the use of the analytics for that purpose (e.g.,
Cukurova et al. 2018).
2.1.2 Snowballing
After we obtained these 9 studies, we used a snowballing technique to uncover
other relevant studies. In particular, we used three sources as a starting point. First,
we checked the reference lists of the reviews by Sergis and Sampson (2017) and
Schwendimann et al. (2017), which are closest in focus to our aims, for potentially
relevant studies. Second, we checked the archives of the ijCSCL and AIED journals
for potentially relevant articles because, again, these journals are closest in focus
to our purposes. Third, we checked the reference lists of the 9 studies we obtained
from the WoS search. Snowballing led to the identification of 16 additional papers
that were included in the review. Also, we included the paper by Swidan et al. that
can be found in the current special issue, leading to a total of 26 included studies.
2.2 Step 2: Extracting information from the included studies
Once the final list of studies was collected, we proceeded to extract information
from the manuscripts. We listed the following information for each included study:
– Sample size in terms of the number of participating teachers.
– Design of the study (i.e., descriptive, experimental, etc.).
– The function that the TOSC tool fulfilled. The tools were classified as either mirroring, alerting, or advising. By mirroring, we mean systems that provide information but do not aid in the interpretation thereof. By alerting, we mean systems that in some way alert the teacher to important events during collaboration. By advising, we mean systems that advise the teacher about the status of the current situation or about possible ways to act to support students.
– The type of analytics that the TOSC tool displayed. The analytics were broadly categorized as cognitive or social. By cognitive analytics we mean indicators related to the task content, such as how many tasks are solved. By social analytics we mean indicators related to the collaboration between students, such as graphs that display each member’s contributions to the task.
– The actors from which data was collected. We coded whether studies measured the influence of the orchestration tool on the teacher or on the collaborating students, or both.
Table 1 Included studies and their characteristics

| # | Study | Source | Design | N teachers | Function of orchestration tool | Type of analytics | Examples of displayed information | Measures |
|---|-------|--------|--------|------------|--------------------------------|-------------------|-----------------------------------|----------|
| 1 | Alavi and Dillenbourg (2012) | S | Descriptive | 3 | Mirroring/Alerting | Cognitive | The particular exercise students are working on, and whether students have requested support | Teacher + student level |
| 2 | Berland et al. (2015) | S | Quantitative, pre-posttest design | 3 | Advising | Cognitive + Social | The tool advises which students to group based on several cognitive and social indicators | Teacher + student level |
| 3 | Casamayor et al. (2009) | S | (Quasi) Experimental | 1 | Alerting | Cognitive + Social | Cognitive: e.g., reading activity; Social: e.g., participation rates | Teacher + student level |
| 4 | Chounta and Avouris (2016) | WoS | (Quasi) Experimental | 2 | Mirroring/Alerting | Cognitive + Social | Cognitive: e.g., inputted answers on the task; Social: e.g., quality of collaboration | Teacher level |
| 5 | Duque et al. (2015) | WoS | Quantitative, Wilcoxon rank sum test | 2 | Advising | Cognitive + Social | The tool advises which students to group based on several social and cognitive indicators | Student level |
| 6 | Looi and Song (2013) | S | Descriptive | 1 | Mirroring | Cognitive + Social | Cognitive: e.g., types of contributions; Social: e.g., participation rates | Teacher level |
| 7 | Marcos-Garcia et al. (2015) | S | Descriptive | 1 | Mirroring/Alerting | Social | Role identification of teacher and student | Teacher + student level |
| 8 | Martinez-Maldonado et al. (2015a) | WoS | (Quasi) Experimental | 3 | Alerting | Cognitive | Amount of activity, correctness of answers | Teacher + student level |
| 9 | Martinez-Maldonado et al. (2013) | S | Descriptive + Correlational | 1 | Mirroring | Cognitive + Social | Cognitive: e.g., progress on task; Social: e.g., participation rates | Teacher + student level |
| 10 | Martinez-Maldonado et al. (2015b) | WoS | Descriptive | 3 | Mirroring | Cognitive + Social | Cognitive: e.g., duration of activities; Social: e.g., participation rate | Teacher level |
| 11 | Melero et al. (2015) | S | Descriptive | 2 | Mirroring | Cognitive + Social | Cognitive: e.g., correctness of answers; Social: e.g., participation rate | Teacher + student level |
| 12 | Mercier (2016) | S | Descriptive | 2 | Mirroring | Cognitive | Correctness of answers | Teacher + student level |
| 13 | Rodriguez-Triana et al. (2015) | WoS | Descriptive | 2 | Mirroring | Not clearly outlined | Not clearly outlined | Teacher level |
| 14 | Schwarz et al. (2018) | S | Descriptive | 1 | Mirroring/Alerting | Cognitive + Social | Cognitive: e.g., off-task behavior; Social: e.g., types of contributions | Teacher level |
| 15 | Schwarz and Asterhan (2011) | S | Descriptive | 3 | Mirroring | Social | Group relations, types of interactions | Teacher + student level |
| 16 | Segal et al. (2017) | S | Descriptive | 2 | Mirroring/Alerting | Cognitive + Social | Cognitive: e.g., off-task behavior; Social: e.g., types of contributions | Teacher level |
| 17 | Slotta et al. (2013) | S | Descriptive | Not clear | Mirroring/Advising | Cognitive | Topics students are working on, correctness of solutions | Teacher level |
| 18 | Swidan et al. (2019) | Special Issue | (Quasi) Experimental | 11 | Mirroring/Alerting | Cognitive + Social | Cognitive: e.g., off-task behavior; Social: e.g., types of contributions | Teacher level |
| 19 | Trausan-Matu et al. (2014) | WoS | Descriptive + (Quasi) Experimental | 5 + 4 | Mirroring | Cognitive + Social | Cognitive: e.g., content of discussions; Social: e.g., types of contributions | Teacher + student level |
| 20 | Van Leeuwen et al. (2017) | S | Descriptive | 1 | Mirroring | Cognitive | Use of keywords in discussions, progress on task | Teacher level |
| 21 | Van Leeuwen et al. (2014) | WoS | (Quasi) Experimental | 28 | Mirroring | Social | Participation rates, types of interactions | Teacher level |
| 22 | Van Leeuwen et al. (2015a) | WoS | (Quasi) Experimental | 40 | Mirroring | Cognitive | Use of keywords in discussions, progress on task | Teacher level |
| 23 | Van Leeuwen et al. (2015b) | WoS | Descriptive | 2 | Mirroring | Social | Participation rates, types of interactions | Teacher level |
| 24 | Voyiatzaki and Avouris (2014) | S | (Quasi) Experimental | 12 | Alerting | Cognitive + Social | Cognitive: e.g., inputted answers; Social: e.g., types of interactions | Teacher level |
| 25 | Wichmann et al. (2009) | S | (Quasi) Experimental | 2 | Mirroring | Social | Sequences of interactions | Teacher level |
| 26 | Yen et al. (2015) | S | Descriptive | 3 | Mirroring | Cognitive + Social | Cognitive: e.g., duration of activities; Social: e.g., types of contributions | Student level |

Source is coded as WoS = WebOfScience and S = Snowballing
This information is summarized in Table 1 (see the Results section). Furthermore,
we carefully read the Results and Discussion sections from each included article,
and synthesized the most important themes and discussion points that the authors
described in terms of the usefulness and influence of the TOSC tools on teachers’
practice.
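To make the coding dimensions concrete, and in view of our later recommendation that researchers reuse this terminology, the sketch below renders them as a small data model. This is our own illustration (none of the reviewed studies publishes such code); the example instance reproduces entry 2 of Table 1.

```python
# A sketch (our own illustration) of the coding scheme as a data structure;
# the dimensions and the example values mirror the columns of Table 1.

from dataclasses import dataclass
from enum import Enum

class Function(Enum):
    MIRRORING = "mirroring"  # displays information; interpretation left to teacher
    ALERTING = "alerting"    # flags important events during collaboration
    ADVISING = "advising"    # suggests how the teacher could act

class Analytics(Enum):
    COGNITIVE = "cognitive"  # task-related, e.g., number of tasks solved
    SOCIAL = "social"        # interaction-related, e.g., members' contributions

class Level(Enum):
    TEACHER = "teacher"
    STUDENT = "student"

@dataclass
class CodedStudy:
    study: str
    source: str       # "WoS" or "S" (snowballing)
    design: str
    n_teachers: int
    functions: set    # set of Function values
    analytics: set    # set of Analytics values
    measures: set     # set of Level values

# Entry 2 of Table 1 expressed in this scheme:
berland_2015 = CodedStudy(
    study="Berland et al. (2015)",
    source="S",
    design="Quantitative, pre-posttest design",
    n_teachers=3,
    functions={Function.ADVISING},
    analytics={Analytics.COGNITIVE, Analytics.SOCIAL},
    measures={Level.TEACHER, Level.STUDENT},
)
```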
3 Results
Table 1 displays the full list of included studies and the information that we extracted
from each study. Below, we discuss each coded aspect.
3.1 Design and number of participants
The majority of included studies (14 out of 26) had a descriptive or exploratory
design. In these studies, there was either a qualitative investigation of how teach-
ers employed the TOSC tool or if there were quantitative measures, there was no
statistical testing of the results. These studies all had a sample of 1 to 3 teachers,
which, for example, allowed for in-depth investigation of teachers’ behavior and
choices while supporting students, partly mediated by the TOSC tool (e.g., Schwarz
and Asterhan 2011). In the studies that had an experimental or quasi-experimental
design, we can see that the number of participating teachers was generally quite low.
Examples of experimental designs include the comparison of teachers’ behavior or
perceptions when they were provided with a TOSC tool versus when they were
not (e.g., Casamayor et al. 2009; Swidan et al., this issue) or teachers’ interaction
with the TOSC tool in the context of supporting a small versus a larger number
of collaborating groups (Chounta and Avouris 2016). Thus, with a few exceptions
(e.g., Van Leeuwen et al. 2015a), the overall picture is that the conducted studies
generally had small sample sizes, and that the employed methodology often entailed
qualitative investigation or within-subjects comparisons.
3.2 Function of orchestration tool
The majority of reported TOSC tools fulfilled a Mirroring function, that is, infor-
mation was made available to the teacher, but further interpretation thereof was left
to the teacher. In some cases, we found combinations of Mirroring and Alerting
functions (e.g., Schwarz et al. 2018) in which teachers could peruse the provided
information and were also given alerts that something might be going wrong within
one of the collaborating groups. We found two studies (Berland et al. 2015; Duque
et al. 2015) that evaluated an Advising orchestration tool. In both cases, the tool
advised the teacher concerning which students to pair into collaborating dyads to
make their collaboration more effective.
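To illustrate what such an advising function might compute, consider the sketch below. It is emphatically not the algorithm of Berland et al. (2015) or Duque et al. (2015); it is a hypothetical greedy heuristic over assumed per-student indicators with equal weighting.

```python
# Hypothetical sketch of an advising function in the spirit of the pairing
# tools above; NOT the actual algorithm of the cited studies, merely a
# plausible greedy heuristic over assumed indicator scores.

def pairing_advice(students: dict) -> list:
    """students maps a name to indicator scores in [0, 1], e.g.
    {"ada": {"task_progress": 0.9, "participation": 0.4}}.
    Returns suggested dyads as (weaker, stronger) name pairs."""
    def combined(scores: dict) -> float:
        # Equal weighting of cognitive and social indicators is an assumption.
        return sum(scores.values()) / len(scores)

    ranked = sorted(students, key=lambda name: combined(students[name]))
    pairs = []
    # Greedy heuristic: match the weakest with the strongest student, the
    # second weakest with the second strongest, and so on; with an odd number
    # of students, one remains unpaired in this sketch.
    while len(ranked) >= 2:
        pairs.append((ranked.pop(0), ranked.pop(-1)))
    return pairs

print(pairing_advice({
    "ada": {"task_progress": 0.9, "participation": 0.4},
    "ben": {"task_progress": 0.2, "participation": 0.3},
    "cem": {"task_progress": 0.6, "participation": 0.8},
    "dee": {"task_progress": 0.1, "participation": 0.5},
}))  # -> [('ben', 'cem'), ('dee', 'ada')]
```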
3.3 Type of analytics and examples of displayed information
In the included studies, there was considerable variation concerning the types of
information provided to the teacher by the TOSC tool. Some tools focused on cog-
nitive aspects of collaboration (6), others on social aspects (5), and quite a number
of orchestration tools provided both types of information (14). Often encountered
types of cognitive information include: the topics that groups are working on or dis-
cussing, and the correctness of answers that groups input. Often encountered social
types of information are the participation rates for each member of the collaborating
group, and the types of interactions or contributions that students engage in.
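The following sketch (again our own illustration, with an assumed log format) shows how one indicator of each type could be derived from CSCL log events: participation rates as a social indicator and answer correctness as a cognitive one.

```python
# Minimal sketch with an assumed (student, event_type, payload) log format,
# deriving one social and one cognitive indicator of the kinds listed above.

from collections import Counter

log = [
    ("ada", "message", "I think the answer is 42"),
    ("ben", "answer", "41"),
    ("ada", "answer", "42"),
    ("ada", "message", "let's check it"),
]

def participation_rates(events) -> dict:
    """Social indicator: each member's share of contributions."""
    counts = Counter(student for student, _, _ in events)
    total = sum(counts.values())
    return {student: n / total for student, n in counts.items()}

def correctness(events, correct: str) -> float:
    """Cognitive indicator: fraction of submitted answers that are correct."""
    answers = [payload for _, kind, payload in events if kind == "answer"]
    return sum(a == correct for a in answers) / len(answers) if answers else 0.0

print(participation_rates(log))   # {'ada': 0.75, 'ben': 0.25}
print(correctness(log, "42"))     # 0.5
```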
3.4 Measures and results of studies
The influence of the TOSC tools was measured in a number of ways, concerning
both the teacher and the collaborating students. Most studies focused on the influence
of the TOSC tool on the teacher using it (24), for example, in terms of teachers’
satisfaction with the tool (e.g., Berland et al. 2015), how teachers interacted with
the tool (e.g., Voyiatzaki and Avouris 2014), or how teachers’ support of groups
changed as a result of having a TOSC tool available to them, either in terms of
awareness/diagnosis of the groups’ state (e.g., Swidan et al., this issue) or in terms
of interventions (e.g., Van Leeuwen et al. 2014). Measures at the student level were
less frequently encountered (12), but also showed variation. Examples of measures
at the student level include student perceptions of the collaboration (e.g., Duque
et al. 2015), students’ progression on tasks (e.g., Casamayor et al. 2009), the quality
of collaboration (Marcos-Garcia et al. 2015), and students’ skills as a result of
collaboration (e.g., programming skills, Berland et al. 2015). The studies that focused
on students all showed a positive influence of the teacher using a TOSC tool, for
example, increased student programming skills in the case of Berland et al. (2015)
when the teacher employed a tool that advised how to pair students.
The studies focusing on the teacher yielded a more complicated picture. On the
one hand, a large part of the studies showed positive results: Teachers evaluated
the use of a TOSC tool in their classroom positively because it provided them with
more information about their students, enhanced their diagnosis of the situation, and
formed input for their further decision making (e.g., Van Leeuwen et al. 2015b).
Voyiatzaki and Avouris (2014) elaborately describe the way teachers use the TOSC
tool as a source of information in different situations, including the detection of
groups that may need support, finding out whether a problem is group specific or
an issue in multiple groups, and finding out more about a group once a problem has
been signaled. Also, some studies found that teachers’ behavior was influenced by
the TOSC tool, for example, in terms of providing more support for the collaborating
groups (Van Leeuwen et al. 2015a) or by adjusting the runtime of activities in the
classroom (Martinez-Maldonado et al. 2015b).
On the other hand, some studies showed that the TOSC tool can also impair
teachers instead of assisting them. For example, Swidan et al. (this issue) report on
an alerting TOSC tool, and found that teachers sometimes felt interrupted in their
practice and that their own experience was a more valuable source to act on. These
authors thus hypothesize that teacher experience might moderate how teachers make
use of the TOSC tool and whether they find it a valuable addition to their practice;
in particular, more experienced teachers would find TOSC tools disruptive to their
routine. Other studies also point at the role of several teacher factors and contextual
factors that influence how teachers use the TOSC tool, such as teachers’ beliefs of
what constitutes effective collaboration (Van Leeuwen et al. 2014) and the number
of groups that a teacher has to monitor (Chounta and Avouris 2016).
4 Discussion
In the present paper, we reviewed studies that evaluated teachers’ use of orches-
tration tools in the context of student collaboration (TOSC tools). Based on our
findings about the current state of the research in this area, we identify a number of
conclusions and related recommendations for future research.
First and foremost, it must be noted that this review included only a relatively
small number of papers (26 studies), and those papers often addressed exploratory
research questions or used very small-scale within-subjects experiments.
This finding is in line with other reviews in the area of learning analytics, for exam-
ple, with Schwendimann et al. (2017) who noted that the available research is largely
of exploratory nature. Furthermore, although the underlying assumption of TOSC
tools is that by supporting the teacher, the effectiveness of student collaboration can
be increased, not many studies so far actually focus on the student level. Maybe the
research field is too young to move to this question, and the specific ways in which
teachers use TOSC tools need to be addressed first. In general, though, it can be
stated that there is not a large body of research yet and that there is a need for more
rigorous work that would allow for firmer conclusions.
A second major finding is that there is large diversity in TOSC tools concern-
ing their function and the type of information they display. On the one hand, this
finding seems positive because it means that teachers can be provided with different
types of support and with different types of information. For example, specific types
of collaboration tasks or specific group characteristics (such as familiarity between
group members) might determine which information is useful for the teacher at that
moment. As Kaendler et al. (2015) and Van Leeuwen and Janssen (2019) point out,
teachers need to support student collaboration both in the cognitive and the social
domain. In that regard, the TOSC tools that we discussed in this review seem to be
able to provide teachers with a variety of types of information that could be adapted
to the specific situation. On the other hand, several studies pointed at the possible
detrimental effects a TOSC tool can have if it is not tailored to a teacher’s needs.
Based on this review, we would therefore argue for a teacher-centered approach
in terms of the implementation and evaluation of TOSC tools. Concerning imple-
mentation, it is important to determine beforehand what type of support a teacher
is most in need of and which data about students is most relevant to display, and
to select a TOSC tool accordingly. For example, in the paper by Wiedmann et al.
(this issue), an instrument is proposed to measure teachers’ monitoring skills, on the
basis of which it could be decided whether mirroring, alerting, or advising is most
appropriate. Concerning evaluation, it is important that more research is carried out
concerning the relation between teacher characteristics such as teaching experience
and teachers’ pedagogical beliefs, and how teachers interact with the TOSC tool. In
the studies we included in this review, there often was very little description of the
teachers’ characteristics in the study’s sample, although several authors do discuss
the potential relevance of such characteristics. Looi and Song (2013) for example
discuss the role of teachers’ pedagogical qualities in terms of their pedagogical and
content knowledge. In that sense, existing learning sciences literature could inform
future research into orchestration tools. For example, Heitink et al. (2016) outline
the skills required from teachers in the wider context of formative assessment, and
stress the importance of factors such as teachers’ data literacy, teachers’ ability to
provide adequate feedback, and the role of the culture at the school in which the
teacher is employed. Investigating the role of such factors in the context of TOSC
tools could advance the field by providing a theoretical basis to build on.
Our last suggestion for future research is to more systematically vary and evaluate
the characteristics of TOSC tools in controlled experiments, and to try to build on
each other’s work more explicitly. As it stands, the research is quite varied in terms
of the design and function of the TOSC tools under investigation. The aspects we
coded in this review allow for combinations of factors that lead to specific designs
of TOSC tools that could subsequently be evaluated. For example, consider the
three functions of tools we outlined (mirroring, alerting, advising) and types of
information provided by the tool (cognitive, social, or both). It would be a good
step forwards if researchers employed this terminology to situate their work and to
contribute to our understanding of which tools are effective and which are not (cf.
Rummel 2018).
To conclude, our review leads to four important recommendations for future
research. The first recommendation is to invest in larger scale as well as more
rigorous and controlled studies. The second recommendation is to also take into
account the influence of TOSC tools on collaborating students as a result of their
teacher using such tools. The third recommendation is to focus on the relation
between teacher characteristics and teachers’ interaction with TOSC tools to provide
insights into what type of TOSC tool best serves the needs of individual teachers.
The fourth recommendation is to build upon each other’s work by systematically
varying and evaluating the dimensions of TOSC tools. The framework we provide
here, in terms of a tool’s function and the type of information it displays, could
serve as a shared vocabulary to inform future research.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 Interna-
tional License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution,
and reproduction in any medium, provided you give appropriate credit to the original author(s) and the
source, provide a link to the Creative Commons license, and indicate if changes were made.
Appendix
Search query
TOPIC: ("collaborative learning" OR "cooperative learning" OR "small group learning" OR "peer-assisted learning" OR "peer-based learning" OR "collaborative instruction" OR "collaborative work" OR "collaborative interaction" OR "collaborative methods" OR "cooperative instruction" OR "cooperative work" OR "cooperative interaction" OR "cooperative methods" OR "small group instruction" OR "small group interaction" OR "small group methods" OR "peer-assisted instruction" OR "peer-assisted work" OR "peer-assisted interaction" OR "peer-assisted methods" OR "peer-based instruction" OR "peer-based work" OR "peer-based interaction" OR "peer-based methods" OR "group work" OR "collaborative dialogue" OR "small-group learning" OR "small-group discussions" OR "peer-to-peer debates" OR "small-group argumentation" OR "student collaboration" OR "cooperative-learning" OR "collaborative networked learning" OR "group discussions" OR "synchronous discussions" OR "small-group work" OR "peer discussion" OR "collaborative reasoning" OR CSCL) AND TOPIC: (moderation OR e-moderation OR tutoring OR e-tutoring OR support OR guiding OR guidance OR assisting OR assistance OR "verbal behavior" OR discourse OR "help giving" OR helping OR instruction OR instructing OR scaffolding OR orchestration OR feedback OR "teaching methods" OR "teaching strategies" OR "teaching presence" OR supporting OR "teacher's instructional practices" OR "helping-behavior" OR "helping-behaviour" OR teacher) AND TOPIC: (dashboard OR "orchestration tool" OR analytics OR "data mining")
References
Alavi, H., & Dillenbourg, P. (2012). An ambient awareness tool for supporting supervised collaborative problem solving. IEEE Transactions on Learning Technologies, 5(3), 264–274. https://doi.org/10.1109/TLT.2012.7

Berland, M., Davis, D., & Smith, C. P. (2015). AMOEBA: Designing for collaboration in computer science classrooms through live learning analytics. International Journal of Computer-Supported Collaborative Learning, 10, 425–447. https://doi.org/10.1007/s11412-015-9217-z

Casamayor, A., Amandi, A., & Campo, M. (2009). Intelligent assistance for teachers in collaborative e-learning environments. Computers & Education, 53(4), 1147–1154. https://doi.org/10.1016/j.compedu.2009.05.025

Chounta, I.-A., & Avouris, N. (2016). Towards the real-time evaluation of collaborative activities: Integration of an automatic rater of collaboration quality in the classroom from the teacher’s perspective. Education and Information Technologies, 21(4), 815–835. https://doi.org/10.1007/s10639-014-9355-3

Cukurova, M., Luckin, R., Millan, E., & Mavrikis, M. (2018). The NISPI framework: Analysing collaborative problem-solving from students’ physical interactions. Computers & Education, 116, 93–109. https://doi.org/10.1016/j.compedu.2017.08.007

Dillenbourg, P. (1999). What do you mean by collaborative learning? In P. Dillenbourg (Ed.), Collaborative-learning: Cognitive and computational approaches (pp. 1–19). Oxford: Elsevier.

Duque, R., Gomez-Perez, D., Nieto-Reyes, A., & Bravo, C. (2015). Analyzing collaboration and interaction in learning environments to form learner groups. Computers in Human Behavior, 47, 42–49. https://doi.org/10.1016/j.chb.2014.07.012

Gerard, L., Matuk, C., McElhaney, K., & Linn, M. C. (2015). Automated, adaptive guidance for K-12 education. Educational Research Review, 15, 41–58. https://doi.org/10.1016/j.edurev.2015.04.001

Heitink, M. C., Van der Kleij, F. M., Veldkamp, B. P., Schildkamp, K., & Kippers, W. B. (2016). A systematic review of prerequisites for implementing assessment for learning in classroom practice. Educational Research Review, 17, 50–62. https://doi.org/10.1016/j.edurev.2015.12.002

Kaendler, C., Wiedmann, M., Rummel, N., & Spada, H. (2015). Teacher competencies for the implementation of collaborative learning in the classroom: A framework and research review. Educational Psychology Review, 27(3), 505–536.

Kyndt, E., Raes, E., Lismont, B., Timmers, F., Dochy, F., & Cascallar, E. (2014). A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educational Research Review, 10, 133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Lang, C., Siemens, G., Wise, A. F., & Gasevic, D. (2017). Handbook of learning analytics. Society for Learning Analytics Research.

Looi, C.-K., & Song, Y. (2013). Orchestration in a networked classroom: Where the teacher’s real-time enactment matters. Computers & Education, 69, 510–513. https://doi.org/10.1016/j.compedu.2013.04.005

Marcos-García, J. A., Martínez-Monés, A., & Dimitriadis, Y. (2015). DESPRO: A method based on roles to provide collaboration analysis support adapted to the participants in CSCL situations. Computers & Education, 82, 335–353. https://doi.org/10.1016/j.compedu.2014.10.027

Martinez-Maldonado, R., Clayphan, A., & Kay, J. (2015b). Deploying and visualizing teacher’s scripts of small group activities in a multi-surface classroom ecology: A study in-the-wild. Computer Supported Cooperative Work, 24(2–3), 177–221. https://doi.org/10.1007/s10606-015-9217-6

Martinez-Maldonado, R., Clayphan, A., Yacef, K., & Kay, J. (2015a). MTFeedback: Providing notifications to enhance teacher awareness of small group work in the classroom. IEEE Transactions on Learning Technologies, 8(2), 187–200. https://doi.org/10.1109/tlt.2014.2365027

Martinez-Maldonado, R., Dimitriadis, Y., Kay, J., Yacef, K., & Edbaurer, M.-T. (2013). MTClassroom and MTDashboard: Supporting analysis of teacher attention in an orchestrated multi-tabletop classroom. In Proceedings of the International Conference on Computer Supported Collaborative Learning (pp. 119–128).

Melero, J., Hernandez-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Educational Technology, 46(2), 317–329. https://doi.org/10.1111/bjet.12238

Mercier, E. (2016). Teacher orchestration and student learning during mathematics activities in a smart classroom. International Journal of Smart Technology and Learning, 1(1), 33–52. https://doi.org/10.1504/IJSMARTTL.2016.078160

Munneke, L., Andriessen, J., Kanselaar, G., & Kirschner, P. A. (2007). Supporting interactive argumentation: Influence of representational tools on discussing a wicked problem. Computers in Human Behavior, 23, 1072–1088. https://doi.org/10.1016/j.chb.2006.10.003

Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Educational Technology & Society, 17, 49–64.

Prieto, L. P., Holenko Dlab, M., Gutierrez, I., Abdulwahed, M., & Balid, W. (2011). Orchestrating technology enhanced learning: A literature review and a conceptual framework. International Journal of Technology Enhanced Learning, 3(6), 583. https://doi.org/10.1504/IJTEL.2011.045449

Rodriguez-Triana, M. J., Martinez-Mones, A., Asensio-Perez, J. I., & Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330–343. https://doi.org/10.1111/bjet.12198

Rodriguez-Triana, M. J., Prieto, L. P., Vozniuk, A., Boroujeni, M., Schwendimann, B., Holzer, A., & Gillet, D. (2017). Monitoring, awareness and reflection in blended technology enhanced learning: A systematic review. International Journal of Technology Enhanced Learning, 9(2), 126–150. https://doi.org/10.1504/IJTEL.2017.10005147

Rummel, N. (2018). One framework to rule them all? Carrying forward the conversation started by Wise and Schwarz. International Journal of Computer-Supported Collaborative Learning, 13(1), 123–129.

Schwarz, B. B., & Asterhan, C. S. (2011). E-moderation of synchronous discussions in educational settings: A nascent practice. Journal of the Learning Sciences, 20(3), 395–442. https://doi.org/10.1080/10508406.2011.553257

Schwarz, B. B., Prusak, N., Swidan, O., Livny, A., Gal, K., & Segal, A. (2018). Orchestrating the emergence of conceptual learning: A case study in a geometry class. International Journal of Computer-Supported Collaborative Learning, 13, 189–211. https://doi.org/10.1007/s11412-018-9276-z

Schwendimann, B., Rodriguez-Triana, M., Vozniuk, A., Prieto, L., Boroujeni, M., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41. https://doi.org/10.1109/TLT.2016.2599522

Segal, A., Hindi, S., Prusak, N., Swidan, O., Livni, A., Schwarz, B., & Gal, K. (2017). Keeping the teacher in the loop: Technologies for monitoring group learning in real-time. In Proceedings of the Artificial Intelligence in Education Conference (pp. 64–76).

Sergis, S., & Sampson, D. G. (2017). Teaching and learning analytics to support teacher inquiry: A systematic literature review. In A. Peña-Ayala (Ed.), Learning analytics: Fundaments, applications, and trends (pp. 25–63).

Sharples, M. (2013). Shared orchestration within and beyond the classroom. Computers & Education, 69, 504–506. https://doi.org/10.1016/j.compedu.2013.04.014

Slotta, J. D., Tissenbaum, M., & Lui, M. (2013). Orchestrating of complex inquiry: Three roles for learning analytics in a smart classroom infrastructure. In Proceedings of the 3rd International Conference on Learning Analytics and Knowledge (pp. 270–274).

Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative learning: An historical perspective. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 409–426). Cambridge: Cambridge University Press.

Swidan, O., Prusak, N., Livny, A., Palatnik, A., & Schwarz, B. B. (2019). Fostering teachers’ online understanding of progression of multiple groups towards the orchestration of conceptual learning. Unterrichtswissenschaft. https://doi.org/10.1007/s42010-019-00050-x (included in the review)

Trausan-Matu, S., Dascalu, M., & Rebedea, T. (2014). PolyCAFe—automatic support for the polyphonic analysis of CSCL chats. International Journal of Computer-Supported Collaborative Learning, 9(2), 127–156. https://doi.org/10.1007/s11412-014-9190-y

Van Leeuwen, A., & Janssen, J. (2019). A systematic review of teacher guidance during collaborative learning in primary and secondary education. Educational Research Review, 27, 71–89. https://doi.org/10.1016/j.edurev.2019.02.001

Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2014). Supporting teachers in guiding collaborating students: Effects of learning analytics in CSCL. Computers & Education, 79, 28–39. https://doi.org/10.1016/j.compedu.2014.07.007

Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015a). Teacher regulation of cognitive activities during student collaboration: Effects of learning analytics. Computers & Education, 90, 80–94. https://doi.org/10.1016/j.compedu.2015.09.006

Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015b). Teacher regulation of multiple computer-supported collaborating groups. Computers in Human Behavior, 52, 233–242. https://doi.org/10.1016/j.chb.2015.05.058

Van Leeuwen, A., Van Wermeskerken, M., Erkens, G., & Rummel, N. (2017). Measuring teacher sense making strategies of learning analytics: A case study. Learning: Research and Practice, 3(1), 42–58. https://doi.org/10.1080/23735082.2017.1284252

VanLehn, K. (2011). The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist, 46(4), 197–221.

Voyiatzaki, E., & Avouris, N. (2014). Support for the teacher in technology-enhanced collaborative classroom. Education and Information Technologies, 19(1), 129–154. https://doi.org/10.1007/s10639-012-9203-2

Walker, E., Rummel, N., & Koedinger, K. (2009). CTRL: A research framework for providing adaptive collaborative learning support. User Modeling and User-Adapted Interaction, 19(5), 387–431.

Wichmann, A., Giemza, A., Krauß, M., & Hoppe, H. U. (2009). Effects of awareness support on moderating multiple parallel e-discussions. In C. O’Malley, D. Suthers, P. Reimann & A. Dimitracopoulou (Eds.), Proceedings of the 9th International Conference on Computer Supported Collaborative Learning (pp. 646–650).

Yen, C.-H., Chen, I.-C., Lai, S.-C., & Chuang, Y.-R. (2015). An analytics-based approach to managing cognitive load by using log data of learning management systems and footprints of social media. Educational Technology & Society, 18(4), 141–158.
... State-of-the-Art Orchestration Solutions. According to Chan, "orchestration" is derived from orchestra in teacher orchestration [79]. Each student interacts with a digital device in a smart classroom to support them in the learning process. ...
Article
Full-text available
The ubiquitous devices and technologies to support teachers and students in a learning environment include the Internet of things (IoT), learning analytics (LA), augmented or virtual reality (AR/VR), ubiquitous learning environment (ULE), and wearables. However, most of these solutions are obtrusive, with substantial infrastructure costs and pseudo-real-time results. Real-time detection of students’ activeness, participation, and activity monitoring is important, especially during a pandemic. This research study provides a low-cost teacher orchestration solution with real-time results using off-the-shelf devices. The proposed solution determines a teacher’s activeness using multimodal data (MMD) from both teacher and student’s devices. The MMD extracts different features from data, decodes them, and displays them to the instructor in real time. It allows the instructor to update their teaching methodology in real time to get more students on board and provide a more engaging learning experience. Our experimental results show that real-time feedback about the classroom’s current status helped improve learning outcomes by about 45%. Also, we investigated a 50% increase in classroom engaging experience.
... When students go to breakout rooms to work on group tasks, TAs cannot commonly observe what is happening in each room unless they join a particular room and spend some time listening to the conversations or checking the documents generated by the group. This is a critical monitoring challenge that has been reported in CSCL and orchestration literature [43]. Moreover, although Zoom has become one of the staple communication tools to conduct learning activities around the world during the COVID-19 pandemic, it does not offer functionalities that teachers can use to "see" what is happening when several groups are working at once [37], and particularly if other tools are being used (e.g. ...
Conference Paper
Full-text available
One of the ultimate goals of several learning analytics (LA) initiatives is to close the loop and support students’ and teachers’ reflective practices. Although there has been a proliferation of end-user interfaces (often in the form of dashboards), various limitations have already been identified in the literature such as key stakeholders not being involved in their design, little or no account for sense-making needs, and unclear effects on teaching and learning. There has been a recent call for human-centred design practices to create LA interfaces in close collaboration with educational stakeholders to consider the learning design, and their authentic needs and pedagogical intentions. This paper addresses the call by proposing a question-driven LA design approach to ensure that end-user LA interfaces explicitly address teachers’ questions. We illustrate the approach in the context of synchronous online activities, orchestrated by pairs of teachers using audio-visual and text-based tools (namely Zoom and Google Docs). This study led to the design and deployment of an open-source monitoring tool to be used in real-time by teachers when students work collaboratively in breakout rooms, and across learning spaces.
Chapter
Over the last two decades, education in Singapore has shifted its focus from teacher-centred teachingTeacher-centred teaching to student-centred learningStudent-centred learning in an effort to meet all learning needs. This shift relies on teachers being responsive to a student’s individual and collective needs in terms of the curriculum, pedagogy and assessment. With this goal in mind, this chapter first unpacks the complexity of teaching and learningComplexityteaching and learning and illuminates how it imposes high demands on the workload, judgementTeacherjudgement and knowledge of teachers as they adapt and respond to students’ needs in the classroom. The chapter then explores artificial intelligence for educationArtificial intelligence for education (AIED) with a two-by-two categorisation: two for artificial intelligenceArtificial intelligence (AI) augmenting teachers or students, and two for individual or collaborative learningCollaborative learning. It introduces intelligent tutoring systemsSystemsIntelligent tutoring systems (ITSs) and classroom orchestration systemsSystemsClassroom orchestration systems as the AIED tools that can help teachers deal with the complexities of teaching an individual student or a group of students. It also introduces how AIED can augment students in creating new knowledge. In this case, AIED can increase the complexity of learning and forming partnerships with students to create new knowledge. The chapter then highlights three important considerations for effective use of AIED in classrooms, namely: AIED tools need to be robust and explainable, humans need to work in partnership with AI, and AI governanceAI governance is critical.
Chapter
Full-text available
There is little doubt about the significant role the educators play in supporting the collaboration process through monitoring and supporting effective interactions. However, little work explores the educators’ needs and understandings of the analytics generated to measure the process of collaboration in online learning settings. In this chapter, we first explain a new method of measuring the process of collaboration (CLaP) by drawing upon the collaborative cognitive load theory and utilising social network analysis. Then, we report the results of two educator workshops and a survey that investigated the educators’ understanding of the collaboration process visualisations compared to more commonly used participation measures such as the number of posts and the number of views. Our results show that although educators can indeed gain more insights into the collaboration process with CLaP visualisations, these are still considered limited and too complex to be easily adopted in practice. Moreover, currently, many educators are not evaluating the collaboration process in online settings at all, or when they do, they only rely on participation measures. We conclude the chapter with a discussion on the findings and their future implications.KeywordsCollaboration analyticsTeacher evaluationsLearning analytics
Article
Full-text available
Teacher dashboards are visual displays that provide information to teachers about their learners. In this article, we address teacher dashboards in the context of computer-supported student collaboration in primary education. We examine the role of different types of dashboards for the specific purpose of aiding teachers in identifying which group of collaborating students is in need of support. This question is addressed using qualitative and quantitative approaches. First, an interview study is reported in which teachers’ (n = 10) views on and perceptions of the acceptability of different types of dashboards were examined. Then, the results of an experimental vignette study are reported, which built upon the interview study and in which teachers (n = 35) interacted with mirroring or advising dashboards. Together, the studies revealed that the classroom situation, such as differing levels of time pressure, plays an important role in what type of dashboard is beneficial for a teacher to use in the classroom. The theoretical contribution of our study lies in a conceptual and empirical investigation of the relation between teachers’ need for control and their perception of different types of dashboards. Our study also points to several practical implications and directions for future research.
Article
Full-text available
Mapping the multidimensional impact of learner attributes on behavior demonstrates the importance of models in learning. To this end, we examined the correlations between strategies and student characteristics and utilized regression analysis to determine how learner attributes affect strategy selection. A cross-sectional study of 258 students demonstrated widespread strategy use, as well as statistically significant connections within and between the Strategy Inventory for Language Learning and the Student Characteristics of Learning measures. Regression analysis found distinctions in the types of learner characteristics associated with strategy adoption, most notably between direct and indirect strategies. Instrumental motivation predicted both direct and indirect Strategy Inventory for Language Learning scores, whereas self-efficacy affected memory, cognitive, and compensatory strategies, and perseverance predicted reported metacognitive and emotional strategy choice levels. Additionally, negative path coefficients occurred between persistence and compensation strategies and between competition and memory strategies, implying mediation and a high degree of complexity in the way learner traits impact behavior. The present study’s findings have implications for instructor techniques for motivating students to become fully engaged in online language learning.
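A minimal sketch of the kind of regression analysis described (hypothetical data and variable names; statsmodels is assumed as the estimation library, which the abstract does not specify):

import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data: learner attributes predicting a strategy score.
df = pd.DataFrame({
    "instrumental_motivation": [3.2, 4.1, 2.8, 3.9, 4.5, 2.5],
    "self_efficacy":           [3.0, 4.4, 2.6, 3.5, 4.2, 2.9],
    "metacognitive_strategy":  [3.1, 4.3, 2.7, 3.8, 4.4, 2.6],
})

# Ordinary least squares: which attributes predict reported strategy use?
X = sm.add_constant(df[["instrumental_motivation", "self_efficacy"]])
model = sm.OLS(df["metacognitive_strategy"], X).fit()
print(model.summary())  # coefficients and p-values per learner attribute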
Article
Full-text available
Orchestrating collaborative learning (CL) is difficult for teachers as it involves being aware of multiple simultaneous classroom events and intervening when needed. Artificial intelligence (AI) technology might support teachers’ pedagogical actions during CL by helping detect students in need and providing suggestions for intervention. This would result in AI and teacher co-orchestrating CL, the effectiveness of which, however, is still in question. This study explores whether having an AI assistant help the teacher orchestrate a CL classroom is understandable for the teacher and whether it affects the teacher’s pedagogical actions, understanding and strategies of coregulation. Twenty in-service teachers were interviewed using a Wizard-of-Oz protocol. Teachers were asked to identify problems during the CL of groups of students (shown as videos), to propose how they would intervene, and later to receive and evaluate the pedagogical actions suggested by an AI assistant. Our mixed-methods analysis showed that the teachers found the AI assistant useful. Moreover, in multiple cases the teachers started employing the pedagogical actions the AI assistant had introduced to them, and an increased number of coregulation methods was employed. Our analysis also explores the extent to which teachers’ expertise is associated with their understanding of coregulation; for example, less experienced teachers did not see coregulation as part of a teacher’s responsibility, while more experienced teachers did.
Chapter
Orchestration tools may support K-12 teachers in facilitating student learning, especially when designed to address classroom stakeholders’ needs. Our previous work revealed a need for human-AI shared control when dynamically pairing students for collaborative learning in the classroom, but offered limited guidance on the role each agent should take. In this study, we designed storyboards for scenarios where teachers, students and AI co-orchestrate dynamic pairing when using AI-based adaptive math software for individual and collaborative learning. We surveyed 54 math teachers on their co-orchestration preferences. We found that teachers would like to share control with the AI to lessen their orchestration load. They would also like the AI to propose student pairs with explanations and to identify risky proposed pairings. However, teachers are hesitant to let the AI auto-pair students even if they are busy, and are less inclined to let the AI override teacher-proposed pairings. Our study contributes insights into teachers’ needs, preferences, and boundaries regarding how they want to share the task and control of student pairing with the AI and students, as well as design implications for human-AI co-orchestration tools.
Article
Full-text available
For this review, we synthesized quantitative and qualitative research on collaborative learning to examine the relationship between teacher guidance strategies and the processes and outcomes of collaboration among students (66 studies). The results show that several aspects of teacher guidance are positively related to student collaboration, for example when teachers focus their attention on students’ problem-solving strategies. During student collaboration, opportunities arise for students to engage in collaborative activities that support their learning process. The way teachers take more or less control at these moments determines whether such opportunities can be turned into real moments of learning for the students. This review highlights the important yet challenging role of the teacher during collaborative learning.
Article
Full-text available
This paper is about orchestrating the emergence of conceptual learning in a collaborative setting. We elaborate on the idea of critical moments in group learning: events which may lead to a particular development at the epistemic level regarding the shared object. We conjecture that teachers’ identification of critical moments may help them guide students towards the emergence of conceptual learning. The complexity of small-group settings in classrooms, however, prevents teachers from noticing these critical moments. Here we present an environment, SAGLET (System for Advancing Group Learning in Educational Technologies), based on the VMT (Virtual Math Teams) environment (Stahl 2009), which allows teachers to observe multiple groups engaging in problem-solving in geometry. SAGLET capitalizes on machine learning techniques to inform teachers about online critical moments by sending them alerts, so that they can then decide whether (and how) to use the alerts in guiding their students. One teacher in an elementary school used SAGLET to help multiple groups of students solve difficult problems in geometry. We observed how the teacher mediated two cohorts of multiple groups at two different times in a mathematics classroom. We show that in both cases the teacher could detect the needs of the groups (partly thanks to the alerts) and could provide adaptive guidance for all the groups.
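A minimal sketch of how alerts about critical moments might be forwarded to a teacher monitoring several groups without overwhelming them (illustrative only; the threshold, labels and cooldown rule below are assumptions, not SAGLET’s actual logic):

from datetime import datetime, timedelta

COOLDOWN = timedelta(minutes=5)  # suppress repeat alerts for the same group
last_alert = {}

def maybe_alert(group, label, confidence, now, threshold=0.8):
    """Forward a detected critical moment to the teacher if it is
    confident enough and the group was not alerted on recently."""
    recently = last_alert.get(group)
    if confidence >= threshold and (recently is None or now - recently > COOLDOWN):
        last_alert[group] = now
        print(f"[{now:%H:%M}] group {group}: {label} ({confidence:.0%})")

# Hypothetical classifier output for groups working in parallel:
maybe_alert("A", "idleness", 0.91, datetime(2023, 5, 1, 10, 0))
maybe_alert("A", "idleness", 0.93, datetime(2023, 5, 1, 10, 2))  # suppressed
maybe_alert("B", "solution reached", 0.85, datetime(2023, 5, 1, 10, 3))

The cooldown illustrates the design tension the study raises: alerts must surface genuine needs while leaving the intervention decision with the teacher.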
Book
Full-text available
The Handbook of Learning Analytics is designed to meet the needs of a new and growing field. It aims to balance rigor, quality, open access and breadth of appeal and was devised to be an introduction to the current state of research. The Handbook is a snapshot of the field in 2017 and features a range of prominent authors from the learning analytics and educational data mining research communities. The chapters have been peer reviewed by committed members of these fields and are being published with the endorsement of both the Society for Learning Analytics Research and the International Society for Educational Data Mining. We hope you will find the Handbook of Learning Analytics a useful and informative resource.
Article
Full-text available
Teacher orchestration of the classroom is a demanding task because of the multitude of activities and the rapid pace at which these activities occur. Teachers must constantly be aware of the activities students engage in and the progress students are making in order to be able to make informed decisions. Recent years have seen the development of learning analytics (LA) to support teachers, in the form of interfaces that capture and visualise data about learner activities. LA are hypothesised to deliver actionable knowledge, because the information provided would enable teachers to translate it immediately into some form of concrete intervention. This paper provides a short discussion of the hypothesised affordances of LA for supporting teachers. Subsequently, a case study is presented in which a teacher’s sense-making strategies for LA were investigated within a digital environment that enabled the teacher to monitor and regulate the activities of five collaborating groups of students. Besides automatically logged teacher behaviour, eye tracking was used to examine eye movements, and cued retrospective reporting was used to evaluate the choices and decisions made by the teacher. The case study is used to discuss directions for future research concerning learning analytics to support teachers.
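A minimal sketch of one step in such an eye-tracking analysis, aggregating fixation durations per area of interest (AOI) with pandas (hypothetical log format and AOI names; not the study’s actual pipeline):

import pandas as pd

# Hypothetical fixation log: which dashboard element (AOI) the teacher
# looked at and for how long (in milliseconds).
fixations = pd.DataFrame({
    "aoi": ["group_1_panel", "alert_list", "group_1_panel", "group_3_panel"],
    "duration_ms": [420, 810, 390, 260],
})

# Total dwell time and fixation count per interface element indicate
# where the teacher's attention was concentrated during monitoring.
print(fixations.groupby("aoi")["duration_ms"].agg(["sum", "count"]))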
Chapter
Full-text available
Teacher inquiry is identified as a key global need for driving the continuous improvement of teaching and learning conditions for learners. However, specific barriers (mainly related to teachers’ data literacy competences) can deter teachers from engaging with inquiry to improve their teaching practice. To alleviate these barriers and holistically support teacher inquiry, the concept of Teaching and Learning Analytics (TLA) has been proposed as a complementing synergy between Teaching Analytics and Learning Analytics. Teaching and Learning Analytics aims to provide a framework in which the insights generated by Learning Analytics methods and tools can be meaningfully translated to drive teachers’ inquiry into improving their teaching practice (captured through Teaching Analytics methods and tools). In this context, TLA has been identified as a significant research challenge. Thus, the contribution of this chapter is the first systematic literature review in the emerging research field of Teaching and Learning Analytics. The insights gained from the systematic literature review aim to (a) transparently outline the existing state of the art following a structured analysis methodology, and (b) elicit insights and shortcomings which could inform future work in the Teaching and Learning Analytics research field.
Article
Full-text available
This paper presents a systematic literature review of the state-of-the-art of research on learning dashboards in the fields of Learning Analytics and Educational Data Mining. Research on learning dashboards aims to identify what data is meaningful to different stakeholders and how data can be presented to support sense-making processes. Learning dashboards are becoming popular due to the increased use of educational technologies, such as Learning Management Systems (LMS) and Massive Open Online Courses (MOOCs). The initial search of five main academic databases and GScholar resulted in 346 papers out of which 55 papers were included in the final analysis. Our review distinguishes different kinds of research studies as well as various aspects of learning dashboards and their maturity regarding evaluation. As the research field is still relatively young, most studies are exploratory and proof-of-concept. The review concludes by offering a definition for learning dashboards and by outlining open issues and future lines of work in the area of learning dashboards. There is a need for longitudinal research in authentic settings and studies that systematically compare different dashboard designs.
Article
In this brief squib, I take up the first of the provocations put forward by Wise and Schwarz in their recent article and make an attempt to spark further discussion. Specifically, I argue that instead of attempting to agree on an overarching, unified conceptual framework for CSCL from the top down, and rather than synthesizing findings from CSCL research from the bottom up, we could take a taxonomy of CSCL support dimensions as a starting point and engage in a concerted research effort with the aim of working towards a comprehensive framework of CSCL support. I therefore propose such a taxonomy, which currently comprises 12 dimensions. By referring to some of my own research, I demonstrate how the proposed process of providing evidence-based design principles for CSCL support that cut across and interleave the dimensions of the taxonomy could work.
Conference Paper
Learning in groups allows students to develop academic and social competencies but requires the presence of a human teacher who actively guides the group. In this paper we combine data-mining and visualization tools to support teachers’ understanding of learners’ activities in an inquiry-based learning environment. We use supervised learning to recognize salient states of activity in a group’s work, such as reaching a solution to a problem, exhibiting idleness, or experiencing technical challenges. These “critical” moments are visualized for teachers in real time, allowing them to monitor several groups in parallel and to intervene when necessary to guide a group. We embedded this technology in a new system, called SAGLET, which augments existing collaborative educational software and was evaluated empirically in real classrooms. We show that the recognition capabilities of SAGLET are comparable with those of a human domain expert. Teachers were able to use the system successfully to make intervention decisions for groups when deemed necessary, without being overwhelmed with information. Our results demonstrate how AI can be used to augment existing educational environments to support the “teacher in the group” and to scale up the benefits of group learning to the actual classroom.
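A minimal sketch of supervised recognition of salient activity states (the features, labels and classifier choice below are illustrative assumptions; the abstract does not specify SAGLET’s actual model):

from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per time window of group activity:
# [messages per minute, seconds since last action, problem steps completed]
X = [[4.0,  10, 2], [0.1, 300, 2], [2.5,  20, 5], [0.0, 420, 1], [3.2, 15, 6]]
y = ["on_task", "idle", "on_task", "idle", "solved"]  # expert-labelled states

# Train on expert-labelled windows, then classify live activity windows.
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict([[0.2, 350, 3]]))  # e.g. flags a likely idle group

In a deployed system, predicted states such as "idle" would feed the real-time visualization described above rather than being printed.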