Volume 9(2), 265–281. https://doi.org/10.18608/jla.2022.7643
FoLA2—A Method for Co-creating Learning
Analytics–Supported Learning Design
Marcel Schmitz1, Maren Scheffel2, Roger Bemelmans3, Hendrik Drachsler4
Abstract
Learning activities are at the core of every educational design effort. Designing learning activities is a process
that benefits from reflecting on previous runs of those activities. One way to measure the behaviour and effects of
design choices is to use learning analytics (LA). The challenge, however, lies in the unavailability of an easy-to-use,
LA-supported learning design (LD) method. We established such a method—the Fellowship of Learning Activities
and Analytics (FoLA2)—reinforced by a gameboard and cards, to provide structure and inspiration. The method
enables several participants with different roles to interact with a set of card decks to collaboratively create an
LA-supported LD. Using this method helps to design learning activities in a collaborative, practical way; it also raises
awareness about the benefits of multidisciplinary co-design and connections between LA and LD. FoLA2 can be
used to develop, capture, and systematize design elements and to systematically incorporate LA.
Notes for Practice
• FoLA2 is a method of considering learning analytics (LA) while designing curricula and learning activities for any subject.
• Educational theory, pedagogical background, and the design of learning activities differ in each organization. FoLA2 has two aims: On an individual level, it aims to increase knowledge about and awareness of LA. On
a group level, it facilitates shared terminology and understanding among team members to improve the
co-creation of LA-supported learning activities.
Keywords
Learning design, learning analytics, learning activity, multidisciplinary cooperative method, learning enhancing
technology
Submitted: 01/12/2021 — Accepted: 11/07/2022 — Published: 31/08/2022
1Corresponding author Email: marcel.schmitz@zuyd.nl Address: Zuyd University of Applied Sciences, Nieuw Eyckholt 300, Heerlen,
Netherlands. ORCID ID: https://orcid.org/0000-0002-1816-6178
2Email: maren.scheffel@rub.de Address: Ruhr University Bochum, Universitätsstraße 150, Bochum, Germany. ORCID ID:
https://orcid.org/0000-0003-4395-4819
3Email: roger.bemelmans@zuyd.nl Address: Zuyd University of Applied Sciences, Nieuw Eyckholt 300, Heerlen, Netherlands. ORCID ID:
https://orcid.org/0000-0002-6352-5213
4Email: drachsler@dipf.de Address: Open University of the Netherlands, Valkenburgerweg 177, Heerlen, Netherlands; Goethe-University,
Rostocker Straße 6, Frankfurt, Germany. ORCID ID: https://orcid.org/0000-0001-8407-5314
1. Introduction
Although research in the field of learning analytics (LA) has been growing steadily in the past decade, the actual uptake
by educational institutions and their teaching staff is still very small (Tsai et al., 2018). In order to overcome this adoption
challenge and establish institutional rollout of LA, we have created a method to aid in the practical use of LA-supported learning
design (LD).
LD is a field of research that has received increasing attention since the early 2000s, when the focus was on metadata
standards like IMS LD to establish a common markup language for designing learning activities and courses (Koper, 2005). LD
draws expertise from educational science, the content and context of the subject domain, and technology-enhanced learning
(TEL) (Koper, 2006). Designing learning activities is complex because the people involved (i.e., students and teachers), the knowledge itself, and the transfer of that knowledge in different contexts (varying by region, topic, and type of learning institution) all need to be considered. From an LD perspective, educational designers can choose from a broad range of pedagogic theories and design methodologies, such as the four-component instructional design model—4C/ID (Van Merrienboer & Kirschner, 2007), the
cognitive apprenticeship approach—CAA (Brown et al., 1989), cognitive-flexibility theory—CFT (Spiro & Jheng, 2012),
problem-based learning—PBL (Hmelo-Silver, 2004), community of practice—CoP (Wenger et al., 2002), epistemic frames
(Shaffer, 2006), design-inquiry of learning (Mor & Mogilevsky, 2013), cognitive load theory—CLT (Sweller, 2011), and
personal learning environment—PLE (Fournier et al., 2011). Although “evaluation” is an essential part of these frequently used
ADDIE-like design process models (ADDIE: analysis, design, development, implementation, evaluation; Peterson, 2003), the
use of LA to support LD is uncommon.
The possibilities to record, analyze, and visualize learning and teaching behaviour, and hence to improve learning and teaching, i.e., LA, have been evolving (Greller & Drachsler, 2012) and offer ever more opportunities in the continuous effort to increase the
quality of education. In fully online educational settings, there are numerous examples of teachers, educational designers, and
students using LA (Mangaroska & Giannakos, 2018; Rienties et al., 2017). Several scholars have concluded that although
institutions are aware of the possibilities of analyzing learning data, only a few go beyond counting the number of clicks in an
online learning environment (Colvin et al., 2016; Haythornthwaite, 2017; Tsai & Gasevic, 2017). Joksimovic and colleagues
(2019) combine these conclusions into a call for an incremental, evolutionary system positioned in a broader context of systems thinking (e.g., Dawson et al., 2019; Rubenstein-Montano et al., 2001). For this call, they define three significant points to take
into consideration: (1) let LA underpin institutional strategies and goals, (2) address various stakeholders by planning LA
during the design of a module, and (3) see LA as an evolutionary, iterative process. There have been efforts to use LA as an
instrument in LD processes to evaluate design choices (Persico & Pozzi, 2015), and there are several frameworks available
for connecting LA and LD (Mangaroska & Giannakos, 2018), but almost all of them are conceptual or difficult to operate in
educational practice.
Interest in connecting LA and LD has been growing. Macfadyen and colleagues (2020), e.g., compiled four topics for future
research in this area: (1) experiments that demonstrate which LA-supported LD decisions influence learning experience and
learning behaviours; (2) explorations into how personalization and flexibility of LD might help individual learners and teachers;
(3) educator design practices, mainly practices that engage with LA; and (4) theoretical, conceptual, and empirical solutions to
the challenge of designing the “right” learning activity at the right time for diverse learners. In the special issue on LD and
LA for the Journal of Learning Analytics, Macfadyen and colleagues (2020) also collect several examples of LA-supported
design, among which are a model (Mangaroska et al., 2020) and a framework and method (Law & Liang, 2020). In the last
decade, most examples that address LA-LD connections are situated in online higher education (Nguyen et al., 2020). One
central observation of all that research is that a significant amount of technology-centred LA applies a data science approach by
using data provided by the environment and trying to find meaningful patterns of learning. This mining of process data is also
described as a bottom-up approach (Drachsler & Goldhammer, 2020). We know from other communities, such as the field of
psychometrics, that they apply the concept of evidence-centred design (ECD) to gain meaningful data. ECD is both a design
and an evaluation method for (educational) assessments proposed by Mislevy and colleagues (2003). It begins by identifying
what constructs should be assessed regarding knowledge, skills, or other learner attributes. It can also be seen as a top-down
approach (Mislevy et al., 2003). We want to merge the bottom-up and top-down approaches with the present work and enable
more evidence-based principles in LA-supported LD.
From an LA perspective, promising tools like the learning activity management system (LAMS) (Dalziel, 2003) or
instruments like Inspiration Cards (Vezzoli et al., 2020) and LA-Deck (Alvarez et al., 2020) have become available. They focus
on what stakeholders want to analyze to improve learning processes, how to visualize those analyses, and how to connect LA and LD. In our
previous study (Schmitz et al., 2018b), it became clear that the practical usability of LA would benefit from a better technical
embedding and a better embedding of LA in “design time” of the plan-do-check-act iteration of instructional design. Gasevic
and colleagues (2019) propose to charge all stakeholders with the implementation of LA but observe several awareness and
knowledge gaps that should be addressed. Therefore, the research presented here aims to develop a practical method to help
co-create learning activities while making teachers, students, and designers aware of LA and enabling LA’s embedding into LD
during design time. In order to overcome the low adoption of LA by teachers in the field, we need hands-on methods to mature
the uptake of LA. We, therefore, have two objectives:
O1: Develop an applicable method that supports the co-creation of LA-supported LD by different stakeholders.
O2: Increase staff’s knowledge about and awareness of LA.
Our method will help educators of all levels design learning activities that include LA. The learning behaviour information
gathered by LA solutions already put in place during the development of learning activities can lead to improved execution of
LDs and better-informed learning activities. Our solution thus bridges the gap between LA and LD. It enables teams to use
LA within their known environment and context, including a dedicated educational vision/pedagogy. It offers the opportunity
to implement LA during the development of learning activities. It can serve as a knowledge base within institutions and
the educational field for different pedagogies, learning enhancing technologies (LETs), and LA used in learning activities.
Finally, FoLA2 also supports the transfer of state-of-the-art LA research to practitioners developing learning activities and
teaching at schools. Our method contributes to research and practice in the educational field as a frame to look at both the LA
and the LD worlds.
2. Related Work
Law and Liang (2020) recently provided a promising multilevel framework that combines LA and LD seamlessly by connecting
LD curriculum elements to LA questions and LA solutions. However, the LA components still need additional development, and
the framework's complexity remains a hurdle for practitioners to clear. There are literature reviews that try to provide a
structured overview (Celik & Magoulas, 2016) by reporting on common characteristics that link to the process of LD (based on a
systematic literature review with 17 papers identified): sequence, learning activity flow, collaborative development, assessment,
LA, meaningful context, technology adaptation, resources, tools, roles, environment, objectives, recall of prerequisite skills,
and learner analysis. These characteristics only partly resonate in several other frameworks (e.g., Wise et al., 2016; Kelly et al.,
2015; Fernández-Gallego et al., 2013; Lockyer et al., 2013; Florian-Gaviria et al., 2013; Slade & Prinsloo, 2013; Schmitz
et al., 2017; Laurillard et al., 2013; Mor & Mogilevsky, 2013; Siemens & Long, 2011). The only connection found is that all
frameworks give insight into how to design (a set of) learning activities. This diversity of frameworks is perhaps why researchers
have lately noticed a lack of widespread practice in using LA-supported LD frameworks or instruments. Mangaroska and
Giannakos (2018) present a checklist for future work based on reviewed papers with—among other items—the following
suggestions: (a) provide details about the learning environment and the pedagogical approaches used, where improvements in
LD experiences based on LA outcomes will be measured (Rodríguez-Triana et al., 2015); (b) indicate how LA metrics offer
insight into learning processes and can be theoretically grounded for meaningful interpretation to inform theory and design
(Gašević et al., 2017); and (c) evaluate and denote how educators are planning, designing, implementing, and evaluating LD
decisions (McKenney & Mor, 2015). Furthermore, other researchers present a strong plea for addressing the co-creation of LA,
mainly to solve issues like a mismatch between design and capacity, invalid inferences, and reconceptualization from product to
service (Dollinger & Lodge, 2018).
Educational design tools like CADMOS (Katsamani et al., 2012) or learning activity management systems like LAMS (Dalziel, 2003)
bring excellent environments from either an LD or an LA perspective but do not excel in connecting both. A suite like
AEEA (Baldiris et al., 2010) looks promising, but it is still a prototype. From an LA perspective, these systems appear very sophisticated and require expert users and an already present technical infrastructure. Experiments like the one in Schmitz and colleagues (2018b) show that even if this infrastructure is available as a service, it still does not solve practical problems or exploit the opportunity to think about LA at the start of the LD process rather than in hindsight.
Several instruments support those who develop learning activities in their tasks. Besides paper, scissors, or a whiteboard,
card games can provide some structure or procedural steps. Roy and Warren (2019) reviewed 155 design card games and
elucidated their strengths as a design tool as follows: they (a) facilitate creative combinations of information and ideas, (b)
provide a common basis for understanding and communication in a team, (c) provide tangible external representations of design
elements or information, (d) provide convenient summaries of helpful information or methods, and (e) are semi-structured
tools between blank sticky notes and detailed instruction manuals. Of 155 card games reviewed—ranging from general design
cards (Moller, n.d.) to domain-specific games for domains like music (Eno & Schmidt, 1975) and fur and sustainability (Design
School Kolding, 2017)—only one is partly related to our cause: Tango (Deng et al., 2014). It can be used to design tangible
user interfaces and digital games to enable learning benefits but not learning activities, per se.
Not mentioned in the review is a card game used to support the TPACK model (Koehler & Mishra, 2009). In the US, several
resources and instruments are available for TPACK1, and there are case studies on the use of the TPACK card game at several
education levels (primary, secondary, vocational, higher; Boschman et al., 2015) in the Netherlands. The main idea of TPACK
is that during the development of a learning activity, designers combine content/knowledge, pedagogy, and technology to create
a learning activity. The elements of pedagogy and technology have pre-defined cards representing items regularly used in
education. In a TPACK card deck, there are some examples of content/knowledge cards for inspiration or awareness sessions,
but it is preferable to take the topic at hand for this part of the game. TPACK has been used to improve technology usage in
several educational processes, but it does not have specific LA elements.
Also not mentioned in the design card decks review by Roy and Warren (2019) are card games that have recently surfaced
at workshops or conference talks. The LD Cards (Gruber, 2019) are intended to design blended learning and try to bring LD a
step further while incorporating some or full technological support. Eight learning event cards represent learning activities that can be used individually or combined in different configurations. Every learning event card has two sides: one side has the title, the iconized
event type, teaching verbs, learning verbs, class methods, and online methods for the event. The other side of the cards has
room to fill in the learning objective, the learning task, the learning outcome, social interaction, support, learning material, and
the learning environment. There is no explicit placeholder for LA on either side of the cards. In workshops, however, LA is
1http://tpack.org.
positioned between elements and addressed accordingly.
LA-DECK (Alvarez et al., 2020) focuses on co-designing an LA solution. All stakeholders (including students) meet face to
face to create new LA. LA-DECK uses several suits of cards: Analytics Type, Data Source, Analytics Method, Developer Tools,
Privacy, User Interface, and Wildcards. The game also has Resource Tokens to distribute available time and money during the
design process. In their validation study, the researchers first quantitatively operationalized "participation" and then gave a qualitative account of how cards appear to shape the conversation between participants. Eleven session transcripts (video and audio) were analyzed, and 22 "critical incidents" (Angelides, 2001) were identified. In the evaluation of LA-DECK, there are several analyses: critical incidents derived from transcripts; a card graph that illustrates the play of cards and the connections between cards; and the degree and type of participation, i.e., cards played in each session and discussion time as indicators of participation.
Teams of designers have also created specific LA- and LD-inspired ideation cards while using design thinking methodologies (e.g., Plattner, 2010). In co-design or participatory-design sessions (e.g., Sarmiento et al., 2020; Michos et al., 2020; Schmitz et al., 2018a), specific LA and LD cards are created during the first steps to spark the Ideate and Design steps before Evaluation connects the designed LA back to LD. Great examples of created cards exist within these sessions, but for non-participants, often only the resulting solutions are findable and reusable, while the most interesting discussions arise from the cards themselves.
3. Design Method
This article presents the results of four iterations of developing FoLA2, i.e., a method/instrument to support the co-creation of
LA-supported LD by different stakeholders in the form of a serious game. Our approach to developing this was based on the
Design Science Research Framework by Hevner and Chatterjee (2010), with all four iterations following steps 1 to 4 of this
framework (see Figure 1): requirements, grounding, design build, and evaluate.
[Figure 1 layout: Environment (1. Requirements); Design (3. Design Build, 4. Evaluate); Knowledge Base (2. Grounding).]
Figure 1. The Four Steps of Design Science Research Based on Hevner and Chatterjee (2010)*
*Green squares are steps taken. Blue squares are the place where the input came from or where the output has to go.
Table 1 gives an overview of the four iterations. In iteration 1, a paper prototype called Version 1 was established with a
technology readiness level (TRL) (Mankins, 1995) of 2, i.e., a formulated concept. In iteration 2, a playable proof of concept
(TRL = 3), i.e., Version 2, was created and tested by LA and TEL researchers and practitioners. FoLA2 was then further
improved in iteration 3, i.e., Version 3, where two larger groups of practitioners and TEL experts tested its usability (Brooke,
2013). In iteration 4, we set up a simulation that involved higher education teachers and students who used FoLA2 to create a
learning activity. In this way, we validated Version 4 as an applicable method for LA-supported LD (Objective 1) in a laboratory
environment and were able to achieve TRL = 4. We validated Version 4 as a tool that increases staff knowledge and awareness
of LA (Objective 2) in a separate study. Table 2 shows the participants of the design session of iteration 1. Table 3 shows an
overview of participants in the evaluation sessions of iterations 2 to 4.
4. FoLA2
FoLA2 is used to co-create an LD of a learning activity, including technology enhancements and measurement indicators for
LA. In several steps (Table 4), participants discuss the learning activity, TEL, LA, and its implications.
The result of using the method is a learning activity design (Figure 2). The design consists of interactions (blue cards) between students, teachers, and learning material, combined with a specific learning activity type (purple cards). Since educational designers use very different starting points or theories to base their learning design on, there are very different sets of learning activity types. Every purple card has a name for the set it is part of (A.1) (e.g., Cognitive Apprenticeship, TPACK, Zuyd University's
Educational Vision). Purple cards also have a learning activity type (A.2) (examples for the Cognitive Apprenticeship set
are debate, reflect, create, and explore), a description (A.3), and a code (A.4). Each interaction card has an actor (teacher,
student, learning material, or learning environment) that initiates the interaction with another actor (B.2). Every interaction card
has a white square for a detailed description of that interaction (B.3). Some interactions could use technology enhancement,
Table 1. Overview of the Four Method Design Iterations
| | Iteration 1 | Iteration 2 | Iteration 3 | Iteration 4 |
| --- | --- | --- | --- | --- |
| Method version | paper prototype | playable version | toward usability | toward feasibility |
| TRL | 2 | 3 | 4 | 4 |
| Requirements | based on previous study by Schmitz et al. (2018a) | based on evaluation 1 | based on evaluation 2 | based on evaluation 3 |
| Grounding | literature about LD, LA, gamification (all iterations) | | | |
| Design build | design thinking session | adaptations by researching team (iterations 2–4) | | |
| Evaluation | simulated play-test | play-test with TEL experts/researchers (2 groups) | usability with TEL experts (5 groups) and IT advisors + managers (3 groups) | feasibility with teachers (2 groups) and students (2 groups); usability with teachers (1 group) |
| Period | May 2019–October 2019 | October 2019 | October 2019–November 2019 | November 2019–February 2020 |
| Version | 1 | 2 | 3 | 4 |
TRL = Technology Readiness Level according to Mankins (1995): 2 = concept formulated,
3 = proof-of-concept, 4 = validation in laboratory environment, 5 = validation in relevant environment
Table 2. Demographics of Participants in Design/Build Iteration 1
| Age | Gender | Exp. in edu. | Years since last degree | Current role at HEI* | Hours/month board games | Hours/month digital games | Has designed a game | Exp. game design (1–5) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 25 | female | 0 | 0.5 | junior researcher | 1 to 4 | 41 | No | 1 |
| 21 | male | 0 | 0 | student | 0 | 30 to 35 | No | 1 |
| 54 | male | 28 | 4 | teacher, TEL advisor | 1 to 4 | 25 | No | 2 |
| 27 | male | 0.5 | 0 | tutor, junior researcher | 1 to 4 | 20 to 30 | Yes | 3 |
| 47 | male | 2 | 22 | IT consultant, teacher | 4 and more | 4 to 8 | Yes | 3 |
| 38 | male | 16 | 5 | teacher, researcher | 1 to 4 | 4 to 8 | Yes | 3 |
| 41 | male | 16 | 18 | teacher, researcher | 1 to 4 | 1 to 4 | Yes | 3 |
*Higher education institution.
illustrated by connected red cards on top of the blue cards (B.1). Every red card has a title (C.4), a description of how the technology can enhance learning (C.2), a code (C.1), and a green part giving information on what LA options this specific learning enhancement brings (C.3). Some interactions are relevant for measuring the learning behaviour of students and teachers, which
connected yellow cards can illustrate at the bottom of the blue cards (B.4). Yellow cards have a title (D.1), a description of
an indicator of what we want to know (D.4), a code (D.2), and a green part describing how and what can be measured to get
information for this indicator (D.3).
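To make the relations between the card types easier to follow, the sketch below models the design elements just described as plain data structures. It is an illustrative Python sketch only, assuming a minimal data model; the class and field names are ours and are not part of the FoLA2 materials.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningActivityTypeCard:   # purple card (A.1-A.4)
    card_set: str                 # e.g., "Cognitive Apprenticeship", "TPACK"
    activity_type: str            # e.g., "debate", "case study"
    description: str
    code: str                     # code linking to further information

@dataclass
class LETCard:                    # red card (C.1-C.4)
    title: str
    enhancement: str              # how the technology can enhance learning
    la_options: str               # green part: data sources/tools the LET offers
    code: str

@dataclass
class WhatDoWeWantToKnowCard:     # yellow card (D.1-D.4)
    title: str
    indicator: str                # what we want to know about the interaction
    how_to_measure: str           # green part: how and what can be measured
    code: str

@dataclass
class InteractionCard:            # blue card (B.1-B.4)
    initiator: str                # "learner", "teacher", "material", "environment"
    receiver: str
    description: str              # white square: what the interaction is
    lets: List[LETCard] = field(default_factory=list)                  # at most two per interaction
    indicators: List[WhatDoWeWantToKnowCard] = field(default_factory=list)

@dataclass
class LearningActivityDesign:     # the board: title, intention, pedagogy, interactions
    title: str
    intention: str
    activity_type: LearningActivityTypeCard
    interactions: List[InteractionCard] = field(default_factory=list)  # the board offers 10 slots
```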
4.1 Design Elements
In a previous study (Schmitz et al., 2018a), focus groups and a questionnaire were used to gather requirements for an LA-
supported LD tool that served as input for iteration 1 (step 1 of Figure 1). Based on those requirements, literature was reviewed to ground the design choices. Every iteration resulted in a new version, based on grounded choices, that was evaluated in several
sessions (see Table 3). Each evaluation then led to a new selection of requirements and design elements as presented in Table 5.
The most relevant ones are as follows.
Game-Play Circle A process or set of rules that provides structure was needed to be able to play. The core of our envisioned method was the design of learning activities, so we looked at educational design processes like ADDIE (Peterson, 2003). Furthermore, we wanted to embed technology, both LET and LA, in the LD. In our search for educational design processes with technology in mind, we came across a process for migrating to blended learning called the Design Cycle for Education (DC4E)
(Scheffel et al., 2021). It is an eight-step process to transform learning activities into a blended form or create blended learning
Table 3. Participants in Iterations 2, 3, and 4
Iteration 2, two parallel sessions at an international research conference workshop:
• 14 people
• 2 male aged 20–30, 2 male aged 40–50, 2 female aged 20–30, 6 female aged 30–40, 2 female aged 50–60
• 2 TEL advisors, 1 head of educational services, 1 representative of a publisher, 2 bachelor's students, and 8 researchers in TEL
Iteration 3, five parallel sessions during a conference (evaluation: usability):
• 40+ people
• 8 people evaluated (4 male, 4 female): 1 manager, 1 student, 1 teacher, 3 TEL experts, and 2 educational advisors
Iteration 3, three parallel sessions at a vocational education meeting (evaluation: usability):
• 24 people
• 6 people evaluated (5 male, 1 female): 3 TEL experts, 2 project managers, 1 information manager, and 1 information architect
Iteration 4, one session with 4 groups (evaluation: feasibility):
• 19 people
• Group 1 had 4 teachers: 3 female (age: 36, 59, 64) and 1 male (age: 50).
• Group 2 had 6 teachers: 5 male (age: 39, 41, 43, 57, 61) and 1 female (age: 37).
• Group 3 had 4 first-year students: all male (18–21).
• Group 4 had 5 first-year students: 3 male (19–20) and 2 female (both 19).
Iteration 4, one session with 2 groups (evaluation: usability):
• 9 people
• Group 1 had 6 TEL researchers: 1 female and 5 male (age range 20–45).
• Group 2 had 3 teachers: 2 female and 1 male (age range 30–45).
activities. Based on this, we created a game-play process consisting of the cycle’s eight steps adapted to the needs of our game
(see Figure 3 and Table 4).
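As a rough illustration of the adapted cycle (see Table 4 for the authoritative step descriptions), the steps can be written down as an ordered sequence that a facilitator walks through; the constant and function names below are ours and are not part of the published game materials.

```python
# Illustrative only: the adapted game-play cycle as an ordered sequence (cf. Table 4).
GAME_PLAY_STEPS = (
    (0, "Hand out role cards"),  # preparatory step before the cycle starts
    (1, "Goal"),                 # choose the learning activity, its intention, and the population
    (2, "Challenge"),            # optional organizational challenge or boundary conditions
    (3, "Inspiration"),          # share best, good, and bad practices
    (4, "Analysis"),             # select pedagogy/educational vision and learning activity types
    (5, "Development"),          # sequence interactions and attach LETs
    (6, "Prototype"),            # attach "what do we want to know" cards and data elements
    (7, "Simulation"),           # check the choices against the population cards
    (8, "Adaptation"),           # adapt cards if the simulation reveals problems
)

def walk_through_cycle() -> None:
    """Print the steps in order; in practice a group may revisit earlier steps."""
    for number, title in GAME_PLAY_STEPS:
        print(f"Step {number}: {title}")
```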
Interaction Cards and the Systematization of LD Elements The interaction cards are blue and describe the central elements
in the learning activity. Based on input from related work (Koper & Manderveld, 2004; Koper & Olivier, 2004; Beetham,
2007), several interaction types could be distilled: student to student, student to teacher, teacher to student, student to material,
material to student, student to environment, and environment to student. In addition, an interaction card was created with blank
actors to illustrate any interaction between actors. The interaction cards have a white rectangle where the players can fill in
the interaction or the content and the intention of the interaction. The interaction cards were adapted to have all arrows point
from left to right for all interaction types so that the first actor is always on the left. Other icons were used for the learning
environment to make it more recognizable. Furthermore, the term student became learner for more general use at different
educational levels.
A board is used to systematize design elements (Figure 3). The idea of storytelling by placing interactions one after another does not prescribe the number of interactions needed. The duration of every interaction can differ. A simple set of
interactions going from left to right is the basis. Players should always have the flexibility to make interaction chains. The board
facilitates this with 10 spaces for interactions (I). The game-play cycle is positioned on the top left (A). The board has fields to
enter the title (B) of the learning activity and the intention of that learning activity (C). A dark blue placeholder was added to
the left side of the board to place one or more challenge cards (D) that indicate specific limitations or special assignments.
Role Cards Several roles are involved in designing learning activities, and, therefore, we have several role cards in our game.
Several roles and functions within educational institutions are the basis for the roles: student, teacher, instructional designer,
study coach, educational advisor, TEL advisor, LA advisor, and manager. Each role is tasked with asking questions at specific
phases. For example, the student role card says, “The student wants to have a relevant interaction and learning activity” and
nudges the student player to ask questions like, “Is this interaction fun? Do I get that this step has to be done? Do I want to give
the data (if asked)? Does the technology help me? Does this interaction spark my performance?” To give assessment a more
Table 4. The Steps within the Game
Step 0 (Hand out role cards): Before the game starts, every player gets one or two role cards, e.g., student, teacher, course designer, management. These cards include a role description and some questions to be asked during the game.
Step 1 (Goal): First, the players decide which learning activity to design, and its intention needs to be declared. Then a population is selected. One or more student and teacher cards are selected or randomly chosen to represent the learning activity's target group. Every student and teacher card has a set of characteristics (readiness to give data on learning behaviour, technology adaptiveness, motivation).
Step 2 (Challenge): The manager can add an organizational challenge, or the game leader can add a challenge to get specific input from the players or to state boundaries and challenges that are present in the environment where the learning has to take place.
Step 3 (Inspiration): During this step, best, good, and bad practices are shared to learn from past decisions.
Step 4 (Analysis): In the analysis phase, players select or randomly choose the pedagogy or educational vision. Every pedagogy or educational vision has a set of learning activity type cards. Examples are lecture, discussion group, workshop, case study, and self-study.
Step 5 (Development): This step has two actions. First, a sequence of interactions between students and teachers, students and other students, students and learning material, or students and the environment is selected. For every interaction, the players add short text snippets on what the interaction is (and optionally the intention of this specific interaction). There do not have to be 10 interactions, and players may use interaction types more than once. Second, participants check for every interaction whether LETs are available. There is room for two LETs per interaction. Each technology has a green LA element (see C.3, Figure 2).
Step 6 (Prototype): Players can select one or more yellow "what do we want to know" cards per interaction. These are indicators or items where different stakeholders state what they want to know from the interaction. Every yellow card has a green LA part describing the tooling to measure (see D.2, Figure 2). If applicable, a set of yellow and red cards is available. Each of them has a square to place a token with a number. Every number stands for a data element that is measured. Players decide what and how many data elements to use as input for evidence.
Step 7 (Simulation): In this step, we look again at the population cards and decide whether the choices made in the interactions, the LETs, and the "what do we want to know" cards fit the population or whether the population will react by not agreeing to give away the data elements.
Step 8 (Adaptation): If needed, players choose cards that can be adapted based on the simulation.
substantial place (Sluijsmans et al., 1998) in the game, an extra role card was added: the assessment advisor. Role cards have
information on both sides: the card shows the role title and explanation on the front, while the back contains the questions to be
asked during the game.
Population Cards A clear view of the target group for the learning activity is relevant; the student card in the population set sparks discussion of this topic. The card illustrates the willingness to share data on the learning process and the technology
adoption of students at the levels low, medium, and high. There are also two blank characteristics to fill in that may be relevant
to consider in the design of learning activities. A teacher population card was added to the population set (see Figure 5).
Pedagogy/Educational Vision Cards The pedagogy cards are purple and represent sets of learning activity types grouped
by a pedagogy, an educational vision, or a design tool’s outcomes. As examples, we added three sets of different, often used
pedagogies or educational visions or methods to create learning activities: TPACK (Koehler & Mishra, 2009), 8 Learning
Events (Verpoorten et al., 2007), and Cognitive Apprenticeship (Brown et al., 1989). Every purple card has a specific code to
give more details on the methods or link to other methods, e.g., the LD Cards by Gruber (2019). For example, the TPACK
cards have the following learning activity types: instruction in class, excursion, exploring, case study, discussion, simulation,
practice, collaborative learning, problem solving, and self-study. There is also one purple wildcard where players can fill in a
pedagogy/educational vision/design tool title and a learning activity type to use in the game. A set of cards about educational
visions and pedagogies commonly used at the faculty of ICT at Zuyd University of Applied Sciences was added in the last
version. They contain the following learning activity types: lecture, discussion group, workshop, case study, and self-study.
LET Cards The LET cards provide connections between learning, TEL, and opportunities to use LA and have three elements:
a label with the title of the technology, a description of the opportunities for enhancing education with this technology, and a
green part for the LA. Using LET offers the opportunity to collect data. The green part thus contains a description of examples
of available data sources or tools. A white square in the green part serves as a placeholder to indicate the number of data sources chosen from this technology. Every red card has a code so that further information on the learning enhancement and the LA
Figure 2. Example of Learning Activity Design with Cards and Board
can be given and be grounded in literature or practical examples. The technologies added are the same examples used in the
local version of the TPACK (Koehler & Mishra, 2009) card game (i.e., Blogs/Vlogs, Smart Screen, Interaction Booster, Online
Collaborative environment, Video). There are also red wildcards that can be filled manually with a LET, its description, and the
potential LA options.
“What Do We Want to Know” Cards “What do we want to know” cards are yellow and represent something we want to know
based on students’ and teachers’ behaviour or based on a specific design choice. They can contain a single indicator or several
indicators. The input for an LA indicator comes from one or several data sources. Every yellow card has a green LA area with
(several) option(s) to measure data. There also is a white square that, if filled, indicates the number of selected data sources.
Every card has a code to identify and connect it to the companion website with more information and theoretical grounding by
connecting it to relevant literature or examples. Examples like engagement (the amount of time, attention, and effort a student
puts into the interaction), social interaction (the amount and quality of interactions they experience with each other), having fun
(whether students enjoy the learning activity), use of resources (how much, how long, and when resources are used), asked
for help (how much and when), and activity (how active students are) are based on outcomes of general LA literature reviews
(Leitner et al., 2017; Mangaroska & Giannakos, 2018). Examples like initiative, presence, connectedness, responsiveness,
productivity, and discussion are based on research about online collaboration (Scheffel et al., 2016).
Connection to Interaction Cards To embody the connection between interaction and LET cards and interaction and “what
do we want to know” cards (step 3), we changed several elements. Every (blue) interaction card got a red top bar labelled
“Learning Enhancing Technology” and a yellow bottom bar labelled “What do we want to know.” A maximum of 10 interaction
cards can be part of this board setup (Figure 3). There are two rows of red card placeholders above the blue row (G and H). The
red cards have their title label on the bottom, so the label (the technology title) can be put on the red bar on the interaction card.
The yellow cards have their title label on the top; they connect to the yellow bar on the interaction card’s bottom. There are two
rows of yellow card placeholders below the blue row (J and K).
Challenge Cards Several challenge cards were added, e.g.: Make it fully digital, Make it blended, Design the learning
activity for half the cost. A particular challenge card relates to the “what do we want to know” cards; e.g., a yellow card about
engagement as depicted in Figure 2 can be set as a challenge for one interaction or the whole learning activity. Setting such a
challenge at the beginning of the game could influence design choices later on. A “fill in your own challenge” card completes
the dark blue set. Figure 4 shows all three types of challenge cards.
Table 5. Requirements and Design Elements per Iteration
Iteration 1 requirements:
• Facilitate creative combinations of information and ideas.
• Provide common terminology for the team around LA and LD.
• Provide tangible external representations of design elements.
• Provide a summary of useful methods for LA and LD.
• Provide details about the learning environment, the used pedagogical approaches, and the LD of a learning activity.
• Indicate how and where LA indicators can offer insights into learning processes.
• Provide a set of LA indicators from LA's state of the art.
• Evaluate and denote how educators are planning, designing, implementing, and evaluating LD decisions.
Iteration 1 design elements:
• Interaction cards and the systematization of LD elements
• Role cards
• Population cards
• Pedagogy/educational vision cards
• LET cards
• "What do we want to know" cards
Iteration 2 requirements:
• A game-play process
• Connect to interactions.
• Connect the "what do we want to know" cards to the game.
• Examples of LET cards
• Examples of "what do we want to know" cards
• Monitor participation of LA.
Iteration 2 design elements:
• Game-play circle
• Connection to interaction cards
• Extra LET cards
• Extra "what do we want to know" cards
• Tokens and field per interaction
Iteration 3 requirements:
• Merge game process board and placeholder board into one.
• Add intention field.
• Add boundaries/specific tasks.
• Include teacher as population.
• Keep game within a time-span of 1.5 hours.
• Create context-based pedagogy cards.
Iteration 3 design elements:
• Board changes
• Challenge cards
• Population cards
• Deletion of student tokens and placeholders
• Pedagogy cards
Iteration 4 requirements:
• Add element to emphasize assessment more.
• Clarify roles.
• Connect actors on interaction cards always from left to right.
Iteration 4 design elements:
• Extra role cards
• Redesign of interaction cards
Deleted Element: Student Tokens and Placeholders In the first iteration, there was a game element where tokens as
representations of students could be put on the interactions to identify the student population’s likelihood of opting out of any
data collection specific to that interaction. This game element was taken out of the game and thus off the game board. During
the evaluation, it turned out to be an unused game element due to its position at the very end of the game-play cycle. It was
often skipped or cut short because of the excessive time needed to play the game in that version. The allowed time for serious
games to be considered usable is limited (Bellotti et al., 2010). Workable time expenditure is usually set from one hour to one
and a half hours, not counting the time needed to introduce the game. For the game to be a feasible method for designing
learning activities, it should at least compare to or be better than the current way of designing learning activities (i.e., without
the game) when it comes to timing.
4.2 Evaluation
In iteration 3 we conducted two usability evaluations based on Brooke (2013) (see Table 6). In addition, participants were asked
on what day they participated, the type of institution, and their role at the institution. Table 3 gives details about participants,
who are all (potential) users of FoLA2. The first evaluation was conducted at a conference on IT and education (EC-TEL Delft
2019) with around 50 participants. A central moderator and an assisting moderator were present. The eight collected usability
scores range from a minimum of 20 to a maximum of 92.5, with an average of 57.5. The participants with scores of 20 and 55
answered the open questions with “the room was too crowded and loud.” The person who scored 30 liked the game elements
but thought, "this tool is too complicated for a development team." In contrast, the person who scored 92.5 said, "I
would definitely try to use this game ‘live’ in our educational team” (see Table 7).
Figure 3. Board for Structure and Game-Play. A: Game Cycle, B: Title of Learning Activity, C: Intention of Learning Activity
Design, D: Placeholder Challenge Card, E: Placeholder Pedagogy Card, F: Placeholder Population Card, G&H: Two Rows to
Place TEL cards, I: Row of Placeholders for Interaction Cards, J&K: Two Rows to Place “What Do We Want to Know” Cards
Figure 4. Example Challenge Card, “Fill in Yourself” Challenge Card, “What Do You Want to Know” Challenge Card
In iteration 4, a feasibility study investigated how educational institutions can use the game to identify elements that students and teachers want to see on an LA dashboard; that dashboard was then built as a prototype. The first group of teachers had to
design a learning activity “case study,” and the second group of teachers had to design a learning activity “workshop.” Looking
specifically at the usage of LA elements, i.e., the "what do we want to know" cards, the case study group used Belbin team role scores (Belbin, 2011) per student, number of questions asked per student (three times during the activity), initiative, and
attitude. The workshop group used number of questions asked, number of conversations between students that sit next to each
other, interactions of students in general, and what students are doing during the workshop. Both groups of students were
asked to design the learning activity “workshop.” The first group of designing students used the LA elements engagement
(twice), having fun, activity, time expenditure during the assignment, social interaction, asked for help, and use of resources.
The second group of designing students used engagement, initiative (twice), social interaction, and asked for help. A developer
used the input from all four sessions to develop a working but straightforward system that inputs the mentioned data elements
Figure 5. Population Cards: Student Card with Characteristics (left); Teacher Card with Characteristics (right)
and connects them to a student dashboard and a teacher dashboard.
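The prototype itself is not detailed in this article. As a hypothetical sketch only (the indicator names are taken from the sessions above, while the data elements, dictionary layout, and function name are ours), the mapping from selected "what do we want to know" indicators to the data shown on a student or teacher dashboard could look roughly like this:

```python
# Hypothetical sketch, not the actual prototype: indicators chosen in the sessions
# mapped to data elements that could feed a student or a teacher dashboard.
INDICATORS = {
    "engagement":         {"data": ["time on task", "attention", "effort"], "audience": {"student", "teacher"}},
    "initiative":         {"data": ["questions asked per student"],         "audience": {"student", "teacher"}},
    "social interaction": {"data": ["conversations between neighbours"],    "audience": {"teacher"}},
    "asked for help":     {"data": ["help requests and their timing"],      "audience": {"teacher"}},
}

def dashboard_for(audience: str) -> dict:
    """Collect the data elements shown on the dashboard of one audience."""
    return {name: spec["data"] for name, spec in INDICATORS.items() if audience in spec["audience"]}

print(dashboard_for("teacher"))  # all four indicators
print(dashboard_for("student"))  # engagement and initiative only
```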
The usability evaluation of iteration 4 took place at Goethe-University, where teachers and researchers used the method
to develop two learning activities for a collaborative course. In the post-game evaluation, the project manager reflected positively on this way of getting people with different viewpoints to work together toward shared goals at Goethe-University, as the two groups "came closer to understanding their different efforts within the combined project" and "for the first time in the combined project talked about concrete, practical work." The usability scores range from 22.5 to 90, with an average of 64.2
(see Table 7 for detailed results).
Table 6. System Usability Scale (SUS) Questions According to Brooke (2013)
Nr. Question
1. I think that I would like to play this game frequently.
2. I found the game unnecessarily complex.
3. I thought the game was easy to play.
4. I think that I would need support to be able to play this game.
5. I found the various game elements in this game were well integrated.
6. I thought there was too much inconsistency in this game.
7. I would imagine that most people would learn to play this game very quickly.
8. I found the game very awkward to play.
9. I felt very confident playing the game.
10. I needed to learn a lot of things before I could get going with this game.
Table 7. Usability Scores
Iteration Event Individual usability scores Average
3 3a. Surf 20, 30, 55, 57.5, 62.5, 65, 77.5, 92.5 57.5
3b. saMBO 45, 60, 62.5, 65, 72.5, 95 66.6
4 Goethe 22.5, 55, 62.5, 67.5, 72.5, 80, 90 64.2
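For readers unfamiliar with how the scores in Table 7 are computed, the following is a minimal sketch of the standard SUS scoring procedure (Brooke, 2013), assuming each of the 10 statements in Table 6 is answered on a 1-to-5 agreement scale; the function name and example responses are ours.

```python
def sus_score(responses):
    """Compute a 0-100 SUS score from ten 1-5 responses (Brooke, 2013).

    Odd-numbered items are positively worded (contribution = response - 1),
    even-numbered items negatively worded (contribution = 5 - response);
    the sum of the contributions is scaled by 2.5.
    """
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so an even index is an odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a fairly positive respondent reaches a score of 80.0.
print(sus_score([4, 2, 4, 2, 4, 2, 5, 1, 4, 2]))
```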
5. Discussion
When we started developing FoLA2, we had two objectives: we wanted to develop a method that supports the co-creation of
LA-supported LD by different stakeholders (Objective 1), and we wanted to increase the staff’s knowledge about and awareness
of LA beyond formal training (Objective 2). Looking at the final version of our method, we can say that it allows participants to
systematize the design of learning activities collaboratively and that the design choices are intended to raise awareness of using LA in a holistic approach to LD by using LA elements based on recent research or practice.
While the final version allows players to choose from a range of pre-filled pedagogy cards, LET cards, and “what do we
want to know” cards, there are also blank cards that can be filled in as needed during the design process. In this way, our
method enables the collection of knowledge on pedagogy/educational visions used in an institution. Institutions can thus use it
to collect already-used or advanced LETs and related LA technologies. Playing the game allows institutions to understand what
questions their teachers, students, study coaches, instructional designers, TEL advisors, educational advisors, LA advisors, or
managers ask themselves when it comes to planning, designing, and executing learning activities. This kind of input is helpful
on different levels, e.g., faculty, institutional, regional, national, or even international.
The usability evaluations of the different iterations provide us with several insights since they show a broad range of results.
Scores ranged from 20 to 95; the average scores in the three evaluations were 57.5, 66.6, and 64.2. The distribution of the
ratings is similar. Taking into account these numbers alone, the usability is “high marginally acceptable” (according to Bangor
et al., 2008). Looking at the participants’ roles or backgrounds, we see no clear pattern that explains the positive or negative
outliers. However, the open comments given during the evaluation provide an explanation for the different SUS scores. More
effort is needed to make FoLA2 easier to use and appreciate as a method in practice. One option could be to create
versions that are quickly adaptable to the context they are used in, i.e., offering more differentiation in cards to fit an educational
institution’s needs. One aspect that might have negatively influenced the participants’ overall ratings was the setting of the
evaluation sessions; e.g., in iteration 2, one moderator had to supervise several groups of players, which was not ideal, and the
noise level in the room might have been distracting. Also, while the evaluations in iteration 3 and iteration 4 already closely
resembled “real” usage of the game as a method to design learning activities, more evaluations in actual development sessions
are needed.
The learning activities created in the evaluation sessions included LA elements (e.g., “what do we want to know” cards
were chosen and added to the interactions of the learning activities) and thereby reflected some awareness of how and where to
position and use LA in an LD. One indicator to measure the awareness and usage of LA is the amount of implemented LA in
LD, i.e., the number of “what do we want to know” cards. We are, therefore, currently looking into more detailed evaluations as
part of our future work by analyzing actual learning activity design sessions at Zuyd University of Applied Sciences.
Having multiple roles within the game helps players view aspects of the design from different perspectives and can thus stimulate them to act beyond their role in the organization. Although we did not evaluate the effect of
taking multiple roles within the feasibility evaluation, within iteration 4, we could demonstrate how groups of students and
teachers with different perspectives can complement each other by taking multiple role perspectives. Furthermore, we had a
similar demonstration during the usability evaluation of iteration 4. A group of researchers and teachers was combined, having
different aims due to their specific role cards. The researchers aimed to model data points to illustrate learning behaviour, and
the teachers aimed to design an optimal learning process to enable students to achieve their learning goals. These different
perspectives between researchers and teachers only emerged while using FoLA2. Future work can look further into this and
investigate the aspect of multidisciplinarity, e.g., by having people play a specific scenario to address the multidisciplinary
approach of LD.
Several limitations need attention. The LET and “what do we want to know” cards are based on the available literature
and our instructional design experiences at the time of design. For the game to be an inspirational, up-to-date, and informed
resource, users will need to be provided with relevant and new literature regularly. Another limitation relates to the difficulty of
usability testing due to the diversity of elements and the vast number of different ways to use our method. The choice of cards
and discussion around them can strongly influence usability evaluations.
6. Conclusion
There are several instruments using cards in a co-creation setting to design LA. However, the available examples either operate at the curriculum level or focus on an LA dashboard and analytic technology. Therefore, we developed a co-creation method that
enables participants to design LA-supported learning activities and orchestrate LA to monitor and visualize design choices. We
presented a practically usable tool, i.e., an applicable method, in the form of a game that enables people to include LA from the
start of the LD as an evolutionary element and provides a common language for the LA and LD communities of researchers and
practitioners.
FoLA2 bridges the gap between course development and LA dashboard development. An ideal process for LA-supported
LD would be to first use the LD cards by Gruber (2019) for the course-level design and then use our method for learning activity
design that includes the planning of LA, and eventually use LA-Deck by Alvarez and colleagues (2020) or more general design
methods in the LA context (Michos et al., 2020).
In this article, we present a method that uses the steps of the Design Cycle for Education (Scheffel et al., 2021), intended to
support the transition to remote, online, and hybrid learning. FoLA2 positions LA as a support for the design of learning activities
in the practice of students and teachers, involving them in co-creating education. Empowering students and teachers to use LA
enables them to co-create learning activities informed by evidence. We provide them with a method to create education more equitably (Means & Neisler, 2020). Furthermore, having included elements such as a "challenge" scenario or student population
“characteristics” (see Figure 5) in our method addresses equality in remote, online, and hybrid education for training teaching
staff and educational scholars.
Availability of Supporting Material
FoLA2 is available as a physical purchase (www.fola2.com) and as a free printable version under an Attribution Non-Commercial Share Alike license (Schmitz et al., 2020). FoLA2 is actively used in the Netherlands and Germany in educational design, TEL, and
LA development.
Declaration of Conflicting Interest
The authors declare no potential conflicts of interest concerning this article’s research, authorship, and publication.
Funding
The authors declare no financial support for the research, authorship, and/or publication of this article.
References
Alvarez, C. P., Martinez-Maldonado, R., & Buckingham Shum, S. (2020). LA-DECK: A card-based learning analytics co-design
tool. Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27
March 2020, online (pp. 63–72). ACM. https://doi.org/10.1145/3375462.3375476
Angelides, P. (2001). The development of an efficient technique for collecting and analyzing qualitative data: The analysis of
critical incidents. International Journal of Qualitative Studies in Education,14(3), 429–442. https://doi.org/10.1080/
09518390110029058
Baldiris, S. M., Fabregat, R., & De la Hoz Manotas, A. (2010). A set of software tools to build an author assessment package
on Moodle: Implementing the AEEA proposal. 2010 10th IEEE International Conference on Advanced Learning
Technologies (ICALT 2010), 5–7 July 2010, Sousse, Tunisia (pp. 67–69). IEEE. https://doi.org/10.1109/ICALT.2010.26
Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of
Human–Computer Interaction,24(6), 574–594. https://doi.org/10.1080/10447310802205776
Beetham, H. (2007). An approach to learning activity design. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for
a digital age: Designing and delivering e-learning (pp. 26–40). Taylor & Francis. https://www.taylorfrancis.com/
chapters/edit/10.4324/9780203961681-12/approach-learning-activity-design-helen-beetham
Belbin, R. M. (2011). Management teams: Why they succeed or fail. Emerald Group Publishing Limited. https://doi.org/10.4324/9780080963594
Bellotti, F., Berta, R., & De Gloria, A. (2010). Designing effective serious games: Opportunities and challenges for research.
International Journal of Emerging Technologies in Learning (iJET),5(SI3), 22–35. https://doi.org/10.3991/ijet.v5s3.
1500
Boschman, F., McKenney, S., & Voogt, J. (2015). Exploring teachers’ use of TPACK in design talk: The collaborative design of
technology-rich early literacy activities. Computers & Education,82, 250–262. https://doi.org/10.1016/j.compedu.
2014.11.010
Brooke, J. (2013). SUS: A retrospective. Journal of Usability Studies,8(2), 29–40. https://uxpajournal.org/sus-a-retrospective/
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher,18(1),
32–42. https://doi.org/10.3102/0013189X018001032
Celik, D., & Magoulas, G. D. (2016). A review, timeline, and categorization of learning design tools. In D. K. Chiu, I. Marenzi, U.
Nanni, M. Spaniol, & M. Temperini (Eds.), Proceedings of the 15th International Conference on Web-Based Learning
(ICWL 2016), 26–29 October 2016, Rome, Italy (pp. 3–13). Springer. https://doi.org/10.1007/978-3-319-47440-3_1
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., Alexander, S., Lockyer, L., &
Kennedy, G. (2016). Student Retention and Learning Analytics: A Snapshot of Australian Practices and a Framework
for Advancement (tech. rep.). Australian Government Office for Learning and Teaching. https://www.research.ed.ac.uk/en/publications/student-retention-and-learning-analytics-a-snapshot-of-australian
Dalziel, J. (2003). Implementing learning design: The learning activity management system (LAMS) [Workshop presented at
ASCILITE 2003, 7–10 December 2003, Adelaide, Australia].
Dawson, S., Joksimovic, S., Poquet, O., & Siemens, G. (2019). Increasing the impact of learning analytics. Proceedings of the
Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, AZ,
USA (pp. 446–455). ACM. https://doi.org/10.1145/3303772.3303784
Deng, Y., Antle, A. N., & Neustaedter, C. (2014). Tango cards: A card-based design tool for informing the design of tangible
learning games. Proceedings of the 2014 Conference on Designing Interactive Systems (DIS 2014), 21–25 June 2014,
Vancouver, BC, Canada (pp. 695–704). https://doi.org/10.1145/2598510.2598601
Design School Kolding. (2017). DSKD method cards. https://www.designskolenkolding.dk/en/publications/sustainable-fur-cards
Dollinger, M., & Lodge, J. M. (2018). Co-creation strategies for learning analytics. Proceedings of the Eighth International
Conference on Learning Analytics and Knowledge (LAK 2018), 5–9 March 2018, Sydney, Australia (pp. 97–101).
ACM. https://doi.org/10.1145/3170358.3170372
Drachsler, H., & Goldhammer, F. (2020). Learning analytics and eAssessment—Towards computational psychometrics by
combining psychometrics with learning analytics. In D. Burgos (Ed.), Radical solutions and learning analytics
(pp. 67–80). Springer. https://doi.org/10.1007/978-981-15-4526-9_5
Eno, B., & Schmidt, P. (1975). Oblique strategies. Opal. (Limited edition, boxed set of cards.)
Fernández-Gallego, B., Lama, M., Vidal, J. C., & Mucientes, M. (2013). Learning analytics framework for educational virtual worlds. Procedia Computer Science, 25, 443–447. https://doi.org/10.1016/j.procs.2013.11.056
Florian-Gaviria, B., Glahn, C., & Gesa, R. F. (2013). A software suite for efficient use of the European qualifications
framework in online and blended courses. IEEE Transactions on Learning Technologies, 6(3), 283–296. https://doi.org/10.1109/TLT.2013.18
Fournier, H., Kop, R., & Sitlia, H. (2011). The value of learning analytics to networked learning on a personal learning
environment. Proceedings of the First International Conference on Learning Analytics and Knowledge (LAK 2011), 27
February–1 March 2011, Banff, AB, Canada (pp. 104–109). ACM. https://doi.org/10.1145/2090116.2090131
Gašević, D., Tsai, Y.-S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. The International Journal of Information and Learning Technology, 36(4), 342–353. https://doi.org/10.1108/IJILT-02-2019-0024
Gašević, D., Mirriahi, N., Dawson, S., & Joksimović, S. (2017). Effects of instructional conditions and experience on the adoption of a learning tool. Computers in Human Behavior, 67, 207–220. https://doi.org/10.1016/j.chb.2016.10.026
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Journal of
Educational Technology & Society,15(3), 42–57. https://www.jstor.org/stable/jeductechsoci.15.3.42
Gruber, M. R. (2019). Designing for great teaching with learning design cards. Universität Zürich. https://dlf.uzh.ch/2019/04/01/designing-for-great-teaching-with-learning-design-cards
Haythornthwaite, C. (2017). An information policy perspective on learning analytics. Proceedings of the Seventh International
Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, BC, Canada (pp. 253–
256). ACM. https://doi.org/10.1145/3027385.3027389
Hevner, A., & Chatterjee, S. (2010). Design science research in information systems. In A. Hevner & S. Chatterjee (Eds.),
Design research in information systems (pp. 9–22). Springer. https://doi.org/10.1007/978-1-4419-5653-8_2
Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review,16(3),
235–266. https://doi.org/10.1023/B:EDPR.0000034022.16470.f3
Joksimović, S., Kovanović, V., & Dawson, S. (2019). The journey of learning analytics. HERDSA Review of Higher Education, 6, 37–63. https://www.herdsa.org.au/herdsa-review-higher-education-vol-6/37-63
Katsamani, M., Retalis, S., & Boloudakis, M. (2012). Designing a Moodle course with the CADMOS learning design tool.
Educational Media International,49(4), 317–331. https://doi.org/10.1080/09523987.2012.745771
Kelly, N., Thompson, K., & Yeoman, P. (2015). Theory-led design of instruments and representations in learning analytics:
Developing a novel tool for orchestration of online collaborative learning. Journal of Learning Analytics,2(2), 14–43.
https://doi.org/10.18608/jla.2015.22.3
Koehler, M., & Mishra, P. (2009). What is technological pedagogical content knowledge (TPACK)? Contemporary Issues in
Technology and Teacher Education,9(1), 60–70. https://doi.org/10.1177/002205741319300303
Koper, R. (2005). An introduction to learning design. In R. Koper & C. Tattersall (Eds.), Learning design (pp. 3–20). Springer.
https://doi.org/10.1007/3-540-27360-3_1
Koper, R. (2006). Current research in learning design. Educational Technology & Society,9(1), 13–22. https://www.jstor.org/
stable/10.2307/jeductechsoci.9.1.13
Koper, R., & Manderveld, J. (2004). Educational modelling language: Modelling reusable, interoperable, rich and personalised
units of learning. British Journal of Educational Technology, 35(5), 537–551. https://doi.org/10.1111/j.0007-1013.2004.00412.x
Koper, R., & Olivier, B. (2004). Representing the learning design of units of learning. Journal of Educational Technology &
Society,7(3), 97–111. https://www.learntechlib.org/p/75442/
Laurillard, D., Charlton, P., Craft, B., Dimakopoulos, D., Ljubojevic, D., Magoulas, G., Masterman, E., Pujadas, R., Whitley,
E. A., & Whittlestone, K. (2013). A constructionist learning environment for teachers to model learning designs.
Journal of Computer Assisted Learning,29(1), 15–30. https://doi.org/10.1111/j.1365-2729.2011.00458.x
Law, N., & Liang, L. (2020). A multilevel framework and method for learning analytics integrated learning design. Journal of
Learning Analytics,7(3), 98–117. https://doi.org/10.18608/jla.2020.73.8
Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education—a literature review. In A. Peña-Ayala (Ed.), Learning analytics: Fundaments, applications, and trends (pp. 1–23). Springer. https://doi.org/10.1007/978-3-319-52977-6_1
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning
design. American Behavioral Scientist,57(10), 1439–1459. https://doi.org/10.1177/0002764213479367
Macfadyen, L. P., Lockyer, L., & Rienties, B. (2020). Learning design and learning analytics: Snapshot 2020. Journal of
Learning Analytics,7(3), 6–12. https://doi.org/10.18608/jla.2020.73.2
Mangaroska, K., & Giannakos, M. N. (2018). Learning analytics for learning design: A systematic literature review of
analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies,12(4), 516–534. https:
//doi.org/10.1109/TLT.2018.2868673
Mangaroska, K., Sharma, K., Gasevic, D., & Giannakos, M. (2020). Multimodal learning analytics to inform learning design:
Lessons learned from computing education. Journal of Learning Analytics,7(3), 79–97. https://doi.org/10.18608/jla.
2020.73.7
Mankins, J. C. (1995). Technology Readiness Levels (white paper). Advanced Concepts Office, Office of Space Access and Technology, NASA. http://www.artemisinnovation.com/images/TRL%20White%20Paper%202004-Edited.pdf
McKenney, S., & Mor, Y. (2015). Supporting teachers in data-informed educational design. British Journal of Educational
Technology,46(2), 265–279. https://doi.org/10.1111/bjet.12262
Means, B., & Neisler, J. (2020). Suddenly Online: A National Survey of Undergraduates during the COVID-19 Pandemic (tech.
rep.). Digital Promise. https://www.everylearnereverywhere.org/resources/suddenly-online-national-undergraduate-
survey/
Michos, K., Lang, C., Hernández-Leo, D., & Price-Dennis, D. (2020). Involving teachers in learning analytics design: Lessons learned from two case studies. Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, online (pp. 94–99). ACM. https://doi.org/10.1145/3375462.3375507
Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Research Report
Series,2003(1), i–29. https://doi.org/10.1002/j.2333-8504.2003.tb01908.x
Moller, O. (n.d.). MethodKit. https://methodkit.com/research-method-cards/
Mor, Y., & Mogilevsky, O. (2013). The learning design studio: Collaborative design inquiry as teachers’ professional develop-
ment. Research in Learning Technology,21, 22054. https://doi.org/10.3402/rlt.v21i0.22054
Nguyen, Q., Rienties, B., & Whitelock, D. (2020). A mixed-method study of how instructors design for learning in online and
distance education. Journal of Learning Analytics,7(3), 64–78. https://doi.org/10.18608/jla.2020.73.6
Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of
Educational Technology,46(2), 230–248. https://doi.org/10.1111/bjet.12207
Peterson, C. (2003). Bringing ADDIE to life: Instructional design at its best. Journal of Educational Multimedia and Hypermedia,
12(3), 227–241. https://www.learntechlib.org/primary/p/2074/
Plattner, H. (2010). An Introduction to Design Thinking Process Guide. The Institute of Design at Stanford. https://web.stanford.edu/~mshanks/MichaelShanks/files/509554.pdf
Rienties, B., Nguyen, Q., Holmes, W., & Reedy, K. (2017). A review of ten years of implementation and research in aligning
learning design with learning analytics at the Open University UK. Interaction Design and Architecture(s),33,
134–154. https://doi.org/10.55612/s-5002-033-007
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I., & Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330–343. https://doi.org/10.1111/bjet.12198
Roy, R., & Warren, J. P. (2019). Card-based design tools: A review and analysis of 155 card decks for designers and designing.
Design Studies,63, 125–154. https://doi.org/10.1016/j.destud.2019.04.002
Rubenstein-Montano, B., Liebowitz, J., Buchwalter, J., McCaw, D., Newman, B., Rebeck, K., & The Knowledge Management
Methodology Team. (2001). A systems thinking framework for knowledge management. Decision Support Systems,
31(1), 5–16. https://doi.org/10.1016/S0167-9236(00)00116-0
Sarmiento, J. P., Campos, F., & Wise, A. (2020). Engaging students as co-designers of learning analytics. Companion
Proceedings of the 10th International Learning Analytics and Knowledge Conference (LAK 2020), 23–27 March 2020,
online (pp. 29–32). https://www.solaresearch.org/wp-content/uploads/2020/06/LAK20 Companion Proceedings.pdf
Scheffel, M., Drachsler, H., De Kraker, J., Kreijns, K., Slootmaker, A., & Specht, M. (2016). Widget, widget on the wall, am I
performing well at all? IEEE Transactions on Learning Technologies,10(1), 42–52. https://doi.org/10.1109/TLT.2016.
2622268
Scheffel, M., Schmitz, M., van Hooijdonk, J., van Limbeek, E., Kockelkoren, C., Joppe, D., & Drachsler, H. (2021). The design cycle for education (DC4E). In A. Kienle, A. Harrer, J. M. Haake, & A. Lingnau (Eds.), DELFI 2021—Komplettband, 13–15 September 2021, online (pp. 229–240). Gesellschaft für Informatik e.V. https://dl.gi.de/handle/20.500.12116/37055
Schmitz, M., Scheffel, M., Bemelmans, R., & Drachsler, H. (2020). Fellowship of the Learning Activity—Learning Analytics 4
Learning Design Game. https://doi.org/10.25385/zuyd.9884279
Schmitz, M., Scheffel, M., van Limbeek, E., Bemelmans, R., & Drachsler, H. (2018a). “Make it personal!”—Gathering input from stakeholders for a learning analytics-supported learning design tool. In V. Pammer-Schindler, M. Pérez-Sanagustín, H. Drachsler, R. Elferink, & M. Scheffel (Eds.), European Conference on Technology Enhanced Learning (EC-TEL 2018), 3–5 September 2018, Leeds, UK (pp. 297–310). Springer. https://doi.org/10.1007/978-3-319-98572-5_23
Schmitz, M., Scheffel, M., van Limbeek, E., van Halem, N., Cornelisz, I., van Klaveren, C., Bemelmans, R., & Drachsler, H. (2018b). Investigating the relationships between online activity, learning strategies and grades to create learning analytics-supported learning designs. In V. Pammer-Schindler, M. Pérez-Sanagustín, H. Drachsler, R. Elferink, & M. Scheffel (Eds.), European Conference on Technology Enhanced Learning (EC-TEL 2018), 3–5 September 2018, Leeds, UK (pp. 311–325). Springer. https://doi.org/10.1007/978-3-319-98572-5_24
Schmitz, M., van Limbeek, E., Greller, W., Sloep, P., & Drachsler, H. (2017). Opportunities and challenges in using learning analytics in learning design. In É. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), European Conference on Technology Enhanced Learning (EC-TEL 2017), 12–15 September 2017, Tallinn, Estonia (pp. 209–223). Springer. https://doi.org/10.1007/978-3-319-66610-5_16
Shaffer, D. W. (2006). Epistemic frames for epistemic games. Computers & Education,46(3), 223–234. https://doi.org/10.
1016/j.compedu.2005.11.003
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review,46(5), 30.
https://er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist,57(10),
1510–1529. https://doi.org/10.1177/0002764213479366
Sluijsmans, D., Dochy, F., & Moerkerke, G. (1998). Creating a learning environment by using self-, peer- and co-assessment.
Learning Environments Research,1(3), 293–319. https://doi.org/10.1023/A:1009932704458
Spiro, R. J., & Jehng, J.-C. (2012). Cognitive flexibility and hypertext: Theory and technology for the nonlinear and multidi-
mensional traversal of complex subject matter. In D. Nix & R. J. Spiro (Eds.), Cognition, education, and multimedia
(pp. 177–220). Taylor & Francis. https://www.taylorfrancis.com/chapters/edit/10.4324/9780203052174-11/cognitive-
flexibility-hypertext-theory-technology-nonlinear-multidimensional-traversal-complex-subject-matter-rand-spiro-
jihn-changjehng
Sweller, J. (2011). Cognitive load theory. In J. P. Mestre & B. H. Ross (Eds.), Psychology of learning and motivation (pp. 37–76).
Elsevier. https://doi.org/10.1016/B978-0-12-387691-1.00002-8
Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education—challenges and policies: A review of eight learning analytics policies. Proceedings of the Seventh International Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, BC, Canada (pp. 233–242). ACM. https://doi.org/10.1145/3027385.3027400
Tsai, Y.-S., Moreno-Marcos, P. M., Jivet, I., Scheffel, M., Tammets, K., Kollom, K., & Gašević, D. (2018). The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. Journal of Learning Analytics, 5(3), 5–20. https://doi.org/10.18608/jla.2018.53.2
Van Merrienboer, J., & Kirschner, P. (2007). Ten steps to complex learning. Lawrence Erlbaum Associates.
Verpoorten, D., Poumay, M., & Leclercq, D. (2007). The eight learning events model: A pedagogic conceptual tool supporting
diversification of learning methods. Interactive Learning Environments,15(2), 151–160. https://doi.org/10.1080/
10494820701343694
Vezzoli, Y., Mavrikis, M., & Vasalou, A. (2020). Inspiration cards workshops with primary teachers in the early co-design
stages of learning analytics. Proceedings of the 10th International Conference on Learning Analytics and Knowledge
(LAK 2020), 23–27 March 2020, online (pp. 73–82). ACM. https://doi.org/10.1145/3375462.3375537
Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowl-
edge. Harvard Business Press. https://www.google.com.au/books/edition/Cultivating_Communities_of_Practice/m1xZuNq9RygC?hl=en&gbpv=0
Wise, A. F., Vytasek, J. M., Hausknecht, S., & Zhao, Y. (2016). Developing learning analytics design knowledge in the “middle
space”: The Student Tuning Model and Align Design Framework for learning analytics use. Online Learning Journal,
20(2), 155–182. https://doi.org/10.24059/olj.v20i2.783