Data-Driven Design Pattern Production: A Case Study
on the ASSISTments Online Learning System
PAUL SALVADOR INVENTADO AND PETER SCUPELLI, School of Design, Carnegie Mellon University
Recently, online learning systems such as cognitive tutors, online courses, and massive open online courses (MOOCs) increased
in popularity in various domains. The design quality of online learning systems is difficult to maintain. Multiple stakeholders
are involved (e.g., software developers, interaction designers, learning scientists, teachers), the system is complex, there are
rapid changes in software, platforms (e.g., mobile, tablet, desktop) and learning subject content, and so forth. Many existing
online learning systems collect a significant amount of data that describe students’ learning gains and affective states, which
are indirect measures of system quality. Analysis of online learning system data can uncover linkages between particular
design choices made and student learning. In this paper, we describe the data-driven design pattern production (3D2P)
methodology to prospect, mine, write, and evaluate design patterns for online learning systems from collected data. Design
patterns are high quality solutions for recurring problems in a particular context. Patterns identified with 3D2P can guide the
addition of new content and the modification of system designs to maintain the quality of online learning systems. A case study
on the ASSISTments math online learning system is presented to demonstrate the 3D2P methodology and discuss its benefits
and limitations.
Categories and Subject Descriptors: H.2.8 [Information Systems]: Database Applications – Data mining; H.5.2 [Information Interfaces and Presentation]: User Interfaces – Evaluation/methodology; H.5.2 [Information Interfaces and Presentation]: User Interfaces – User-centered design; K.3 [Computers and Education]: Computer Uses in Education – Distance learning
General Terms: Design, Human Factors
Additional Key Words and Phrases: Pattern prospecting, pattern mining, pattern writing, pattern evaluation, online learning
systems, ASSISTments
ACM Reference Format:
Paul Salvador Inventado and Peter Scupelli. 2016. Data-Driven Design Pattern Production: A Case Study on the ASSISTments
Online Learning System. EuroPLoP '15: Proceedings of the 20th European Conference on Pattern Languages of Programs, Article
(July 2015), 13 pages.
DOI:http://dx.doi.org/10.1145/2855321.2855336
1. INTRODUCTION
Design patterns have been used successfully in many domains like architecture, software design,
interaction design, business, and education [Alexander et al. 1977, Gamma et al. 1995, Tidwell 2010,
Van Duyne et al. 2007, Gourova et al. 2012, Holden et al. 2010]. They are defined as known high
quality solutions to recurring problems in a specific context [Alexander et al. 1977]. Design patterns
can hasten knowledge transfer, improve understanding, and facilitate collaboration [Borchers 2001,
Chung et al. 2004, Dearden et al. 2002].
Complex information technology systems, such as online learning systems, are often built and
maintained by multiple stakeholders with varied backgrounds. In the case of online learning systems, for example, stakeholders might include system developers, interface designers, learning scientists,
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee
provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full
citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored.
Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior
specific permission and/or a fee. Request permissions from Permissions@acm.org.
EuroPLoP '15, July 08 - 12, 2015, Kaufbeuren, Germany
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-3847-9/15/07…$15.00
DOI: http://dx.doi.org/10.1145/2855321.2855336
teachers, and students. Such systems can benefit from design patterns by facilitating communication
and collaboration between stakeholders. This research promotes the use of design patterns in online
learning systems and presents a methodology that can address three challenges in applying design
patterns to this domain: (a) the limited number of available design patterns in the domain, (b) the dynamic nature of the domain, and (c) the diversity of the system's users.
Design patterns for online learning systems are limited, and often focus on content delivery and
class management (e.g., [Anacleto et al. 2009, Derntl 2004, Frizell and Hübscher 2011]). Patterns
from related domains such as software design [Gamma et al. 1995], interaction design [Neil 2014,
Tidwell 2010, Van Duyne et al. 2007], or pedagogy [Anacleto et al. 2009, Derntl 2004, Frizell and
Hübscher 2011] might be useful, but the contextual differences with online learning systems require
existing patterns to be adapted or new patterns to be developed.
Online learning systems are highly dynamic because they change with rapid technological and
pedagogical advancements. For example, advancements in mobile technology led to mobile learning
systems that support various devices [Wu et al. 2012]. Another example is advancements in affective
computing [Picard 1997], the study of systems that automatically recognize, interpret, or simulate
human emotions (e.g., confusion, frustration, boredom), which led to the design of automated affect-
based feedback. Continued changes in the environment require stakeholders to constantly rethink
designs that will address new contexts, forces, and problems.
Online learning systems are often used by many learners with varying cognitive abilities, learning
experiences, motivations, and cultural backgrounds. Different types of users experience different
kinds of problems. For example, low-knowledge learners could get confused or frustrated when they
are asked to solve a difficult math problem, while high-knowledge learners could get bored answering
the same type of math problem, which they do not find challenging. A large number of design patterns
need to be identified and applied to address different users.
Addressing these three challenges manually will be difficult because the design of the system’s
components and the learners’ interactions within the system have to be analyzed. Stakeholders need
to identify recurring problems and corresponding solutions, and to adapt design patterns or produce
new design patterns for new problems. Stakeholders also need to constantly reanalyze the system
because new contexts and problems might arise from system changes. The difficulty of addressing
these challenges increases with the size of the system and the number of users.
In this paper, we present the data-driven design pattern production (3D2P) methodology, which
uses data-driven approaches to lessen the work required from stakeholders to identify, adapt, create,
and evaluate design patterns. Design patterns selected for online learning systems, as well as adapted
or newly uncovered design patterns, can guide the addition of new content and new features in online
learning systems to ensure their quality. The methodology may be applied in similar domains, but the
paper only focuses on online learning systems.
The following sections discuss related work, the 3D2P methodology, a case study on applying the
3D2P methodology to the ASSISTments online learning system, the challenges and benefits of using
the methodology, a summary of the paper, and future work.
2. RELATED WORK
Enrollment in online learning systems is growing rapidly. In the Fall of 2007, over 3.9 million
students took at least one online course, and over 20% of all US higher education students took at
least one online course [Allen and Seaman 2008]. Massive open online courses (MOOCs) such as
Coursera, edX and Udacity have also become popular. Coursera, one of the largest MOOC platforms,
has over 11 million users in 941 courses from 118 institutions as of January 2015 [Coursera n.d.]. The
success of tutoring systems like Cognitive Tutor and ASSISTments [Koedinger et al. 1997, Mendicino
et al. 2009a, Morgan and Ritter 2002] also led to an increase of users. Cognitive Tutor was reported to
make over 250 million student-observations a year [Carnegie Learning, n.d., Sarkis 2004].
Online learning systems can collect data about their users to generate reports, customize feedback
and personalize content. Increasingly, online learning providers store student learning information in
well-structured formats and some make it available for public use [Clow 2014, Koedinger et al. 2010,
Sharkey 2011, Veeramachaneni et al. 2013]. The availability of large quantities of high quality data
enabled educational data mining and learning analytics researchers to understand students better
and provide them with proper support.
As more students use online learning systems, more courses are offered and more content is added.
Rapid advances in technology and learning sciences call for system changes to adopt new technologies
and implement new pedagogies. For example, delivering online learning systems originally designed
for desktop computing on multiple platforms presents interaction design challenges on the smaller
displays of tablets and mobile devices. Supporting online learning on multiple platforms is difficult. It
is even more challenging to maintain the system’s quality as it changes over time.
Design principles and guidelines are often used to ensure quality while modifying the system and
its content. However, there are challenges in instantiating design principles in new contexts because
of possible conflicts (cf. [Koedinger et al. 2012, Mahemoff and Johnston 1998]). Design patterns can
overcome the limitations of design principles [Borchers 2001, Chung et al. 2004, Kunert 2009, Mahemoff
and Johnston 1998, Van Welie et al. 2001] and set the standard for rigorous, contextualized design
principles.
Most design patterns developed for online learning systems focus on pedagogy such as class
management, content presentation, and activities that promote learning [Anacleto et al. 2009, Derntl
2004, Frizell and Hübscher 2011]. There are also some patterns for providing feedback and evaluation
(c.f. [Anthony 1996, Bergin et al. 2012, Larson et al. 2008]). Patterns developed for the educational
domain might also be adapted to online learning systems such as patterns for teaching language
[Köppe and Nijsten 2012], teaching object-oriented programming [Bergin 2003] and learning [Iba and
Sakamoto 2011].
Design patterns are produced through a pattern mining process. There are four common pattern
mining methods: (a) introspective approach – using the author's own experience to identify patterns; (b) social approach – using observations of an environment and expert interviews to identify patterns; (c) artifactual approach – uncovering patterns from existing instances; and (d) pattern mining workshops – analyzing experiences of different experts from group discussions to identify patterns [DeLano 1998, Kerth and Cunningham 1997, Kohls 2013]. We differentiate how we use pattern mining, which is to find new patterns, from finding known patterns in large data sets (cf. [De Lucia et al. 2010, Dong et al. 2008, Fontana and Zanoni 2011]). In the next section, we introduce a fifth
method that uses data collected from user interactions within a learning system to mine for patterns.
3. METHODOLOGY
The data-driven design pattern production (3D2P) methodology uses data collected from online
learning systems to uncover design patterns. Figure 1 illustrates the four steps in the methodology:
pattern prospecting, mining, writing, and evaluation. Pattern prospecting finds traces of interesting
relationships in learning data to constrain search spaces for pattern mining. Pattern mining finds
relationships in the data to uncover existing design patterns or hypothesize new design patterns.
Pattern writing puts form into proposed design patterns and refines the patterns through expert and
peer feedback. Finally, potential patterns are evaluated in actual use to validate design patterns.
Over time, more content will be added into the system and technical updates may be applied. The
methodology can be used incrementally to ensure the quality of the system while it changes.
Each step in the methodology is elaborated in the following subsections.
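As a concrete illustration, the four steps can be composed into a single pipeline. The sketch below is illustrative only: the function names, the toy record format, and the 0.8 frustration threshold are assumptions for demonstration, not part of any actual 3D2P implementation.

```python
# Illustrative sketch of the 3D2P pipeline; all names, thresholds, and
# the toy record format are assumptions, not the actual implementation.

def prospect(records, threshold=0.8):
    """Flag interaction records whose frustration confidence is high."""
    return [r for r in records if r["confidence_frustrated"] >= threshold]

def mine(candidates):
    """Group flagged records by problem; keep problems that recur."""
    counts = {}
    for r in candidates:
        counts[r["problem_id"]] = counts.get(r["problem_id"], 0) + 1
    return {pid: n for pid, n in counts.items() if n > 1}

def write_pattern(recurring):
    """Draft a potential design pattern for each recurring problem (stub)."""
    return [{"problem_id": pid, "status": "potential"} for pid in recurring]

def evaluate(patterns):
    """Placeholder for validation, e.g., via a randomized controlled trial."""
    return [{**p, "status": "needs_rct"} for p in patterns]

records = [
    {"problem_id": 304899, "confidence_frustrated": 0.99},
    {"problem_id": 304899, "confidence_frustrated": 0.85},
    {"problem_id": 304914, "confidence_frustrated": 0.00},
]
patterns = evaluate(write_pattern(mine(prospect(records))))
```

Here, problem 304899 is flagged twice, survives mining as a recurring problem, and reaches evaluation as a potential pattern awaiting validation.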
3.1 Pattern prospecting
In the field of mineral extraction, the extraction process usually begins with prospecting. Prospecting
helps identify what and where to mine so there is a better chance of finding something valuable.
Examples of prospecting methods for mineral deposits are electrical, gravimetric, magnetic, seismic,
and imaging [Dobrin and Savit 1960, Edge and Laby 1931, Rowan et al. 2003]. Data mining uses the
“mining” metaphor to emphasize the difficulty of finding valuable information in large and complex
data. Prospecting uncovers interesting features in the data before further processing, which is often a
more costly operation. Data visualization and descriptive statistics are two commonly used methods in
the data exploration or data preparation step of data mining to find interesting data relationships
[Ramachandran et al. 2013, Shearer 2000].
Figure 1. Data-driven design pattern methodology (Image courtesy of Learning Environments Lab).
The pattern-prospecting step investigates online learning system data to find potential design
patterns. Data is retrieved, cleaned to remove noise and errors, and repaired to recover missing
information. Data can be processed further to: (a) change its current representation (i.e., for easier
data manipulation or to satisfy data analysis tool requirements), (b) change its granularity (e.g.,
individual student actions vs. averaged problem actions), (c) merge it with other data sources (e.g.,
student affect), (d) add features, and (e) delete unnecessary features. Data visualization, descriptive
statistics, and manual exploration can be used to uncover relationships between data features that
suggest recurring problems or effective solutions. For example, a particular math problem frequently
associated with high student frustration could indicate a potential issue worth exploring. There are
different approaches to automatically identify human emotions (e.g., frustration), such as using data
from students’ interaction logs or physiological sensors [Baker et al. 2012, Ocumpaugh et al. 2014,
Picard 1997].
Student learning is shaped by an emergent system that results from the integration of its
components. For example, a well-designed math problem presented through an engaging interface,
with helpful hints, and delivered in a timely manner will likely lead to a pleasant learning experience
(e.g., less confusion, more correct answers, increased motivation). Some features in the data might
measure the cognitive (e.g., answer correctness, mastery speed, time on task) or affective (e.g.,
engagement, boredom, confusion, frustration) aspects of learning, which we refer to as learning
measures. In this work, we focused on data describing the relationship between the system’s designs
and learning measures because the primary goal of online learning systems is to help students learn.
Student learning measures capture only part of the overall system quality; they do not include, for example, software design quality measures (e.g., code readability and scalability).
Collecting other measures in addition to student learning measures will allow us to find potential
design patterns on other aspects of the system. In this work we focused on learning measures, but we
plan to explore other system measures in the future.
Prospecting is essential, especially when dealing with large quantities of data, because it takes time
and effort to mine for patterns. Features that describe relationships in the data can be used to
generate data subsets that limit the search space for pattern mining. Using the previous example, a
data subset containing problems associated with high student frustration can be set aside for further
exploration and pattern mining.
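Continuing the frustration example, such a subset might be generated by averaging the predicted frustration per problem and keeping only problems above a cutoff. A minimal standard-library sketch; the rows and the 0.5 cutoff are invented for illustration, with field names mirroring the ASSISTments features shown later in Figure 3.

```python
from collections import defaultdict
from statistics import mean

# Toy problem-student interaction rows; values are invented for
# illustration (field names mirror Figure 3).
interactions = [
    {"problem_id": 304899, "confidence_frustrated": 0.999803203},
    {"problem_id": 304899, "confidence_frustrated": 0.492143066},
    {"problem_id": 304914, "confidence_frustrated": 0.000000000},
]

# Collect each problem's frustration predictions.
by_problem = defaultdict(list)
for row in interactions:
    by_problem[row["problem_id"]].append(row["confidence_frustrated"])

# Keep only problems whose mean frustration confidence exceeds a cutoff;
# 0.5 is an arbitrary illustrative threshold.
high_frustration = {pid: mean(vals) for pid, vals in by_problem.items()
                    if mean(vals) > 0.5}
```

The resulting subset constrains the pattern-mining search space to the frustrating problems only.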
Although 3D2P focuses on prospecting potential patterns from data, patterns can be prospected in
other ways. Inherently, pattern authors prospect their experiences, other people’s experiences, or
artifacts for potential design patterns. Pattern authors’ knowledge may guide the search for
interesting relationships in the data and interesting features found in data may help pattern authors
gain new insights to develop patterns.
3.2 Pattern mining
Pattern mining aims to discover or explain patterns [Kohls 2013]. Patterns can be mined from data,
but may need verification to account for noise, missing information, or limited contextual information.
There are two ways one might mine for patterns in data: find effective solutions, and find recurring
problems.
Prospecting limits the pattern mining search space to only consider interesting features. Features
that frequently co-occur with desirable learning measure values may suggest a potential design
pattern. For example, correlation between frequent hint requests and low frustration may indicate the
usefulness of hints. Features that frequently co-occur with undesirable learning measure values are
recurring problems that need resolution. For example, correlation between infrequent hint requests
and high frustration may indicate a problem with feedback design. Pattern authors might use
background knowledge or literature to identify possible solutions for recurring problems.
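As an illustration of this kind of check, the sketch below computes Pearson's r between hint-request counts and frustration confidence on made-up values; a strongly negative coefficient would suggest that hints reduce frustration.

```python
from math import sqrt

# Pearson correlation implemented from scratch; the data below are
# invented for illustration, not drawn from ASSISTments.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

hint_requests = [0, 1, 2, 3, 4]            # hints requested per problem
frustration   = [0.9, 0.7, 0.5, 0.3, 0.1]  # predicted frustration confidence

r = pearson(hint_requests, frustration)    # negative: hints co-occur with low frustration
```

In practice such a correlation is only a starting hypothesis; the relationship still needs verification against noise and missing contextual information, as noted above.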
Pattern mining might also reveal the application of existing design patterns. Desirable effects from
applying the design pattern confirm its quality, which can be shared with the pattern author. If
undesirable effects were observed it might indicate that either the pattern was applied incorrectly and
the system’s design needs modification, or the existing pattern may need to consider some special
cases. The pattern author might learn more about the pattern from observed undesirable effects and
can decide to refine the pattern if necessary.
3.3 Pattern writing
Hypotheses regarding effective solutions and recurring problems drawn from pattern mining are used
to produce design patterns. Design pattern components such as context, forces, problem, and solution
can be uncovered from effective solutions found in data. The solution might not be as apparent with
recurring problems. Exploring data to explain the recurring problem helps to identify a solution,
which might be drawn from background knowledge or literature. An example is provided in the next
section to clarify this process.
Resulting patterns are interpreted from data containing partial, incomplete measures; therefore,
they are considered potential design patterns until verified. Experts and peers can help to confirm the
quality of potential design patterns through informal discussions, shepherding, and writing
workshops. Patterns can be refined until the pattern author and reviewers are satisfied with the
definition.
3.4 Pattern evaluation
Pattern evaluation helps to validate the quality of design patterns, to test their adaptability into other
similar environments, and to refine their definitions when necessary.
Effective design patterns are expected to replicate desirable learning measure values or reduce
undesirable learning measure values. Randomized controlled trials (RCTs) can be conducted on the
same system or on other online learning systems to compare learning measure differences when the
design pattern is applied and when it is not applied. Design patterns validated on multiple
environments can be shared more confidently with the community. Patterns that failed validation
might need refinement or adaptation to fit differences in context (e.g., adapting patterns designed for
desktop computers into mobile devices).
4. ASSISTMENTS CASE STUDY
The 3D2P methodology was used to produce design patterns using data from the ASSISTments online
learning system. The following subsections describe the ASSISTments online learning system, discuss
the application of the methodology, and present one of the patterns produced.
4.1 ASSISTments
ASSISTments was developed in 2003 to help students practice Math for the Massachusetts
Comprehensive Assessment System (MCAS), to help teachers identify students who needed more
support, and to identify topics that needed to be discussed and practiced further [Heffernan and
Heffernan 2014]. It is an online learning system that allows teachers to create math problems and
exercises with corresponding solutions and feedback. Teachers can get immediate feedback about
student performance on their assigned exercises. ASSISTments was developed by a multi-disciplinary
team consisting of content experts, ITS experts, and system developers among others. It was designed
and built using architecture and interface development best practices with the help of math teachers
who provided expert advice and content. Figure 2 illustrates a math problem in ASSISTments, which
allows students to view a problem, request hints, and attempt to solve it.
Figure 2. ASSISTments interface for a word problem: (a) Students solving a problem can submit their answer
and get feedback from the system; (b) Students can request multiple hints to help them solve the problem.
(Image courtesy of ASSISTments http://assistments.org)
Initially, math teachers were asked to provide the content (i.e., problems, answers, hints, feedback)
for ASSISTments. Later, references to math textbook problems were also added and teachers who
used the system were allowed to create their own questions or adapt their own versions of existing
questions. Over time, ASSISTments underwent many changes to support feature requests and
improvements. It also allowed researchers to run studies and collect student data using customized
content, feedback, and other features of the interface (e.g., [Broderick et al. 2011, Li et al. 2013,
Whorton 2013]).
ASSISTments has been collecting large amounts of data since 2003 from close to 50,000 students
using the system each year in 48 states of the United States [ASSISTments n.d., Heffernan and
Heffernan 2014, Mendicino et al. 2009b]. Data is represented using multiple features that describe the
student learning experience, such as the number of hint requests in a problem, the number of answer
attempts in a problem, the correctness of students’ answers, and the timestamps of students’ actions.
4.2 3D2P on ASSISTments data
ASSISTments data was used to test the 3D2P methodology. Specifically, we used data collected between September 2012 and September 2013, containing 6,123,424 instances. It was the most recent data available at the time the study was conducted, and the system had already undergone many design changes that could have affected its quality.
Pattern Prospecting
Ocumpaugh et al. [2014] used data from ASSISTments to predict student affect (i.e., concentration,
frustration, confusion, boredom). The predictions were generated by a machine learning model built
with expert-labeled data of student affect while using ASSISTments. The machine learning model
used various features available in the data such as the number of previous incorrect answers, time
taken to solve problems, and number of hint requests. The model outputs confidence values for each
affective state, which approximates students’ learning experience at a particular point in time. For
example, the confidence values concentration – 0.63, frustration – 0.99, confusion – 0.00, and boredom – 0.04 might indicate that a student was likely frustrated while performing an activity.
The same machine-learning model was used to predict student affect in the data used for the study.
The affect predictions were merged into the data set to provide more contextual information for
analysis, and to allow the discovery of design patterns that address the emotional aspects of learning.
Instances in the data with missing values were removed resulting in 5,801,073 problem-student
interaction instances. Figure 3a presents a subset of the features and instances in the data, which
summarizes students’ actions when they solved a particular problem (e.g., number of hint requests,
number of attempts to answer a problem). Only 11 of the 285 features are shown in the figure. A
separate data set was generated to aggregate and approximate students’ general behavior towards a
particular problem. The features of each problem-student interaction associated with a problem were
aggregated through statistical functions (e.g., sum, average, min, max) resulting in 179,908 problem-
level instances (see Figure 3b). Only 6 of the 285 features are shown in the figure.
sequence_id | problem_id | user_id | correct | attempt_count | problem_body | confidence_bored | confidence_concentrating | confidence_confused | confidence_frustrated
41342 | 304914 | 184820 | 1 | 1 | What is 8.08 + 0.953? | 0.042401209 | 0.626725220 | 0.000000000 | 0.000000000
41342 | 304905 | 184820 | 1 | 1 | What is 5.13 + 1.137? | 0.042401209 | 0.626725220 | 0.000000000 | 0.000000000
41342 | 304888 | 184820 | 1 | 1 | What is 2.44 + 0.383? | 0.042401209 | 0.626725220 | 0.000000000 | 0.000000000
41342 | 304916 | 184820 | 1 | 1 | What is 6.38 + 0.083? | 0.042401209 | 0.626725220 | 0.000000000 | 0.000000000
41342 | 304899 | 184820 | 1 | 1 | What is 4.5 + 1.016? | 0.042401209 | 0.626725220 | 0.000000000 | 0.999803203
41342 | 304899 | 184757 | 0 | 1 | What is 4.5 + 1.016? | 0.577264514 | 0.181500873 | 0.000000000 | 0.492143066
(a) Problem-student interaction data – 5,801,073 instances x 285 features

problem_id | hintbeforecorrect | confidence_bored | confidence_concentrating | confidence_confused | confidence_frustrated
409449 | 0.055555556 | 0.143366120 | 0.510642641 | 0.055741783 | 0.049795279
304899 | 0.333333333 | 0.195219296 | 0.499518263 | 0.000000000 | 0.355964210
491950 | 0.333333333 | 0.212678290 | 0.358241646 | 0.117607819 | 0.185450115
493052 | 0.350000000 | 0.177703088 | 0.380689055 | 0.107184182 | 0.229513384
499240 | 0.206896552 | 0.151566570 | 0.368188615 | 0.110587510 | 0.132678163
(b) Problem-level data – 179,908 instances x 285 features
Figure 3. Subsets of ASSISTments data showing: (a) problem-student interaction data with one student (user_id 184820) answering a series of problems, and both that student and a second student (user_id 184757) experiencing frustration when solving the same problem (see the last two rows); and (b) problem-level data created by aggregating the features of problem-student interaction data (the row for problem_id 304899 shows the aggregated features of a problem solved by many students) (Image courtesy of Learning Environments Lab).
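The aggregation that turns problem-student interaction data (Figure 3a) into problem-level data (Figure 3b) can be sketched as a group-by over problem_id. A minimal standard-library sketch with two toy rows; the aggregated field names (pct_correct, avg_frustrated, max_frustrated) are illustrative, not the actual ASSISTments feature names.

```python
from collections import defaultdict
from statistics import mean

# Toy problem-student interaction rows (cf. Figure 3a); the real data
# set has 285 features per row.
interactions = [
    {"problem_id": 304899, "correct": 1, "confidence_frustrated": 0.999803203},
    {"problem_id": 304899, "correct": 0, "confidence_frustrated": 0.492143066},
]

# Group interactions by problem.
grouped = defaultdict(list)
for row in interactions:
    grouped[row["problem_id"]].append(row)

# Aggregate each feature with statistical functions (average, max, ...)
# to obtain one problem-level row per problem (cf. Figure 3b).
problem_level = [
    {
        "problem_id": pid,
        "pct_correct": mean(r["correct"] for r in rows),
        "avg_frustrated": mean(r["confidence_frustrated"] for r in rows),
        "max_frustrated": max(r["confidence_frustrated"] for r in rows),
    }
    for pid, rows in grouped.items()
]
```

Applying several aggregation functions per feature is what multiplies the per-interaction features into the larger problem-level feature set described in the text.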
Students often experience frustration while learning. It is unclear whether it hinders learning or
pushes students to exert more effort to learn (cf. [Artino and Jones 2012, Patrick et al. 1993, Pekrun
et al. 2002]). Frustration was investigated in the ASSISTments data set by exploring problem-level
data for math problems that were likely to be frustrating (i.e., high frustration confidence values).
Manual observation of the problems suggested issues with the design of the math problems. For
example, some problems were difficult to understand, hints presented unnecessary information, and
notations were used inconsistently. This led to the hypothesis that poor content might have caused
student frustration and possibly hindered learning. A data subset was generated to contain problems
associated with high frustration confidence values. Other measures such as answer correctness and
boredom were available for analysis, but these were set aside for later investigation.
Pattern Mining
Further analysis revealed that one group of problems with high frustration confidence values was about “Adding Decimals” (e.g., Figure 2). Many students requested hints and made multiple attempts to solve these problems but failed. This might mean that the problems were too difficult.
However, the data also showed that many students were frustrated even when they repeatedly
answered the problem correctly. Two hypotheses were identified from this observation: (a) students
get frustrated when they find it difficult to solve a problem (i.e., evidenced by multiple hint requests
and incorrect answer attempts, and high frustration values), and (b) students get frustrated when
they are asked to answer the same problem type repeatedly even if they already mastered it (see
correct column in Figure 3a).
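Hypothesis (b) can be made operational with a simple check: flag a student who keeps answering correctly yet still shows high predicted frustration. The function, thresholds, and attempt rows below are invented for illustration.

```python
# Illustrative check for hypothesis (b): a student answers correctly
# several times in a row but still shows high predicted frustration.
# All thresholds and rows are invented for demonstration.

def frustrated_despite_mastery(rows, streak=3, cutoff=0.5):
    """rows: one student's attempts at a problem type, in time order."""
    run = 0
    for r in rows:
        run = run + 1 if r["correct"] else 0
        if run >= streak and r["confidence_frustrated"] > cutoff:
            return True
    return False

attempts = [
    {"correct": 1, "confidence_frustrated": 0.04},
    {"correct": 1, "confidence_frustrated": 0.30},
    {"correct": 1, "confidence_frustrated": 0.99},  # mastered, yet frustrated
]
```

Students flagged by such a check are exactly the cases that motivated the pattern defined next.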
Pattern Writing
The second hypothesis led to the definition of the Just Enough Practice pattern, which addressed
the recurring problem of student frustration possibly caused by repeated requests to answer mastered
math problem types. The solution selected was to change problem types after a student masters a
particular skill. Cen, Koedinger, and Junker [2007] suggest a switch in problem types instead of over
practice because it leads to the same amount of learning in less time. The pattern is presented in the
next sub-section.
The Just Enough Practice pattern was discussed with ASSISTments stakeholders who thought
it would make an interesting design pattern. The pattern was also refined through a shepherding
process and a writing workshop at PLoP 2015 [Inventado and Scupelli in press].
Pattern Evaluation
An RCT is being designed for ASSISTments to compare learning measures when students are asked to: (a) answer all problems in a math homework regardless of whether they have already mastered the skill,
and (b) repeatedly answer problems in a math homework until they get three right in a row (i.e.,
mastery). Other online learning systems are also being considered for running the same RCT.
The Just Enough Practice pattern will either be retained or modified depending on the
evaluation’s result. If the pattern leads to better learning, it will be shared with ASSISTments and
other online learning system stakeholders. Otherwise, it will be investigated and refined further.
The ASSISTments data set has a large number of features available, which has continued to grow
to address researchers’ needs. Currently, there are 285 features that describe students’ learning
experience. Over time, more content will be added into the system and technical updates may be
applied. The methodology will be iterated to explore more features, produce more design patterns, and
share them with online learning system stakeholders to ensure the system's quality.
Data-Driven Design Pattern Production: A Case Study on the ASSISTments Online Learning System: Page - 9
4.3 Proposed Pattern
Just Enough Practice
(Image courtesy of Learning Environments Lab)
Context: Students are asked to practice a particular skill through exercises in an online learning system. Teachers design the problems for
the exercise in the online learning system. They also provide corresponding answers and feedback for each problem, and design their
presentation sequence. Problems may vary in type (e.g., multiple choice, true or false), topic (e.g., addition, subtraction), and difficulty.
Forces:
1. Practice. Students need to practice a skill to master it [Clark and Mayer 2011, Sloboda et al. 1996, Tuffiash et al. 2007]. It leads to
greater improvements in performance during early sessions, but additional practice sessions lead to smaller improvement gains over time
[Rohrer and Taylor 2006].
2. Expertise reversal. Presenting students with information they already know can impose extraneous cognitive load and interfere with
additional learning [Sweller 2004].
3. Limited resources. Students’ attention and patience towards performing a learning activity decrease over time [Arnold et al. 2005,
Bloom 1974].
Problem: Students may become frustrated when they are asked to repeatedly answer problems that test skills they already mastered.
Solution: Therefore, change the problem type and/or topic after students master the associated skill. Student mastery can be measured in different ways
such as, the number of times a student correctly answered a problem type and/or topic, or individualized statistical models that predict
student knowledge [Yudelson et al. 2013].
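The first mastery measure mentioned above, a run of consecutive correct answers, can be sketched as follows. The three-in-a-row threshold is only an example; production systems may prefer individualized knowledge-tracing models instead.

```python
def has_mastered(responses, streak_needed=3):
    """Simple mastery heuristic: a student has mastered a skill once they
    answer `streak_needed` problems correctly in a row. `responses` is the
    student's chronological list of attempts, 1 (correct) or 0 (incorrect).
    More robust alternatives include individualized Bayesian knowledge
    tracing [Yudelson et al. 2013]."""
    streak = 0
    for correct in responses:
        streak = streak + 1 if correct else 0
        if streak >= streak_needed:
            return True
    return False

print(has_mastered([1, 0, 1, 1, 1]))  # True: three right in a row
print(has_mastered([1, 0, 1, 1, 0]))  # False: longest streak is two
```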
Consequences:
Benefits:
1. Students are not asked to do additional practice on a skill they already know.
2. Students can focus on learning skills they have not mastered.
3. Students can learn more efficiently within the time they set for themselves to learn.
Liability:
1. If skill mastery is incorrectly identified, students may still be asked to over-practice a skill or, worse, be asked to stop practicing a
skill they have not yet mastered.
Evidence:
Application: In the ASSISTments online learning system, teachers can give their students “skill builders”, which are collections of
problems that test students’ mastery of a skill. Whenever students consecutively answer a certain number of problems correctly (e.g., 3
problems right in a row), they do not need to answer the remaining problems.
Data: According to ASSISTments math online learning system data, there were many cases wherein students became frustrated when
they were asked to answer problems they knew how to solve.
Expert Discussion: ASSISTments stakeholders (i.e., data mining experts, ITS experts, and educators) agreed that the problem occurred
frequently in the system and the selected solution could help resolve the problem.
Research: Cen, Koedinger, and Junker [2007] used data mining approaches to show that students had similar learning gains when they
over-practiced a skill and practiced a skill until mastery. However, it took less time to practice until mastery. The authors suggested that
students move on to learn a different skill instead of over-practice.
Related Patterns: Give students Just Enough Practice when they are asked to practice or apply their skills in learning activities such as
Fixer Upper, Mistake, and Fill in the Blanks [Bergin 2000].
Sample Implementation: A teacher designs homework with different math problem types (e.g., decimal addition and subtraction). He/she
configures the online learning system to switch between problem types whenever a student shows mastery in solving a particular problem
type. The number of times a student answered each problem type correctly can be used to identify mastery. For example, if the student
correctly answers 3 decimal-addition problems in a row, then the student will be asked to advance to decimal-subtraction problems.
Otherwise, the student will continue answering decimal-addition problems until he/she gets 3 correct answers in a row.
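The sample implementation above can be sketched as a small routing function. The problem-type names and the three-in-a-row threshold follow the example; the function itself is hypothetical, not ASSISTments code.

```python
def next_problem_type(history,
                      sequence=("decimal-addition", "decimal-subtraction"),
                      streak_needed=3):
    """Pick the next problem type for a student. `history` maps each
    problem type to the student's current streak of consecutive correct
    answers. The student advances through `sequence` past every type
    already mastered (streak >= streak_needed)."""
    for ptype in sequence:
        if history.get(ptype, 0) < streak_needed:
            return ptype
    return None  # every problem type in the homework is mastered

# Student has 3 decimal-addition answers right in a row, 1 subtraction:
print(next_problem_type({"decimal-addition": 3, "decimal-subtraction": 1}))
# -> decimal-subtraction
```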
5. DISCUSSION
The 3D2P methodology differs from the four other pattern-mining approaches (i.e., introspective,
social, artifactual, and pattern-mining workshops) in three ways. First, it relies on collected data for
finding and evaluating patterns (i.e., pattern prospecting and mining). Learning measures and other
system measures are essential for analyzing data. Second, partial measures of quality, such as
learning measures, are used to validate design patterns. Third, the approach can be used
incrementally to continuously refine design patterns when there is a significant change in the online
learning system context that could lead to the ineffectiveness of current solutions. For example, a user
interface designed for desktop computers might be ineffective for mobile devices like smart phones
and tablets.
The methodology holds much promise but also has some limitations. It requires more steps and
validation compared to traditional design pattern methods. Design patterns are uncovered from
existing data but they need to be tested and verified with new user data. While measuring pattern
outcomes empirically is valuable, this step also adds complexity. Yet, focusing on learner needs and
outcomes measured through data can center design decisions on the learner and focus a
heterogeneous system development team (cf. [Lambropoulos and Zaphiris 2006]).
The search for patterns may take a lot of time depending on the size and complexity of the data.
However, with any large data set it would take even more time to find design patterns by hand. The
ASSISTments data had 180,000 math problems and over 6 million student interactions for the
2012-2013 school year alone. Significant time is saved because only a subset of problems is analyzed in
detail. Data are used in two ways: first, historical data are mined to find patterns of past behavior;
second, randomized controlled trials with thousands of students test math problems redesigned using
the patterns.
The measures used to find patterns are based on indirect and partial measures of student
behavior. Specifically, the analysis was limited to online system logs. Features such as student
learning, engagement, frustration, and boredom were measured but some contextual information was
missing from the dataset (e.g., classroom environment, teacher related variables, location). The
number and type of patterns discovered depend on the amount and type of data gathered. Future
work can capture the learning environment more broadly: soundscape (e.g., quiet or loud), location
(e.g., home, school), and so forth. Such contextual measures may help explain student outcomes and
affective states better. Moreover, including measures from other aspects of the system can expand the
search for design patterns. For example, network latency measures could correlate with frustration
when students wait too long for their learning activities to load on the browser.
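As a sketch of that idea, a plain Pearson coefficient could relate per-session latency to frustration-detector confidence. The latency and frustration values below are hypothetical illustrations, not ASSISTments measurements.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    norm_x = sum((x - mx) ** 2 for x in xs) ** 0.5
    norm_y = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (norm_x * norm_y)

# Hypothetical per-session page-load latency (seconds) paired with the
# frustration-detector confidence for the same sessions.
latency = [0.5, 1.0, 2.5, 4.0, 6.0]
frustration = [0.1, 0.2, 0.4, 0.6, 0.9]
print(round(pearson(latency, frustration), 2))
```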
The learning context likely plays a role in learning outcomes. Typically, design patterns are
defined for a known context. However, online tutoring systems such as ASSISTments support
multiple learning contexts: in-class assignments, quizzes, tests, homework, and skill-builder
exercises. The multiplicity of contexts raises the question of whether design patterns for learning
math are the same as those for testing learning.
Despite the mentioned limitations, the methodology complements and augments traditional design
pattern methods. Pattern authors can get insights about the domain using data aside from their own
knowledge and existing literature. It can also encourage communication between design pattern
authors and stakeholders that might lead to a more holistic design quality. More generally, producing
design patterns can be an open, collaborative project [Inventado and Scupelli 2015].
6. SUMMARY AND FUTURE WORK
Maintaining the design quality of complex information technology systems, such as online
learning systems, is challenging because of their multi-disciplinary nature, design complexity,
continuously changing components and content, and the diversity of their users. 3D2P uncovers design
patterns from data that might address these design challenges. Design patterns can help ensure
that the addition of new content and modifications to the system maintain its quality.
The methodology helps stakeholders focus on design decisions that impact students’ learning,
which is the primary purpose of the system. Stakeholders will have more time to evaluate and develop
design patterns that address system issues based on actual data on top of their own knowledge and
existing literature. Most importantly, the effectiveness of design patterns can be evaluated through
randomized controlled trials, and effective patterns can be used to guide later design decisions. Design
patterns can be evaluated in multiple online learning system platforms to ensure quality [Inventado
and Scupelli 2015].
Although the methodology requires more time and effort, its advantages far outweigh its
limitations. The methodology is currently being used to build a design pattern language that will help
ASSISTments developers and users to create high quality designs and content. Currently, frustration-
related design patterns are being explored, but design patterns that address student boredom,
confusion, and learning gains are also being considered.
The methodology was developed and tested on online learning systems, but it might be useful in
other domains. For example, social media platforms and other online service systems could benefit
from an analysis of the relationships between their designs and user-experience measures.
7. ACKNOWLEDGEMENT
This material is based upon work supported by the National Science Foundation under DRL-1252297.
We would like to thank our shepherds Christian Köppe and Thomas Erickson for their invaluable
advice and feedback on this work. We would also like to thank Ryan Baker, Stefan Slater, and Jaclyn
Ocumpaugh from Teachers College Columbia University, and Neil Heffernan, Eric VanInwegen, and
Korinn Ostrow from Worcester Polytechnic Institute for helping us analyze the data and gain insights
for developing the methodology.
REFERENCES
Andrew Arnold, Richard Scheines, Joseph Beck, and Bill Jerome. 2005. Time and attention: Students, sessions, and tasks. In
Proceedings of the AAAI 2005 Workshop Educational Data Mining, 62-66.
ASSISTments. n.d. Retrieved February 1, 2015 from https://www.assistments.org/
Christopher Alexander, Sara Ishikawa, and Murray Silverstein. 1977. A pattern language: towns, buildings, construction. Vol. 2.
Oxford University Press, New York, NY.
Elaine Allen and Jeff Seaman. 2008. Staying the course: Online education in the United States. Sloan Consortium, Needham,
MA.
Júnia Coutinho Anacleto, Américo Talarico Neto, and Vânia Paula de Almeida Néris. 2009. Cog-Learn: An e-Learning Pattern
Language for Web-based Learning Design. eLearn Magazine, 8 (2009), ACM, New York, NY.
Dana L. Anthony. 1996. Patterns for classroom education. In Pattern Languages of Program Design 2, John Vlissides, James
Coplien, and Norman Kerth (Eds.). Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 391-406.
Anthony R. Artino and Kenneth D. Jones. 2012. Exploring the complex relations between achievement emotions and self-
regulated learning behaviors in online learning. The Internet and Higher Education 15, 3, 170-175.
Ryan S. J. D. Baker, Sujith M. Gowda, Michael Wixon, Jessica Kalka, Angela Z. Wagner, Aatish Salvi, Vincent Aleven, Gail W.
Kusbit, Jaclyn Ocumpaugh, and Lisa Rossi. 2012. Towards Sensor-free Affect Detection in Cognitive Tutor Algebra. In
Proceedings of the 5th International Conference on Educational Data Mining, International Educational Data Mining
Society, 126-133.
Joseph Bergin. 2000. Fourteen Pedagogical Patterns. In Proceedings of the 5th European Conference on Pattern Languages of
Programs, Universitaetsverlag Konstanz, Germany, 1-49.
Joseph Bergin. 2003. Teaching polymorphism with elementary design patterns. In Companion of the 18th annual ACM
SIGPLAN conference on Object-oriented programming, systems, languages, and applications (OOPSLA ’03). ACM, New
York, NY, 167-169.
Joseph Bergin, Helen Sharp, Jane Chandler, Marianna Sipos, Jutta Eckstein, Markus Völter, Mary Lynn Manns, Eugene
Wallingford, and Klaus Marquardt. 2012. Pedagogical patterns: advice for educators. Joseph Bergin Software Tools.
Benjamin S. Bloom. 1974. Time and learning. American psychologist, 29(9), 682.
Jan Borchers. 2001. A Pattern Approach to Interaction Design. John Wiley & Sons, Inc., New York, NY.
Zachary Broderick, Christine O'Connor, Courtney Mulcahy, Neil Heffernan, and Christina Heffernan. 2011. Increasing parent
engagement in student learning using an intelligent tutoring system. Journal of Interactive Learning Research 22, 4,
Association for the Advancement of Computing in Education, Chesapeake, VA, 523-550.
Carnegie Learning. n.d. Retrieved February 1, 2015 from http://www.carnegielearning.com/
Hao Cen, Kenneth Koedinger, and Brian Junker. 2007. Is Over Practice Necessary? Improving Learning Efficiency with the
Cognitive Tutor through Educational Data Mining. Frontiers in Artificial Intelligence and Applications, 158, 511.
Eric S. Chung, Jason I. Hong, James Lin, Madhu K. Prabaker, James A. Landay, and Alan L. Liu. 2004. Development and
evaluation of emerging design patterns for ubiquitous computing. In Proceedings of the 5th conference on Designing
interactive systems: processes, practices, methods, and techniques (DIS ’04). ACM, New York, NY, 233-242.
Ruth C. Clark, and Richard E. Mayer. 2011. E-learning and the science of instruction: Proven guidelines for consumers and
designers of multimedia learning. John Wiley & Sons.
Doug Clow. 2014. Data wranglers: human interpreters to help close the feedback loop. In Proceedings of the Fourth
International Conference on Learning Analytics and Knowledge (LAK ’14). ACM, New York, NY, 49-53.
Coursera - Free Online Courses from Top Universities. n.d. Retrieved February 1, 2015 from https://www.coursera.org/
Andy Dearden, Janet Finlay, Liz Allgar, and Barbara McManus. 2002. Evaluating pattern languages in participatory design.
In CHI'02 extended abstracts on Human factors in computing systems, ACM, 664-665.
David E. DeLano. 1998. Patterns mining. The Pattern Handbook, 87-96.
Andrea De Lucia, Vincenzo Deufemia, Carmine Gravino, and Michele Risi. 2010. Improving behavioral design pattern detection
through model checking. In 2010 14th European Conference on Software Maintenance and Reengineering (CSMR), IEEE,
664-665.
Michael Derntl. 2004. The Person-Centered e-Learning pattern repository: Design for reuse and extensibility. In World
Conference on Educational Multimedia, Hypermedia and Telecommunications, Association for the Advancement of
Computing in Education, Lugano, Switzerland, 3856-3861.
Milton B. Dobrin and Carl H. Savit. 1960. Introduction to geophysical prospecting. McGraw-Hill, New York, NY.
A. B. Broughton Edge and T.H. Laby. 1931. The Principles and Practice of Geophysical Prospecting. Cambridge University
Press, New York, NY.
Jing Dong, Yongtao Sun, and Yajing Zhao. 2008. Design pattern detection by template matching. In Proceedings of the 2008
ACM symposium on Applied computing, ACM, 765-769.
Francesca A. Fontana and Marco Zanoni. 2011. A tool for design pattern detection and software architecture
reconstruction. Information sciences, 181(7), 1306-1324.
Sherri S. Frizell and Roland Hübscher. 2011. Using Design Patterns to Support E-Learning Design. In Instructional Design:
Concepts, Methodologies, Tools and Applications. IGI-Global, Hershey, PA. 114-134.
Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. 1995. Design Patterns: Elements of Reusable Object-
Oriented Software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA.
Elissaveta Gourova, Kostadinka Toteva, and Yanka Todorova. 2012. Audit of knowledge flows and critical business processes.
In Proceedings of the 17th European Conference on Pattern Languages of Programs. ACM, New York, NY.
Neil T. Heffernan and Cristina L. Heffernan. 2014. The ASSISTments Ecosystem: Building a Platform that Brings Scientists
and Teachers Together for Minimally Invasive Research on Human Learning and Teaching. International Journal of
Artificial Intelligence in Education 24, 4, 470-497.
Georgina Holden, Nicole Schadewitz, and Chrysi Rapanta. 2010. Patterns for the creation of elearning content and activities in
a university setting. In Proceedings of the 15th European Conference on Pattern Languages of Programs (EuroPLoP ’10).
ACM, New York, NY, USA.
Takashi Iba and Mami Sakamoto. 2011. Learning patterns III: a pattern language for creative learning. In Proceedings of the
18th Conference on Pattern Languages of Programs (PLoP ’11). ACM, New York, NY, USA.
Paul Salvador Inventado and Peter Scupelli. 2015. Towards an open, collaborative repository for online learning system design
patterns. eLearning Papers, 42(Design Patterns for Open Online Teaching), 14-27.
Paul Salvador Inventado and Peter Scupelli. in press. Using Big Data to Produce Patterns for the ASSISTments Online
Learning System. In Proceedings of the 22nd Conference on Pattern Languages of Programs (PLoP15).
Norman L. Kerth and Ward Cunningham. 1997. Using patterns to improve our architectural vision. Software, IEEE, 14(1), 53-59.
Kenneth R. Koedinger, John R. Anderson, William H. Hadley, and Mary A. Mark. 1997. Intelligent tutoring goes to school in
the big city. International Journal of Artificial Intelligence in Education, 8, 30-43.
Kenneth R. Koedinger, Ryan S. J. D. Baker, Kyle Cunningham, Alida Skogsholm, Brett Leber, and John Stamper. 2010. A data
repository for the EDM community: The PSLC DataShop. Handbook of educational data mining, CRC Press, Boca Raton,
FL, 43-56.
Kenneth R. Koedinger, Albert T. Corbett, and Charles Perfetti. 2012. The Knowledge-Learning-Instruction framework:
Bridging the science-practice chasm to enhance robust student learning. Cognitive Science 36, 5, Blackwell Publishing Ltd.,
Hoboken, NJ, USA, 757-798.
Christian Kohls. 2013. The theories of design patterns and their practical implications exemplified for e-learning patterns.
Ph.D. Dissertation. Catholic University of Eichstätt-Ingolstadt, Bavaria, Germany.
Christian Köppe and Mariëlle Nijsten. 2012. A pattern language for teaching in a foreign language: part 1. In Proceedings of
the 17th European Conference on Pattern Languages of Programs (EuroPLoP ’12). ACM, New York, NY.
Tibor Kunert. 2009. User-Centered Interaction Design Patterns for Interactive Digital Television Applications (1st ed.).
Springer-Verlag, London, UK.
Niki Lambropoulos and Panayiotis Zaphiris. 2006. User-Centered Design of Online Learning Communities. IGI Publishing,
Hershey, PA, USA.
Kathleen A. Larson, Frances P. Trees, and D. Scott Weaver. 2008. Continuous feedback pedagogical patterns. In Proceedings of
the 15th Conference on Pattern Languages of Programs, ACM.
Shoujing Li, Xiaolu Xiong, and Joseph Beck. 2013. Modeling student retention in an environment with delayed testing.
Educational Data Mining 2013. International Educational Data Mining Society, 328-329.
Michael J. Mahemoff and Lorraine J. Johnston. 1998. Principles for a Usability-Oriented Pattern Language. In Proceedings of
the Australasian Conference on Computer Human Interaction (OZCHI ’98). IEEE Computer Society, Washington, DC.
Michael Mendicino, Leena Razzaq, and Neil. T. Heffernan. 2009a. Improving learning from homework using intelligent tutoring
systems. Journal of Research on Technology in Education, 331-346.
Michael Mendicino, Leena Razzaq, and Neil T. Heffernan. 2009b. A comparison of traditional homework to computer-supported
homework. Journal of Research on Technology in Education (JRTE), 41, 3, 331-359.
Pat Morgan and Steven Ritter. 2002. An experimental study of the effects of Cognitive Tutor® Algebra I on student knowledge
and attitude. Carnegie Learning, Inc., Pittsburgh, PA.
Theresa Neil. 2014. Mobile Design Pattern Gallery: UI Patterns for Smartphone Apps. O'Reilly Media, Inc., Sebastopol, CA.
Jaclyn Ocumpaugh, Ryan S. J. D. Baker, Sujith Gowda, Neil Heffernan and Cristina Heffernan. 2014. Population validity for
Educational Data Mining models: A case study in affect detection. British Journal of Educational Technology 45, no. 3.
Wiley-Blackwell, Hoboken, NJ, 487-501.
Brian C. Patrick, Ellen A. Skinner, and James P. Connell. 1993. What motivates children's behavior and emotion? Joint effects
of perceived control and autonomy in the academic domain. Journal of Personality and social Psychology 65, 4, 781-791.
Reinhard Pekrun, Thomas Goetz, Wolfram Titz, and Raymond P. Perry. 2002. Academic emotions in students' self-regulated
learning and achievement: A program of qualitative and quantitative research. Educational psychologist 37, 2 (2002),
Taylor & Francis, Philadelphia, PA, 91-105.
Rosalind W. Picard. 1997. Affective computing. MIT press.
Rahul Ramachandran, John Rushing, Amy Lin, Helen Conover, Xiang Li, Sara Graves, U.S. Nair, Kwo-Sen Kuo, and Deborah
K. Smith. 2013. Data Prospecting: A Step Towards Data Intensive Science. IEEE Journal of Selected Topics in Applied
Earth Observations and Remote Sensing, 6(3), 1233-1241.
Doug Rohrer and Kelli Taylor. 2006. The effects of over-learning and distributed practice on the retention of mathematics
knowledge. Applied Cognitive Psychology, 20, 1209-1224.
Lawrence C. Rowan, Simon J. Hook, Michael J. Abrams, and John C. Mars. 2003. Mapping hydrothermally altered rocks at
Cuprite, Nevada, using the advanced spaceborne thermal emission and reflection radiometer (ASTER), a new satellite-
imaging system. Economic Geology, 98, 5, 1019-1027.
Hank Sarkis. 2004. Cognitive Tutor Algebra 1 Program Evaluation: Miami-Dade County Public Schools. The Reliability Group,
Lighthouse Point, FL.
Mike Sharkey. 2011. Academic analytics landscape at the University of Phoenix. In Proceedings of the 1st International
Conference on Learning Analytics and Knowledge (LAK ’11). ACM, New York, NY, 122-126.
Colin Shearer. 2000. The CRISP-DM model: the new blueprint for data mining. Journal of data warehousing, 5(4), 13-22.
John A. Sloboda, Jane W. Davidson, Michael J. A. Howe, and Derek G. Moore. 1996. The role of practice in the development of
performing musicians. British journal of psychology, 87(2), 287-310.
John Sweller and Graham A. Cooper. 1985. The use of worked examples as a substitute for problem solving in learning algebra.
Cognition and Instruction, 2(1), 59-89.
John Sweller. 2004. Instructional design consequences of an analogy between evolution by natural selection and human
cognitive architecture. Instructional science, 32(1-2), 9-31.
Jenifer Tidwell. 2010. Designing interfaces. O'Reilly Media, Inc., Sebastopol, CA.
Michael Tuffiash, Roy W. Roring, and K. Anders Ericsson. 2007. Expert performance in SCRABBLE: implications for the study
of the structure and acquisition of complex skills. Journal of Experimental Psychology: Applied, 13(3), 124.
Douglas K. Van Duyne, James A. Landay, and Jason I. Hong. 2007. The Design of Sites. Prentice Hall, Upper Saddle River, NJ.
Martijn Van Welie, Gerrit C. Van Der Veer, and Anton Eliëns. 2001. Patterns as tools for user interface design. In Tools for
Working with Guidelines. Springer-Verlag, London, UK, 313-324.
Kalyan Veeramachaneni, Franck Dernoncourt, Colin Taylor, Zachary Pardos, and Una-May O’Reilly. 2013. Moocdb: Developing
data standards for mooc data science. In Workshops Proceedings of AIED 2013, 17-24.
Skyler Whorton. 2013. Can a computer adaptive assessment system determine, better than traditional methods, whether
students know mathematics skills? Master’s thesis, Computer Science Department, Worcester Polytechnic Institute.
Wen-Hsiung Wu, Yen-Chun Jim Wu, Chun-Yu Chen, Hao-Yun Kao, Che-Hung Lin, and Sih-Han Huang. 2012. Review of
trends from mobile learning studies: A meta-analysis. Computers & Education, 59(2), 817-827.
Michael V. Yudelson, Kenneth R. Koedinger, and Geoffrey J. Gordon. 2013. Individualized bayesian knowledge tracing models.
In Artificial Intelligence in Education, Springer Berlin Heidelberg, 171-180.
... Discussions related to the methodology in this paper were based on our experience in applying the methodology on ASSISTments data. More details can be found in Inventado and Scupelli (in press), and Inventado and Scupelli (2015). ...
... However, the RCT between hints and worked examples could be applied in other online learning systems. The worked examples pattern generated using the methodology (Inventado & Scupelli, 2015) is presented in Appendix A. ...
... Worked examples (Inventado & Scupelli, 2015) Context: An online learning system allows teachers to give exercises and assignments to their students. Teachers can select problems, specify the sequence and conditions for presenting problems, and assign an exercise or homework to students. ...
Article
Full-text available
Design patterns are high quality solutions to known problems in a specific context that guide design decisions. Typically, design patterns are mined and evaluated through four methods: expert knowledge, artifact analysis, social observations, and workshops. For example, experts discuss: knowledge, interpretations of artifacts, social patterns, and clarity of patterns. In this paper, we introduce a fifth method, a data-driven design pattern production (3D2P) method to produce design patterns and conduct randomized controlled trials as a means to evaluate applied design patterns. We illustrate the 3D2P method in the context of online learning systems (OLSs) that are difficult to create, update and maintain. To overcome such challenges, we propose an open repository for OLS design patterns, evaluation data, and implementation examples. On the repository, researchers can collaborate in the six stages of the pattern lifecycle (i.e., prospecting, mining, writing, evaluation, application, applied evaluation). The repository provides five benefits: researchers from different backgrounds can (a) collaborate on design pattern production; (b) perform distributed tasks in parallel; (c) share results for mutual benefit; (d) test patterns across varied systems and domains to explore pattern generalizability and robustness; and (e) promote design patterns to elevate OLS quality.
... The design patterns presented in this paper were uncovered using the data-driven design pattern production (3D2P) methodology. 3D2P is a four-step iterative process used to uncover design patterns from collected data (Inventado & Scupelli 2015a). As illustrated in Figure 1, 3D2P starts with pattern prospecting to find interesting relationships in the data. ...
... Finally, Mastery Learning refers to patterns that deal with the design of math problems or problem sets to promote skill mastery. Some of these design patterns have recently been published, which can be found in Inventado and Scupelli (2015a), Inventado and Scupelli (2015b), Inventado and Scupelli (2016a), and Inventado and Scupelli (2016b). ...
... This may indicate students' preference to learn from worked solutions, and its positive effects on their learning. More details about the data used, the methodology used for analysis, and the results are presented in (Inventado & Scupelli 2015a). ...
Conference Paper
Full-text available
Increasingly, many institutions and students benefit from online learning systems each year. For example, in 2016 Massive Open Online Courses (MOOCS) reported as many as 16 million enrolled students and online tutoring systems reported over half a million enrolled students. In the literature, many design patterns capture online learning system designs for class management, discussion facilitation, lecture delivery, and feedback. In this paper, we describe design patterns that describe finer-grained activities within online learning systems such as the design of problem-solving activities and their associated learning support. The three patterns presented in this paper describe designs for constructing math-problem content and corresponding learning support for students who answer these problems – Mastery Learning Templates, Explain Worked Solutions, and Scaffold Problems with Guide Questions. We uncovered these patterns using the data-driven design pattern production (3D2P) methodology on data collected from the ASSISTments online learning system. The design patterns we describe were mined from data on student interactions with an online learning system and were linked to existing learning-science literature.
... Finally, Mastery Learning refers to patterns that deal with the design of math problems or problem sets to promote skill mastery. Some of these design patterns have recently been published, which can be found in Inventado and Scupelli [2015a], Inventado and Scupelli [2015b], Inventado and Scupelli [2016a], and Inventado and Scupelli [2016b]. The pattern format used in this paper separates each part with a heading much like other pattern formats (c.f., [Carlsson 2004, Dearden andFinaly 2006]). ...
... The patterns presented in this paper are part of a pattern language for math problems and corresponding learning support in online learning systems, which is currently under development. Some related patterns were presented in Inventado and Scupelli [2015a] and Inventado and Scupelli [2015b]. The design patterns are also compiled in an online design pattern repository (http://learningenvironmentslab.org/openpatternrepository) and work is being done to foster collaboration between design pattern authors, domain experts, and design pattern users to continue writing, evaluating, and refining design patterns. ...
Conference Paper
Full-text available
Online learning systems have been gaining popularity, but are not without their challenges. For example, enrollment in MOOCs has slowed down, which is attributed to the lack of sustainability. The success of online learning systems is heavily influenced by how they were designed. For example, results from a recent study showed that the incorporation of learning activities to instruction increased learning gains as much as six times. Although there are many design patterns that may be applied in the design of learning activities, they usually operate at a higher level. There is a need for design patterns that address problems in implementing learning activities in online learning systems. Specifically, the goal of the four design patterns presented is to help students learn to represent math problems. The patterns presented in this paper are part of a pattern language for creating math problems and corresponding learning support in online learning systems.
... The continuous improvement of technology-enhanced learning environments based on the results of the experiments they support is an important part of their engineering [16]. Applied to learning analytics, this improvement cycle makes it possible to discover new patterns from the analysis of system traces, which in turn generate data that can be exploited for research and for improving the learning environment [17]. Building on our results and following this methodology, we integrated into the Lab4CE platform two new features based on two different design patterns. ...
Article
Full-text available
This study draws on usage-analysis methods to identify new factors of learning success. Its originality lies in exploring the links between learners' behavior during practical lab activities and their academic performance. From traces collected in experiments conducted in an authentic learning context, we first discover a set of recurrent action patterns and then define learning strategies at a higher level of abstraction. The results show correlations between certain strategies and learner performance: building a complex action step by step, and reflecting before executing an action, are two strategies applied more frequently by students who performed well on the final test. Based on these results, we implemented new guidance and tutoring tools in our practical-work platform for both students and teachers.
... Indeed, the continuous improvement of TEL-based systems, according to experimental findings resulting from their usage, is a critical part of the re-engineering process [14]. Applied to learning analytics, this enhancement cycle makes it possible to discover new design patterns and to generate new data for research about and improvement of TEL [15]. ...
Conference Paper
Full-text available
This study analyzes students' behaviors in a remote laboratory environment in order to identify new predictors of academic success. It investigates relations between learners' activities during practical sessions and their performance on the final assessment test. Based on learning analytics applied to data collected from an experiment conducted with our remote lab dedicated to computer education, we discover recurrent sequential patterns of actions that lead us to define learning strategies as indicators at a higher level of abstraction. Results show that some of the strategies are correlated with learners' performance. For instance, constructing a complex action step by step, or reflecting before submitting an action, are two strategies applied more often by higher-performing learners than by other students. While our proposals are domain-independent and can thus apply to other learning contexts, the results of this study led us to implement new visualization and guidance tools for both students and instructors in our remote lab environment.
... Indeed, the continuous improvement of TEL-based systems, according to experimental findings resulting from their usage, is a critical part of the re-engineering process [5]. Applied to learning analytics, this enhancement cycle makes it possible to discover new design patterns and to generate new data for research about and improvement of TEL [6]. ...
Conference Paper
This study analyzes students' behavior in our remote laboratory environment and aims at identifying behavioral patterns during a practical session that lead to better learning outcomes, in order to predict learners' performance and to automatically guide students who might need more support. Based on data collected from an experiment conducted in an authentic learning context, we discover recurrent sequential patterns of actions that lead us to define learning strategies as indicators at a higher level of abstraction. Results show that some of the strategies are correlated with learners' performance on the final assessment test. For instance, constructing a complex action step by step, or reflecting before submitting an action, are two strategies applied more often by higher-performing learners than by others. These findings led us to provide both students and instructors with new guidance and tutoring tools in our remote lab environment.
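The recurrent sequential patterns of actions described in the two abstracts above can be prospected with simple frequency counting over logged learner sessions. The sketch below is a minimal, hypothetical illustration (the action names, session data, and function are our own invention, not taken from the cited work): it counts contiguous action subsequences across sessions and keeps the ones that recur.

```python
from collections import Counter

def frequent_action_sequences(sessions, length=2, min_support=2):
    """Count contiguous action subsequences (n-grams) across sessions
    and keep those occurring at least `min_support` times."""
    counts = Counter()
    for actions in sessions:
        for i in range(len(actions) - length + 1):
            counts[tuple(actions[i:i + length])] += 1
    return {seq: n for seq, n in counts.items() if n >= min_support}

# Hypothetical logs: each session is an ordered list of learner actions.
sessions = [
    ["edit", "check", "submit"],
    ["edit", "check", "edit", "check", "submit"],
    ["edit", "submit"],
]
patterns = frequent_action_sequences(sessions, length=2, min_support=2)
# Recurring pairs such as ("check", "submit") would flag a
# "reflection before submitting an action" style strategy.
```

Real studies would use dedicated sequential pattern mining algorithms (e.g., PrefixSpan-style miners) that also handle gaps and time constraints; this counting sketch only conveys the basic idea of surfacing recurrent action sequences from traces.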
... The design patterns presented in the following section were uncovered using the data-driven design pattern production (3D2P) methodology. 3D2P is a four-step iterative process used to uncover design patterns from data collected in a particular domain [Inventado & Scupelli 2015b]. As illustrated in Figure 1, 3D2P starts by prospecting the data for interesting relationships. ...
Conference Paper
Full-text available
Online learning systems have been gaining popularity, but are not without their challenges. For example, enrollment in MOOCs has slowed down, which is attributed to a lack of sustainability. Research shows that introducing learning activities with appropriate learning support may increase students' learning gains as much as six times compared to just delivering content in MOOCs. These results emphasize the importance of high-quality designs for learning activities and learning support in online learning systems. Patterns for designing learning activities and learning support are necessary because most pedagogical patterns operate at a higher level (e.g., course design, lectures, interaction with students). This paper presents three design patterns that may help online learning system stakeholders, such as system developers, content creators, and teachers, to design learning support for math problem-solving activities in online learning systems.
... It is also feasible to predict the problematic behaviors of hint misuse or hint abuse. Previous research has analyzed relationships between problem-related features (e.g., problem length, number of hints available, hint length) and student affect, behavior, and learning [3, 11, 13, 19]. Among other findings, hint length has been positively correlated with gaming the system [3], a behavior incorporating help abuse that is associated with poorer learning outcomes [21, 23]. ...
Conference Paper
Full-text available
On-demand help in intelligent learning environments is typically linked to better learning, but may lead to longer completion times. The present work provides an analysis of how students interacted with a summer learning assignment when on-demand help was available, compared to when it was not. When hints were available from the start, students were more likely to delay work, compared to students for whom step-wise hints were only available after the third problem. When hints were always available, participants took significantly more time to complete a mastery learning assignment. We interpret this difference in time to complete the assignment as an opportunity to re-engage in productive math learning.
Conference Paper
Pedagogical design patterns offer high-quality solutions to known educational problems. Design patterns are meant to guide teachers' pedagogical decisions to improve student learning in varied learning contexts such as intelligent tutoring systems, traditional classrooms, online learning systems, and so forth. In theory, design patterns are written so that they are applicable to multiple learning contexts, but, in practice, we wonder if pedagogical design patterns intended specifically for one learning environment can be used in other learning environments. We explore this theoretical question for practical reasons. Over the past three years, we applied existing pedagogical design patterns and wrote new pedagogical patterns to enhance student feedback for math problems in an intelligent tutoring system (ITS) called ASSISTments. It was more difficult than expected for two reasons: there are few pedagogical design patterns specifically for ITSs and contextual features for design patterns developed for other learning environments make them either too general or too specific to apply to an ITS. For example, feedback design patterns that involve interpreting learners' misconceptions may be easy for teachers to apply in traditional classroom settings, but difficult for ITSs because current algorithms poorly predict student misconceptions. In this paper, we explore the adaptability of design patterns to different learning environments using guide questions and contextual design pattern features we developed from our experience of adapting design patterns to ITSs.
Article
Full-text available
Design patterns are high quality solutions to known problems in a specific context that guide design decisions. Typically, design patterns are mined and evaluated through four methods: expert knowledge, artifact analysis, social observations, and workshops. For example, experts discuss: knowledge, interpretations of artifacts, social patterns, and clarity of patterns. In this paper, we introduce a fifth method, a data-driven design pattern production (3D2P) method to produce design patterns and conduct randomized controlled trials as a means to evaluate applied design patterns. We illustrate the 3D2P method in the context of online learning systems (OLSs) that are difficult to create, update and maintain. To overcome such challenges, we propose an open repository for OLS design patterns, evaluation data, and implementation examples. On the repository, researchers can collaborate in the six stages of the pattern lifecycle (i.e., prospecting, mining, writing, evaluation, application, applied evaluation). The repository provides five benefits: researchers from different backgrounds can (a) collaborate on design pattern production; (b) perform distributed tasks in parallel; (c) share results for mutual benefit; (d) test patterns across varied systems and domains to explore pattern generalizability and robustness; and (e) promote design patterns to elevate OLS quality.
Article
Full-text available
This study explores the ability of an Intelligent Tutoring System (ITS) to increase parental engagement in student learning. A parental notification feature was developed for the web-based ASSISTments ITS that allows parents to log into their own accounts and access detailed data about their students' performance. Parents from a local middle school were then invited to create accounts and answer a survey assessing how engaged they felt they were in their students' education. A 60 day exploratory study was run during which messages were sent home to parents regarding what their students were studying in class and how they were performing. After having them take a post-survey, it was found that parents felt significantly more engaged in their students' education. Additionally, the messages increased how frequently parents logged in to check reports on their students' performance data using the ASSISTments system. Qualitative feedback from both parents and teachers was very positive.
Chapter
Design patterns have received considerable attention for their potential as a means of capturing and sharing design knowledge. This chapter provides a review of design pattern research and usage within education and other disciplines, summarizes the reported benefits of the approach, and examines design patterns in relation to other approaches to supporting design. Building upon this work, it argues that design patterns can capture learning design knowledge from theories and best practices to support novices in effective e-learning design. This chapter describes the authors' work on the development of design patterns for e-learning. It concludes with a discussion of future research for educational uses of design patterns.
Chapter
Technology is meant to make life easier and to raise its quality. Our interaction with technology should be designed according to human needs instead of us being required to adapt to technology. Even so, technology may change quickly and people and their habits change slowly (Norman 1988). Ease of use has been recognised as a key factor influencing users’ acceptance of new technologies, as in the Technology Acceptance Model (TAM) framework (Davis 1989; Kaasinen 2005) and in the innovation diffusion framework (Rogers 1995). This book addresses interactive digital television (iTV) as an instance of new technology. With the aim of supporting user acceptance of iTV, the focus of this book is on the usability of iTV applications.
Conference Paper
In this paper we present a pattern language for learners who want to learn better without killing their creativity. In order to convey the 'knacks' of learning, we apply the method of pattern languages, which was originally proposed in architectural design and became famous in software design. Our proposed pattern language for creative learners, which we named "Learning Patterns," consists of 40 patterns. Each pattern is described in the same format: pattern number, pattern name, introduction, illustration, context, problem, forces, solution, actions, and related patterns. Although the Learning Patterns were originally developed to support the learning of university students, we think they can be applied to learners in various situations, such as engineering, business, science, and everyday life, owing to their fine-grained abstract descriptions as a pattern language. In this paper, we show an overview of the 40 patterns and four patterns in detail. Note that five other patterns were presented in our previous paper at PLoP09 [1].