Formative feedback in Digital Lofts: Learning
environments for real world innovation
Matthew W. Easterday
easterday@northwestern.edu
Daniel Rees-Lewis
daniel2011@u.northwestern.edu
Elizabeth Gerber
egerber@northwestern.edu
Northwestern University, Evanston, IL, 60208
ABSTRACT
Civic innovators design real-world solutions to societal problems.
Teaching civic innovation presents serious challenges in
classroom orchestration because facilitators must manage a
complex learning environment (which may include community
partners, open-ended problems and long time scales) and cannot
rely on traditional classroom orchestration techniques (such as
fixed schedules, pre-selected topics and simplified problems).
Here we consider how digital lofts--online learning environments for civic innovation--might overcome orchestration challenges
through the use of badges, cases, crowd-feedback, semi-
automatically created instruction, self-assessment triggered group
instruction, social media, and credentialing. Together these
features create three types of feedback loops: a crowd critique
loop in which learners receive formative feedback on their
innovation work from a broader community, a case development
loop in which examples of student work are semi-automatically
created to provide instruction, and a learner-driven instructional
loop, in which self-assessments determine which group instruction
is provided. Researching and developing digital lofts will help us
to understand how to support real-world innovation across design
disciplines such as engineering, policy, writing and even science;
and result in technologies for disseminating and scaling civic
innovation education more broadly.
Keywords
Digital lofts, feedback, civic innovation, online learning
environments
1. INTRODUCTION
Many of the challenges facing our society, such as global warming, poverty, and illiteracy, are political problems that cannot be solved through engineering alone. For example, to create environmentally sustainable cities we would have to train engineers to redesign the land, water, energy, and information systems of the city. And while we do train engineers to design membrane filtration, renewable energy, and mass transit systems, we do not teach them about changing economic policy to promote conservation, energy initiatives to discourage fossil fuel use, or zoning rules to encourage mass transit. We do teach engineers about complex mechanical systems and how to communicate effectively as a team, but we don't teach them that sustainable infrastructure might also require changes in policy. Even when we do teach them about policy, we don't teach them how to change it, and even if they did know how to change it, they couldn't change it alone, leaving us with engineers who are at the mercy of policy problems rather than able to solve them. In short, good technology plus bad policy means no impact (Easterday, 2012).
To overcome societal challenges, we must train civic innovators
who can identify, design and engineer solutions to societal
problems. Civic innovators must be able to develop, modify, and
implement ideas while navigating ambiguous problem contexts,
overcoming setbacks, and persisting through uncertainty in their
community. To become civic innovators, learners must gain
experience identifying and tackling complex, ill-structured design
challenges that are not easily solved within a fixed time frame.
Civic innovation education is thus a kind of service learning that
“...integrates meaningful community service with instruction and
reflection to enrich the learning experience, teach civic
responsibility, and strengthen communities...” (ETR Associates,
2012). However, unlike other forms of service learning, civic innovation focuses on design--whereas service learning might ask students to pick up trash in a riverbed to motivate learning about ecology, civic innovation would use the same activity to motivate students to identify, design, and engineer solutions that reduce environmental pollution.
But embedding learning in real-world activities makes civic
innovation difficult to teach: individual mentoring can be effective
but expensive; extra-curricular environments provide flexibility
but insufficient guidance; and classroom instruction is too rigid
and time-bound for solving complex societal problems.
Embedding learning in real-world activities creates a serious
challenge of classroom orchestration. Classroom orchestration
(Dillenbourg & Jermann, 2010) involves satisfying the constraints
of curriculum, assessment, time, energy, space, etc. required to
promote learning in a given context. Embedding learning in the real world increases the orchestration challenge because orchestration techniques that work in the classroom (such as using simple problems or making students complete assignments at the same pace) can't be used when learners are working on real-world
problems. Adding community clients and professional design
mentors only makes orchestration more challenging.
New cyberlearning technologies, such as web 2.0, social media, reputation systems, and crowdsourcing, offer new ways to orchestrate learning environments for civic innovation. Just as we
create instructional labs to teach science, the purpose of this
project is to develop instructional lofts to teach innovation. Our
research question is: how might we create Digital Lofts: online learning platforms for teaching civic innovation that overcome
the orchestration challenge?
Knowing how to design digital lofts that overcome the orchestration challenge will allow us to amplify teaching resources to make civic innovation education feasible. Design principles for Digital Lofts would allow us to overcome the orchestration challenge not just for civic innovation education, but
for project-based learning environments as well, allowing us to
design learning environments that are more sustainable, more
easily scaled to new contexts, and more like real life.
2. BACKGROUND
Advantages of civic innovation learning communities
What do civic innovation learning environments look like? Civic
innovation learning communities: (a) have pro-public missions,
(b) teach learners how to design solutions to real problems, (c) are
led by learners and supported by faculty and professional experts,
and (d) extend nationally through a network of chapters. For
example, in GlobeMed, students work on international health
challenges. In Engineers for a Sustainable World, students work
on projects that promote environmental, economic, and social
sustainability. It is important to stress the pro-public mission of
these learning communities. Learners are tackling problems that
require them to address societal challenges and to understand
policy issues. For example, by tackling the problem of energy
sustainability, students are forced to consider the environmental,
economic and legal policies that constrain the effectiveness of
technological interventions. For this project, we consider Design
for America, which provides an ideal model of a learning
community for civic innovation.
Figure 1. Design for America's community of practice. The 14 studios are hosted on university campuses and interact with, but do not replace, existing curricula. Studios incorporate local
clients, mentors and alumni and communicate directly with DFA
Headquarters.
Design for America (DFA) is a learner-directed, extracurricular
service-learning environment that is succeeding at developing
civic innovators. Universities host on-campus DFA studios in
which student teams work on self-selected civic innovation
projects throughout the academic year, applying the skills and
expertise they've gained through academic coursework (Figures 1 and 2). Student teams identify challenges in healthcare, the environment, and education in their local communities, such as reducing hospital-acquired infections and reducing water waste in cafeterias. They work with organizational partners to understand
stakeholder needs, ideate, prototype, test, and implement
solutions. During the annual 4-day Leadership Studio,
experienced student leaders train new student leaders in studio
management and leadership.
Design for America was conceived by co-author Gerber during
the 2008 presidential election to engage university students in
solving civic issues using human-centered design. As an assistant
professor of design, Gerber joined student co-founders Mert Iseri,
Yuri Malina, and Hannah Chung, to start the first studio at
Northwestern University. Currently, there are 14 studios hosted
by universities throughout the country (including Stanford,
Virginia Tech, and Northwestern) involving 1800 students (58% women), aged 18-30, from over 60 majors, working on over 50 projects; 15 faculty mentors; and 80 professional mentors. And
the number of studios is expected to grow to 30 by 2015. In just
four years, DFA has produced two start-ups that have raised over
$1.5 million in funding. DFA has been featured in Fast Company,
Oprah, and the Chicago Tribune.
Figure 2: Design for America students learn civic innovation
through projects that require designing, building, and
implementing solutions.
Findings from surveys, daily diaries, interviews, and observations
suggest that DFA students develop confidence in their ability to
act as civic innovators through successful task completion, social
persuasion, and vicarious learning in communities of practice with
clients, peers, industry professionals, and faculty. Furthermore, students attribute their achievement of learning outcomes outlined by the Accreditation Board for Engineering and Technology (including identifying, formulating, and solving problems; functioning on a multidisciplinary team; communicating effectively; and knowledge of contemporary issues) to their participation in Design for America (Gerber, Marie Olson, & Komarek, 2012; ABET Engineering Accreditation Commission, 2011).
Design for America's civic innovation model follows many recommendations of the learning sciences for improving motivation and transfer, such as using real-world problems that require the design of meaningful products with social relevance.
DFA encourages students to work on authentic problems (Shaffer
& Resnick, 1999) to motivate learning and transfer. Students
identify and select projects and self-direct the innovation and
discovery process including observation, idea generation,
prototyping, and testing (Kolodner, Crismond, Gray, Holbrook, & Puntambekar, 1998; Puntambekar & Kolodner, 2005). By
trying to apply their knowledge to a problem, students come to
understand what they know and when they need more information
(Edelson, 2001). Like service learning (Furco, 1996), DFA
increases civic awareness, interest in the real needs of people, and
contemporary issues by focusing on innovating solutions to local
community challenges (Gerber et al., 2012).
Unlike traditional classrooms, Design for America’s community
of practice (Figure 1) expands beyond the physical boundaries of
the student community to include experienced, local
professionals, local clients and community members, as well as
beyond the temporal boundaries of student life as learners
continue to participate in projects as alumni. Students' involvement in a community of practice (Lave & Wenger, 1991) includes engaging with peer mentors, professionals, and faculty in a non-evaluative environment over an extended timeframe.
Communities of practice foster innovation self-efficacy (i.e., learners' belief in their ability to innovate; Gerber et al., 2012), and such beliefs influence goal setting, effort, persistence, learning, and attribution of failure (Bandura, 1997; Deci & Ryan, 1987; Ryan & Deci, 2000). Students select real-world
challenges (Shaffer & Resnick, 1999) that are personally
meaningful, build and test solutions to problems, and share their
work with the community through review sessions (Papert & Harel, 1991; Papert, 1980; Resnick, 2009; Kolodner, Owensby, & Guzdial, 2004). Because DFA projects are
extracurricular, they conclude when ideas are implemented, rather
than when the academic term ends.
Orchestration challenges in civic innovation learning
communities
While learning environments for civic innovation have many
potential advantages, they also face many challenges. Civic
innovation teachers face serious orchestration challenges because
they have to teach many different project teams, with different
levels of expertise, working on different problems for different
community clients. The orchestration challenge makes civic
innovation difficult to teach well.
As in many extracurricular organizations, DFA students often suffer from a lack of guidance. Our needs analysis of Design for
America found that, unsurprisingly, learners would benefit from
more scaffolding and feedback on the innovation process
including: (a) planning and conducting research on their project
challenge; (b) using initial research to inform proposed solutions;
(c) selecting and conducting appropriate design activities for their
project challenge; and (d) discounting initial solutions if these
solutions prove not to be viable. While DFA has been very successful at attracting learners, these learners report that frustrations from lack of progress make them question their commitment to the work they are undertaking. And while leaders
(student facilitators) experienced in project work and trained at
the DFA leadership studio require less support, they find helping
other students very challenging. In interviews, these student
leaders asked for more granular ‘how to’ guides from DFA
headquarters.
DFA students also often struggle to access available resources that could help them in their projects. While students are aware that they can reach out to experts within the DFA network generally, they struggle to identify specific individuals or instructional resources that can help them. Learners often fail to ask for support from more experienced members of the community because they don't know whom or what to ask. Similarly, learners find it challenging to locate helpful instruction: they report floundering for long periods of time trying to find resources and not knowing where to start looking.
In fact, these issues reflect general challenges in project-based learning and common criticisms of minimally guided instruction. Without sufficient guidance, learners become lost, confused, and frustrated, which can lead to misconceptions (Kirschner, Sweller, & Clark, 2006; Hardiman, Pollatsek, & Well, 1986; Brown & Campione, 1996). Furthermore, students often need to develop additional help-seeking skills in order to learn effectively (Gall, 1981; Pintrich, 2004; Ryan, Pintrich, & Midgley, 2001). The learning sciences provide myriad ways to offer guidance, such as explanations, worked examples, process worksheets, prompts, and many more (Scardamalia, Bereiter, & Steinbach, 1984; Reiser, 2004; Edelson, Gordin, & Pea, 1999; Puntambekar & Kolodner, 2005; Kolodner et al., 2004).
Note that we do not wish to re-litigate the discovery versus direct instruction debate here--achieving the proper balance between providing and withholding assistance (a.k.a. the assistance dilemma) remains a fundamental and enduring question in the learning sciences (Koedinger & Aleven, 2007). Our point is
merely that civic innovation facilitators cannot effectively deliver
any instructional model (constructionist, direct, or otherwise)
because they cannot effectively orchestrate learning at DFA
studios. In other words, we cannot answer the fundamental
questions about civic innovation without addressing orchestration.
The need for new orchestration technologies
In a typical classroom, orchestration is relatively easy. But the
traditional classroom approaches to orchestration don’t work for
civic innovation. For example, to make classroom teaching
easier, we often give students identical, simplified problems (in
the words of one DFA student: “well-defined problems on a
platter.”) We use schedules that keep learners moving at the same
pace so we can teach the same skills and knowledge to the whole
class. This is an easy way to orchestrate groups of learners when
we have a limited set of teaching resources.
Unfortunately, when we use simplified, artificial problems, we
don’t give students a chance to practice the skills for coping with
design complexity we want them to learn. We also destroy the
motivational benefits that come from working on real world
problems. For example, if we want students to practice "scoping" (i.e., identifying important but tractable problems to solve), then we need to give them ill-defined problems that can be scoped in different ways and that may not fit neatly into the academic calendar. If we want them to practice communicating
with clients, then we must accept unclear and changing project
goals. If we want to take advantage of students' intrinsic motivation to address real world problems on topics they feel are important, then we must accept a certain level of idiosyncrasy in projects. But once we start letting different groups work on
different, more complex problems, at different speeds, working
with clients in the community, and so on, it becomes almost
impossible for a single teacher to orchestrate learning in a
productive way.
Could technology help teachers orchestrate civic innovation learning environments? Existing online learning management
platforms do not address the orchestration problem. Many of the
most popular general-purpose online platforms assume a
classroom model and are designed for distributing online books or
lectures, such as academic platforms like the Open Learning
Initiative (Lovett, Meyer, & Thille, 2008), MIT Opencourseware
(Massachusetts Institute of Technology, 2012), and Coursera
(Severance, 2012), which do not help us orchestrate design
projects. Other technologies provide no pedagogical help but
rather tools for managing files and conversations, such as
Blackboard (Blackboard Inc., 2012), Canvas (Canvas, 2012), Lore
(Lore, 2012), and Sakai (Sakai Foundation, 2012). Some
technologies for orchestration focus on only small portions of the
challenge such as managing a single activity (Dillenbourg &
Jermann, 2010). And while there has been great progress in
technologies for orchestrating scientific inquiry (Peters & Slotta,
2010), such as BioKIDS (Songer, 2006), BGuILE (Reiser et al., 2001; Sandoval & Reiser, 2004), Inquiry Island (White et al.,
2002), KIE (Bell, Davis, & Linn, 1995), and WISE (Slotta, 2004),
these platforms are not appropriate for teaching civic innovation.
Solving the orchestration challenge is not simply another application of technology to teaching; it is absolutely essential for
creating the civic innovation learning environments urgently
needed to prepare learners for the societal challenges that await
them.
3. TECHNOLOGICAL INNOVATION
Orchestration of civic innovation is difficult because there are too
many moving pieces: different learners, with different abilities,
working on different (complex) design problems, at different
speeds, with different community clients. We could solve the
orchestration challenge by giving each project team its own professional design teacher, but doing so is costly. However, with
new technologies like web 2.0, crowdsourcing, and social media,
we may be able to reduce the orchestration challenge for teachers
and give them additional resources to overcome it. Specifically:
we can use web 2.0 to scaffold the innovation process and provide
flipped, just-in-time instruction relevant to students’ current goals;
we can use crowd-feedback to provide learners with more
frequent, higher quality feedback on their progress; we can use
recommender systems to semi-automatically create case libraries
of successful designs; and we can automatically monitor group
progress so teachers can give the right instruction to the right
group at the right time.
Design hypothesis. Our initial design hypothesis is that we
can teach civic innovation by using what we call Digital Lofts to
overcome the orchestration challenge. Digital Lofts are online
learning platforms for supporting learning in real-world contexts that:
1. use badges to scaffold the innovation process,
2. provide a student-generated and curated case-library linked to
badges to teach design,
3. use crowd-feedback to increase the frequency and quality of
feedback,
4. use recommender systems to semi-automatically create case-
based instructional material,
5. use self-assessment to trigger maximally relevant group
instruction,
6. use social media to facilitate participation and support, and
7. use recognition and credentialing to facilitate help-seeking and
connections to resources.
These features allow us to create a curriculum that dynamically
adapts to the needs of the learner, that is, to merge curriculum and
data. By merging curriculum+data, we can reduce the challenge
of orchestrating civic innovation to a manageable level.
To understand how Lofts help us orchestrate civic innovation, we
can think of Lofts as supporting 3 interrelated feedback loops: (a)
a crowd-critique loop in which students receive feedback on
their work through project critiques, (b) a case development loop
in which student work is used to semi-automatically create case
studies of successful and unsuccessful designs which are then
used to teach design principles, and (c) a learner-driven instructional loop in which students' self-assessments trigger face-to-face group instruction taught by facilitators (Figure 3).
The crowd-critique loop
Designers and engineers often organize their work according to an
innovation process. Figure 4 shows the high level steps or goals
of a simplified innovation process consistent with the processes
used by leading design and engineering firms like IDEO and
Cooper (Dubberly, 2005), by the Stanford d.school (Beckman & Barry, 2007), or defined in engineering education standards (Massachusetts Department of Education, 2006). In Figure 4, the first stage of design is to "focus" by identifying a potential topic to address, such as "water conservation at universities." The second stage is to "immerse," or study the user needs, constraints
and technologies involved in the issue. The third stage is to
“define” a specific problem that can be solved, such as “reduce
water use in the college cafeteria by 30%.” The fourth stage is to
“ideate” by generating a wide range of potential solutions. The
fifth stage is to “build” the design using sketches, prototypes and
high-fidelity implementations that realize the design idea. The
sixth stage is to “test” the design. Even in simplified models like
that in Figure 4, the design process is applied in an iterative and
non-linear manner.
Figure 4. Badges scaffold complex design processes for the
novice into smaller, more manageable challenges and identify
members who have passed the challenges as potential mentors.
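To make the scaffolding concrete, the sketch below shows one way a Loft might encode the staged process of Figure 4 as badge definitions. This is our illustration only; the stage names come from Figure 4, but every field, value, and resource here is an assumption rather than the Loft's actual data model.

from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    # The six high-level goals of the simplified process in Figure 4.
    FOCUS = "focus"      # identify a potential topic to address
    IMMERSE = "immerse"  # study user needs, constraints, and technologies
    DEFINE = "define"    # state a specific, solvable problem
    IDEATE = "ideate"    # generate a wide range of potential solutions
    BUILD = "build"      # realize the idea in sketches and prototypes
    TEST = "test"        # evaluate the design

@dataclass
class Badge:
    stage: Stage
    challenge: str  # the mini-challenge posed to the team
    resources: list = field(default_factory=list)  # flipped instruction linked to the badge
    rubric: list = field(default_factory=list)  # criteria reviewers apply during critique

focus_badge = Badge(
    stage=Stage.FOCUS,
    challenge="Scope an important but tractable issue, e.g., hospital-acquired infections",
    resources=["reading: scoping heuristics", "video: framing a design challenge"],
    rubric=["issue matters to stakeholders", "issue is tractable for a student team"],
)
# Because design is iterative and non-linear, badges carry no enforced
# ordering; teams may return to earlier stages at any time.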
Design can be thought of as a process of learning (Beckman & Barry, 2007; Owen, 1998). Designers construct new knowledge through observations that yield insights; insights support frameworks that inspire ideas that lead to innovative solutions (Beckman & Barry, 2007). Through this process, people construct knowledge (Dong, 2005), moving back and forth from the analytic phase of design, which focuses on finding and discovery, to the synthetic phase, which focuses on invention and making (Owen, 1998). Beckman and Barry (2007) describe knowledge creation through the design process as movement between concrete experiences and abstract conceptualization, reflective observation, and active experimentation. Inductive and deductive practices support the construction of new knowledge that designers use to shape the environment in ways that did not previously exist.

Figure 3. Digital Lofts merge curriculum and data in three integrated feedback loops: the crowd-critique loop, the case development loop, and the learner-driven instructional loop.
So how can teachers guide design groups working on different,
complex problems? One of the most important ways to promote
learning is to provide learners with scaffolding and feedback on
their work.
The Loft’s crowd-critique loop scaffolds the design process and
provides feedback using project critiques. The crowd-critique
loop starts with project badges (like Girl Scout badges) that break
the complex design process into a series of manageable mini-
challenges (Figure 4). For example, for the focus badge, learners
have to scope an important but tractable issue such as hospital
acquired infections; for the immerse badge, learners have to
conduct user-research on their target population to better
understand their needs. In the second step of the crowd-critique
loop, learners use the resources attached to each badge to help
them solve the challenge--each badge is linked to flipped (blended) instructional material (Khan, 2012; Lovett et al., 2008) that includes resources, principles, and examples that can
help the learners solve the design challenge. For example, the
“build” badge for a web design project might include a video
lecture on writing html, an interactive javascript tutorial, on-line
readings about web-design principles, or examples of the different
stages of creating a well-designed website. In the final step of the
crowd-critique loop (after students have worked on a badge and
submitted their work to the Loft), the Loft solicits feedback on
students’ work from professional design mentors and peers who
have previously completed the badge. The mentors and peers use
the badge assessment rubrics to provide feedback to students.
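The following fragment sketches how this final step might be implemented: when a team submits badge work, the Loft solicits critique only from mentors and from peers who have already earned that badge, each prompted to respond against the badge's rubric. All names here are hypothetical, not the Loft's actual API.

from dataclasses import dataclass

@dataclass
class Member:
    name: str
    earned_badges: set  # badges this member has completed
    is_mentor: bool = False

@dataclass
class Submission:
    team: str
    badge: str  # e.g., "immerse"
    artifact_url: str

def eligible_critiquers(submission, community):
    # Mentors, plus peers who have already earned this badge.
    return [m for m in community
            if m.is_mentor or submission.badge in m.earned_badges]

def request_critiques(submission, community, rubric):
    # Each eligible reviewer responds to every rubric criterion, keeping
    # the feedback formative and tied to the badge's learning goals.
    for reviewer in eligible_critiquers(submission, community):
        notify(reviewer, submission, rubric)

def notify(reviewer, submission, rubric):
    # Stand-in for the Loft's real notification channel (email, feed, etc.).
    print(f"Asking {reviewer.name} to critique {submission.team}'s "
          f"'{submission.badge}' work on {len(rubric)} criteria")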
The widespread use of badges in online games has led to a surge
of interest in badges for learning (Duncan, 2011). However, civic
innovation students are already intrinsically motivated to work on
real world design problems, so it doesn’t make sense to use
badges as extrinsic rewards that might decrease motivation (Deci,
Koestner, & Ryan, 1999) and encourage gaming the system
(Kraut & Resnick, 2012). So instead, Lofts use badges to scaffold
the design process and communicate learning goals, which should
increase learning (Ambrose, Bridges, DiPietro, Lovett, &
Norman, 2010).
Combining flipped instruction with face-to-face teaching can be more effective than either face-to-face or online-only teaching alone (Scheines, Leinhardt, Smith, & Cho, 2005; Lovett et al., 2008). Our flipped instructional material will use a guided experiential learning approach shown to improve learning outcomes relative to traditional project-based learning (Velmahos et al., 2004; Clark, 2004/2008).
Providing high quality feedback to learners is one of the most effective ways to increase learning (Hattie, 2009; Hattie & Timperley, 2007; Ambrose et al., 2010). The Loft provides
learners with two underutilized sources of feedback: professional
mentors and peers. Giving peers well-designed assessment
rubrics can make their feedback as effective as instructor feedback
(Sadler & Good, 2006). The Loft thus uses crowd-feedback to
increase the frequency and quality of feedback available to
learners.
But what if students refuse to submit work or mentors and peers
refuse to review it (Kraut & Resnick, 2012)? Our needs analysis
found that DFA students are hungry for feedback on their project
and very willing to submit work to get this feedback. Professional
design mentors are also very willing to provide this feedback
assuming that students ‘drive’ the process by providing them with
well-prepared material from their design process (which the
badges help students to do).
The case development loop
Developing useful learning resources can be a challenging task, especially with design teams that may all be pursuing different
directions at different times--how can cyberlearning technologies
help produce effective and engaging learning resources?
Our needs analysis found that DFA students prefer to share design
lessons through stories about how they created their designs and
how well those designs worked. In the learning sciences, this falls
under the heading of case-based reasoning, where each story
describes an example or case of a design that worked (or didn't work) along with an explanation of the key features that led to the result, in which context, and so on. Teaching effectively with
cases has been well studied in several forms, including learning
from cases (Kolodner, 1993; 1997), analogies (Gentner,
Loewenstein, & Thompson, 2003), and worked examples (Ward
& Sweller, 1990; Salden, Aleven, Renkl, & Schwonke, 2009).
Unfortunately, DFA students' learning from cases suffers many limitations: (a) it is done informally, so knowledge of particular cases is not spread widely; (b) students do not teach effectively with cases, sometimes hiding illustrative mistakes, promoting their projects rather than teaching, and failing to highlight the key design lesson or principle; and (c) students do not present contrasting cases that would allow learners to understand the deep features and the context of applicability of a case. Such knowledge sharing is typical of large distributed organizations (Argote, 1999).
Furthermore, it is difficult to create case-based teaching material
both in terms of creating a useful library of cases and in creating
ways for learners to find the appropriate case when needed
(Kolodner, 1997).
Digital Lofts overcome this challenge through a case development
loop. In the case development loop, the Loft uses assessments of
students' work to semi-automatically create case libraries--examples of student work that include reflections about what worked, what didn't, and in what context. First, the crowd-feedback
from the crowd-critique loop is used to recommend particularly
successful and unsuccessful examples of each design step,
producing sets of contrasting cases. Second, an instructional
designer creates curated cases by selecting cases that best
illustrate key design principles. The instructional designer then
refines these cases. Finally, the contrasting cases are presented as instructional resources linked to each badge.
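A minimal sketch of the recommendation step follows, under the assumption (ours) that each critique leaves a numeric rubric score on a submission: the highest- and lowest-rated submissions for a badge become candidate contrasting cases queued for the instructional designer.

from statistics import mean

def contrasting_case_candidates(scores_by_submission, k=3):
    # scores_by_submission: dict mapping a submission id to the list of
    # numeric rubric scores its crowd critiques produced.
    rated = [(sid, mean(scores))
             for sid, scores in scores_by_submission.items() if scores]
    rated.sort(key=lambda pair: pair[1])
    weak = [sid for sid, _ in rated[:k]]     # candidate unsuccessful examples
    strong = [sid for sid, _ in rated[-k:]]  # candidate successful examples
    return strong, weak

# Example: critiques of four "define" submissions.
strong, weak = contrasting_case_candidates(
    {"team-a": [4, 5, 4], "team-b": [2, 1], "team-c": [5, 5], "team-d": [3]}, k=1)
# strong == ["team-c"], weak == ["team-b"]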
The crowd-feedback and badging systems of the Loft reduce the
orchestration challenge of providing relevant and engaging
instruction to a manageable level in several ways. First, the Loft
continually collects student work from multiple campuses, so we get the initial material for the case library "for free" through crowdsourcing, the production of work by a distributed crowd of people (Von Ahn & Dabbish, 2004). Second, project critiques act as a recommender system (Kiesler, Kraut, Resnick, & Kittur, 2012), sorting student work into contrasting cases. Third, cases are already linked to particular phases of the design process through the badges, so we can automatically generate an index that links each case to the relevant goal the student is working on. After the Digital Loft has done the heavy lifting of generating, recommending, and indexing cases, the instructional designers can make the final case selection. Instructional designers can also edit the cases to improve their quality (Puntambekar & Kolodner, 2005; Kolodner et al., 2004) and present related cases together to encourage comparison, thus improving the chances of transfer (Thompson, Gentner, & Loewenstein, 2000; Gentner et al., 2003).
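Because every submission arrives tagged with its badge, this index can be generated mechanically; the hypothetical sketch below groups curated cases by badge so a team sees only the cases relevant to its current goal.

from collections import defaultdict

def build_case_index(curated_cases):
    # curated_cases: iterable of (badge, case) pairs chosen and edited by
    # the instructional designer in the case development loop.
    index = defaultdict(list)
    for badge, case in curated_cases:
        index[badge].append(case)
    return index

def cases_for_goal(index, badge):
    # Shown alongside the badge's other flipped-instruction resources.
    return index.get(badge, [])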
The learner-driven instructional loop
One of the difficulties of teaching groups of students of varying
abilities engaged in projects at differing stages is how to provide
face-to-face group instruction in a relevant and timely manner.
When should a facilitator lead a “user research” workshop if each
group is at a different stage of the design process? While the Loft
tailors feedback and instruction to each project team, there is still
a need for group instruction taught by a knowledgeable facilitator.
In the learner-driven instructional loop, students' self-assessments
of their abilities and interest in learning different design skills are
collected and monitored by the Loft. When enough students
indicate a desire to learn a certain skill set, facilitators are notified
that there is an opportunity to teach a workshop on an in-demand
topic. The learner-driven instructional loop begins after students
complete a badge. At this point, the Loft reminds learners to
update their “individual development plans” (Beausaert, Segers,
& Gijselaers, 2011). An individual development plan (IDP) is a
list of skills along with the learner's self-assessment of their current ability level and desire to learn each skill. As students take on new
badge challenges, the skills necessary for completing that badge
are added to their IDP. Once a given number of students at a
DFA studio or classroom express an interest in learning a
particular skill, facilitators are notified that they should conduct a
particular workshop (and provided with a facilitator’s guide for
that workshop). Because these workshops are triggered by
students’ current interests, the workshops maximally target
students' interests and needs. While students may not be perfectly accurate in their self-assessments, feedback from mentors and peers provides a reality check on those self-assessments (i.e., negative feedback from mentors will prompt students to reassess their skills).
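The trigger itself can be a simple tally over IDPs, as in the sketch below; the field names and the demand threshold are assumptions rather than the Loft's actual parameters.

from collections import Counter

def skills_in_demand(idps, threshold=5):
    # idps: list of dicts like {"user": "...", "wants_to_learn": {"user research"}}.
    # Returns the skills enough students currently want to learn.
    demand = Counter(skill for idp in idps for skill in idp["wants_to_learn"])
    return [skill for skill, n in demand.items() if n >= threshold]

def check_and_notify(idps, facilitators, threshold=5):
    for skill in skills_in_demand(idps, threshold):
        for facilitator in facilitators:
            # Stand-in for a real notification; the Loft would also attach
            # the facilitator's guide for the corresponding workshop.
            print(f"Notify {facilitator}: enough demand for a '{skill}' workshop")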
People who implement career goal plans report greater success and satisfaction in their careers (Ng, Eby, Sorensen, & Feldman, 2005), so IDPs for civic innovation should increase the success
and satisfaction of novice civic innovators on their journey to
become more successful designers.
4. CONCLUSION
The study of Digital Lofts will lead to empirically grounded principles for designing online environments for civic innovation education, contributing to a number of research areas including digital badges, crowdsourcing, learning-by-cases, design-based
learning, and online learning communities. Because many
domains can be framed as design disciplines including
engineering (making technologies), policy (creating government
programs), English language arts (creating texts and speeches),
and even science (creating research studies), principles for online
innovation education apply to myriad disciplines. And by
coordinating groups of learners and mentors throughout the design
process, Digital Lofts blur the boundaries between informal and
formal learning environments: making extracurricular environments more effective and classroom environments more
like real life. This project seeks to lay a theoretical foundation for
understanding the broader ecosystem of online, social, design-
based learning environments.
More broadly, our goal is to create a widely adopted online
learning environment that will support civic innovation training.
The Digital Loft platform will be disseminated broadly, targeting
use in the teaching, training, and learning of civic innovation. This
will fill an urgent need for learning environments that educate
civic innovators who can solve our greatest societal challenges.
Foreseeable impacts on higher education and society include:
increasing the number of graduates motivated and capable of
broader societal impact, improved education, curricular changes,
and support for future interventions. Successful output of this
project will help to foster and support a culture of innovation in
our future workforce. By developing a scalable, cost-effective, online platform for design-based learning across many disciplines (design, engineering, speaking, etc.), Digital Lofts have the
potential to fundamentally transform online learning.
5. ACKNOWLEDGMENTS
We are grateful to the participants of Design for America for their
enthusiastic participation. This work was generously funded by
the Digital Media Learning Competition supported by the
MacArthur and Mozilla Foundations.
REFERENCES
ABET Engineering Accreditation Commission. (2011). Criteria
for accrediting engineering programs. Baltimore, MD: ABET.
Retrieved from
http://www.abet.org/uploadedFiles/Accreditation/Accreditation
_Process/Accreditation_Documents/Current/eac-criteria-2012-
2013.pdf
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., &
Norman, M. K. (2010). How learning works: 7 research-based
principles for smart teaching. San Francisco, CA: Jossey-Bass.
Argote, L. (1999). Organizational learning: Creating, retaining
and transferring knowledge. New York: Springer.
Bandura, A. (1997). Self-efficacy: The exercise of control. W. H.
Freeman and Company.
Beausaert, S., Segers, M., & Gijselaers, W. (2011). The use of a
personal development plan and the undertaking of learning
activities, expertise-growth, flexibility and performance: The
role of supporting assessment conditions. Human Resource
Development International, 14(5), 527-543.
Beckman, S. L., & Barry, M. (2007). Innovation as a learning
process: Embedding design thinking. California Management
Review, 50(1), 25.
Bell, P., Davis, E. A., & Linn, M. C. (1995). The knowledge
integration environment: Theory and design. In The first
international conference on computer support for collaborative
learning (pp. 14-21).
Blackboard Inc. (2012). Blackboard [Computer Software].
Retrieved from http://www.blackboard.com/
Brown, A. L., & Campione, J. C. (1996). Psychological theory
and the design of innovative learning environments: On
procedures, principles, and systems. In L. Schauble & R. Glaser
(Eds.), Innovations in learning: New environments for
education (pp. 289-322). Mahway, NJ: Erlbaum.
Instructure. (2012). Canvas [Computer software]. Retrieved from http://www.instructure.com/
Clark, R. E. (2008). Design document for A guided experiential
learning course. Submitted to satisfy contract DAAD 19-99-D-
0046-0004 from TRADOC to the Institute for Creative
Technologies and the Rossier School of Education, University
of Southern California. (Original work published 2004)
Retrieved from
http://www.cogtech.usc.edu/publications/gel_design_document.
pdf
Deci, E. L., & Ryan, R. M. (1987). The support of autonomy and the control of behavior. Journal of Personality and Social Psychology, 53(6), 1024-1037.
Deci, E. L., Koestner, R., & Ryan, R. M. (1999). A meta-analytic
review of experiments examining the effects of extrinsic
rewards on intrinsic motivation. Psychological Bulletin, 125(6),
627-668.
Dillenbourg, P., & Jermann, P. (2010). Technology for classroom orchestration. In M. S. Khine & I. M. Saleh (Eds.), New science of learning (pp. 525-552). New York: Springer.
Dong, A. (2005). The latent semantic approach to studying design
team communication. Design Studies, 26(5), 445-461.
Dubberly, H. (2005). How Do You Design? Retrieved from
http://www.dubberly.com/wp-
content/uploads/2008/06/ddo_designprocess.pdf
Duncan, A. (2011, September 15). Digital badges for learning: Remarks by Secretary Duncan at the 4th annual launch of the MacArthur Foundation Digital Media and Lifelong Learning Competition. Retrieved from http://www.ed.gov/news/speeches/digital-badges-learning
Easterday, M. W. (2012). Cyber-civics 101. Presentation at the NSF Cyberlearning Summit, January 18, 2012, Washington, DC. Retrieved from http://www.youtube.com/watch?v=UBPeDVR2nOo&feature=youtu.be
Edelson, D. C. (2001). Learning-for-Use: A framework for the design of technology-supported inquiry activities. Journal of Research in Science Teaching, 38(3), 355-385.
Edelson, D. C., Gordin, D. N., & Pea, R. D. (1999). Addressing
the challenges of inquiry-based learning through technology
and curriculum design. Journal of the Learning Sciences, 8(3-
4), 391-450.
ETR Associates. (2012). What is service-learning? [Web page]
Retrieved from http://www.servicelearning.org/what-service-
learning.
Furco, A. (1996). Service-learning: A balanced approach to
experiential education. Expanding Boundaries: Serving and
Learning, 1, 1-6.
Gall, S. N. -L. (1981). Help-seeking: An understudied problem-
solving skill in children. Developmental Review, 1(3), 224 -
246. doi:10.1016/0273-2297(81)90019-8
Gentner, D., Loewenstein, J., & Thompson, L. (2003). Learning
and transfer: A general role for analogical encoding. Journal of
Educational Psychology, 95(2), 393.
Gerber, E. M., Marie Olson, J., & Komarek, R. L. D. (2012).
Extracurricular design-based learning: Preparing students for
careers in innovation. International Journal of Engineering
Education, 28(2), 317.
Hardiman, P. T., Pollatsek, A., & Well, A. D. (1986). Learning to
understand the balance beam. Cognition and Instruction, 3(1),
63-86.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge.
Hattie, J., & Timperley, H. (2007). The power of feedback.
Review of Educational Research, 77(1), 81-112.
Khan, S. (2012). The Khan Academy. [Web page] Retrieved from http://www.khanacademy.org/
Kiesler, S., Kraut, R., Resnick, P., & Kittur, A. (2012). Regulating
behavior in online communities. In P. Resnick & R. Kraut
(Eds.), Evidence-based social design: Mining the social
sciences to build online communities (pp. 125-178). Cambridge,
MA: MIT Press.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal
guidance during instruction does not work: An analysis of the
failure of constructivist, discovery, problem-based, experiential,
and inquiry-based teaching. Educational Psychologist, 41(2),
75-86.
Koedinger, K. R., & Aleven, V. (2007). Exploring the assistance
dilemma in experiments with cognitive tutors. Educational
Psychology Review, 19(3), 239-264.
Kolodner, J. L. (ed.) (1993). Case-based learning. Boston:
Springer.
Kolodner, J. L. (1997). Educational implications of analogy: A view from case-based reasoning. American Psychologist, 52(1), 57.
Kolodner, J. L., Crismond, D., Gray, J., Holbrook, J., &
Puntambekar, S. (1998). Learning by design from theory to
practice. Proceedings of the International Conference of the
Learning Sciences, 16-22.
Kolodner, J. L., Owensby, J. N., & Guzdial, M. (2004). Case-
based learning aids. In Handbook of research for educational
communications and technology (Vol. 2, pp. 829-861).
Kraut, R., & Resnick, P. (2012). Encouraging contribution to
online communities. In P. Resnick & R. Kraut (Eds.), Evidence-
based social design: Mining the social sciences to build online
communities (pp. 21-76). Cambridge, MA: MIT Press.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.
Lore. (2012). Lore. [Computer Software] Retrieved from
http://lore.com/
Lovett, M., Meyer, O., & Thille, C. (2008). The open learning
initiative: Measuring the effectiveness of the OLI statistics
course in accelerating student learning. Journal of Interactive
Media in Education. Retrieved from
http://jime.open.ac.uk/2008/14
Massachusetts Department of Education. (2006). Massachusetts
science and technology/engineering curriculum framework.
Massachusetts. Retrieved from
www.doe.mass.edu/frameworks/scitech/1006.pdf
Massachusetts Institute of Technology. (2012). MIT
opencourseware [Computer Software]. Retrieved from
http://ocw.mit.edu/index.htm
Ng, T. W. H., Eby, L. T., Sorensen, K. L., & Feldman, D. C.
(2005). Predictors of objective and subjective career success: A
meta-analysis. Personnel Psychology, 58(2), 367-408.
Owen, C. L. (1998). Design research: Building the knowledge
base. Design Studies, 19(1), 9-20.
Papert, S. (1980). Mindstorms: Children, computers, and powerful
ideas. Basic Books, Inc.
Papert, S., & Harel, I. (1991). Situating constructionism. In
Constructionism (pp. 1-11).
Peters, V. L., & Slotta, J. D. (2010). Scaffolding knowledge
communities in the classroom: New opportunities in the web
2.0 era. In Designs for learning environments of the future (pp.
205-232). New York: Springer.
Pintrich, P. R. (2004). A conceptual framework for assessing
motivation and self-regulated learning in college students.
Educational Psychology Review, 16(4), 385-407.
Puntambekar, S., & Kolodner, J. L. (2005). Toward implementing
distributed scaffolding: Helping students learn science from
design. Journal of Research in Science Teaching, 42(2), 185-
217.
Reiser, B. J. (2004). Scaffolding complex learning: The
mechanisms of structuring and problematizing student work.
The Journal of the Learning Sciences, 13(3), 273-304.
Reiser, B. J., Tabak, I., Sandoval, W. A., Smith, B. K.,
Steinmuller, F., & Leone, A. J. (2001). BGuILE: Strategic and
conceptual scaffolds for scientific inquiry in biology
classrooms. In Cognition and instruction (Vol. 25, pp. 263-
306).
Resnick, M. (2009). Scratch: Programming for all.
Communications of the ACM, 52(11), 60-67.
doi:10.1145/1592761.159277
Ryan, A. M., Pintrich, P. R., & Midgley, C. (2001). Avoiding
seeking help in the classroom: Who and why? Educational
Psychology Review, 13(2), 93-114.
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic
motivations: Classic definitions and new directions.
Contemporary Educational Psychology, 25(1), 54-67.
doi:10.1006/ceps.1999.1020
Sadler, P. M., & Good, E. (2006). The impact of self-and peer-
grading on student learning. Educational Assessment, 11(1), 1-
31.
Sakai Foundation. (2012). Sakai [Computer Software]. Retrieved
from http://www.sakaiproject.org/
Salden, R. J. C. M., Aleven, V. A. W. M. M., Renkl, A., &
Schwonke, R. (2009). Worked examples and tutored problem
solving: Redundant or synergistic forms of support? Topics in
Cognitive Science, 1(1), 203-213. doi:10.1111/j.1756-
8765.2008.01011.x
Sandoval, W. A., & Reiser, B. J. (2004). Explanation-driven
inquiry: Integrating conceptual and epistemic scaffolds for
scientific inquiry. Science Education, 88(3), 345-372.
Scardamalia, M., Bereiter, C., & Steinbach, R. (1984).
Teachability of reflective processes in written composition.
Cognitive Science, 8(2), 173-190.
Scheines, R., Leinhardt, G., Smith, J., & Cho, K. (2005).
Replacing lecture with web-based course materials. Journal of
Educational Computing Research, 32(1), 1-26.
Severance, C. (2012). Teaching the world: Daphne Koller and Coursera. Computer, 45(8), 8-9.
Shaffer, D. W., & Resnick, M. (1999). "Thick" authenticity: New media and authentic learning. Journal of Interactive Learning Research, 10(2), 195-215.
Slotta, J. D. (2004). The web-based inquiry science environment
(WISE): Scaffolding knowledge integration in the science
classroom. In M. C. Linn, E. A. Davis, & P. Bell (Eds.),
Internet environments for science education (pp. 203-232). Mahwah, NJ: Lawrence Erlbaum.
Songer, N. B. (2006). BioKIDS: An animated conversation on the
development of complex reasoning in science. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 355-370). New York: Cambridge University Press.
Thompson, L., Gentner, D., & Loewenstein, J. (2000). Avoiding
missed opportunities in managerial life: Analogical training
more powerful than individual case training. Organizational
Behavior and Human Decision Processes, 82(1), 60-75.
Velmahos, G. C., Toutouzas, K. G., Sillin, L. F., Chan, L., Clark,
R. E., Theodorou, D., & Maupin, F. (2004). Cognitive task
analysis for teaching technical skills in an inanimate surgical
skills laboratory. American Journal of Surgery, 187(1), 114-9.
Von Ahn, L., & Dabbish, L. (2004). Labeling images with a
computer game. In Proceedings of the SIGCHI conference on
human factors in computing systems (pp. 319-326).
Ward, M., & Sweller, J. (1990). Structuring effective worked
examples. Cognition and Instruction, 7(1), 1-39.
White, B., Frederiksen, J., Frederiksen, T., Eslinger, E., Loper, S.,
& Collins, A. (2002). Inquiry island: Affordances of a multi-
agent environment for scientific inquiry and reflective learning.
In Proceedings of the fifth international conference of the
learning sciences (ICLS). Mahwah, NJ: Erlbaum.
This unique and ground-breaking book is the result of 15 years research and synthesises over 800 meta-analyses on the influences on achievement in school-aged students. It builds a story about the power of teachers, feedback, and a model of learning and understanding. The research involves many millions of students and represents the largest ever evidence based research into what actually works in schools to improve learning. Areas covered include the influence of the student, home, school, curricula, teacher, and teaching strategies. A model of teaching and learning is developed based on the notion of visible teaching and visible learning. A major message is that what works best for students is similar to what works best for teachers - an attention to setting challenging learning intentions, being clear about what success means, and an attention to learning strategies for developing conceptual understanding about what teachers and students know and understand. Although the current evidence based fad has turned into a debate about test scores, this book is about using evidence to build and defend a model of teaching and learning. A major contribution is a fascinating benchmark/dashboard for comparing many innovations in teaching and schools.