Decision Sciences Journal of Innovative Education
Volume 17 Number 3
July 2019
© 2019 Decision Sciences Institute
EMPIRICAL RESEARCH
A Meta-Analysis of the Relationship
Between Experiential Learning
and Learning Outcomes
Gerald F. Burch†
Department of Management, Texas A&M University-Commerce, Commerce, TX,
e-mail: gerald.burch@tamuc.edu (†Corresponding author)
Robert Giambatista
Management/Marketing Department, University of Scranton, Scranton, PA,
e-mail: robert.giambatista@scranton.edu
John H. Batchelor
Department of Management, University of West Florida, Pensacola, FL,
e-mail: jbatchelor1@uwf.edu
Jana J. Burch
Department of Education, Southwestern Assemblies of God University, Tarleton State
University Stephenville, TX, e-mail: jburch@tarleton.edu
J. Duane Hoover
Department of Management, Texas Tech University, Lubbock, TX,
e-mail: duane.hoover@ttu.edu
Nathan A. Heller
Department of Management, Tarleton State University Stephenville, TX,
e-mail: heller@tarleton.edu
ABSTRACT
Experiential learning activities have been used for over 40 years with the hope that
they increase students' learning. However, a definitive study showing their overall
effectiveness has not been produced. The purpose of this study is to address this gap
in the literature. This meta-analysis examined a 43-year span and identified 13,626
journal articles, dissertations, theses, and conference proceedings written about
experiential learning; only 89 of these studies contained empirical data with
both a treatment and a control group. Meta-analysis of these studies shows that students
experienced superior learning outcomes when experiential pedagogies were employed.
Further, learning outcomes were almost half a standard deviation higher (d = .43)
in classes employing experiential learning pedagogies versus traditional learning
environments. This review definitively, and quantitatively, shows the importance
of experiential learning activities. We use these results to discuss future research
areas that need to be addressed based on our analysis of potential moderators
and provide recommendations on how to best employ experiential learning
pedagogies.
Subject Areas: Experiential learning, Meta-analysis, Service learning, and
Pedagogy.
INTRODUCTION
Experiential learning continues to be a major consideration in improving learning
outcomes as witnessed by recent research (Coker et al., 2017; Hajshirmohammadi,
2017; Morris, 2016; Tomkins & Ulus, 2016) and from increased calls from outside
agencies to use experiential learning activities in higher education (AACSB, 2013).
A 5-year study at Elon University showed that key learning outcomes are driven
by the depth and breadth of experiential learning college students are exposed to
during their studies (Coker et al., 2017). Despite this, some scholars have claimed
a lack of established and confirmed empirical relationships between experiential
learning activities and learning outcomes (Anderson & Lawton, 1997; Freedman
& Stumpf, 1980; Washbush & Gosenpud, 1994), while others have argued against
the theoretical underpinnings (Holman, Pavlica, & Thorpe, 1997). Such criticisms
have impeded the progress and integration of experiential learning into higher
education, despite calls urging its importance (Rynes, Trank, Lawson, & Ilies,
2003).
Evidence of this interest can be seen in the number of empirical studies that have
been conducted to determine the relationship between experiential learning pedagogies
and learning outcomes. And yet, no meta-analysis has been conducted.
Conversely, there have been several meta-analytic studies of service learning
(Andrews, 2007; Celio, Durlak, & Dymnicki, 2011; Conway, Amel, & Gerwien,
2009; Stewart & Wubbena, 2015; Yorio & Ye, 2012), which is based on
Dewey’s (1938) argument that learning is the interaction of knowledge and skills
with experience (Stewart & Wubbena, 2015). In a review of 103 samples of ser-
vice learning, Conway et al. (2009) posited that socially responsible knowledge
involves experience-based education. Similarly, Yorio and Ye (2012, p. 11) stated
that “service learning provides students with a type of reality and reciprocity
experience, allowing them to develop a deeper understanding of social issues.”
Service learning is, therefore, experiential learning conducted in the context of a
community setting. However, the field of experiential learning is much larger and
encompasses in-class and out-of-class learning experiences that focus on cognitive,
social, and personal outcomes. The goal of this research is to explore this gap in
the literature and to conduct a meta-analysis to measure the strength of the rela-
tionship between experiential learning activities and learning outcomes across all
studies in experiential learning that met our criteria for inclusion. Our study further
investigates several situational moderators that affect the magnitude of learning out-
comes and provides the necessary evidence to use them in a variety of educational
settings.
THEORETICAL FOUNDATION AND HYPOTHESES
“Experiential learning exists when a personally responsible participant cogni-
tively, affectively, and behaviorally processes knowledge, skills, and/or atti-
tudes in a learning situation by a high level of active involvement” (Hoover &
Whitehead, 1975: 25). Learning is, therefore, “the process whereby knowledge is
created through the transformation of experience” (Kolb, 1984: 41). Experiential
Learning Theory (Kolb, 1984) further states that a concrete experience leads to
students reflectively thinking about the experience until they can create abstract
conceptualizations that link the experience to previous experiences, which ulti-
mately results in active experimentation when the student determines how they
will put the experience into future actions. The experience does more than transfer
facts. It changes the way the student will think from that moment
forward. This means that experiential learning stands in contrast to traditional lec-
ture, which is often criticized for its passive style, lack of realism, and failure to
provide whole-person, learning-focused outcomes (Godfrey, Illies, & Berry, 2005;
Hoover, Giambatista, Sorenson, & Bommer, 2010).
Experiential learning activities and experiential programs are often posited
as an antidote to passivity and lack of engagement in learning, yet they are not
a required part of the curricula at many colleges and universities (McCarthy &
McCarthy, 2006). Nonetheless, there are growing calls for improved integration
of concepts and development of “soft skills” (Duncan & Dunifor, 2012; Mitchell,
Skinner & White, 2010), acquiring a global perspective, appreciation for social
responsibility, and guidance toward ethical behavior (Navarro, 2008; Waddock &
Lozano, 2013). Many instructors have added experiential learning activities to their
classes based on personal observations and positive results in the research literature
(Huang, Chen, & Chou 2016), and some institutions have encouraged the use of
these learning activities across their campuses (Coker, et al., 2017; Morris, 2016).
Concerns Over Experiential Learning
In spite of the movement to increase experiential learning activities, there have
been warnings that experiential learning “can be a powerful pedagogical tool, but
it is not a panacea for increasing engagement” (Dean & Jolly, 2012: 231). Boyatzis
et al. (1995: 235) further stated that “you can lead students to an experience, but
you cannot make them learn.” Others have noted that students benefit differently
from experience (Allen & Young, 1997; Bunker & Webb, 1992). It is likely that
some of this caution originates from criticisms of Kolb’s Experiential Learning
Theory model as posited in 1984 (Holman, Pavlica, & Thorpe, 1997; Hopkins,
1993; Miettinen, 1998; Reynolds, 1999; Vince, 1998) according to Kayes (2002).
These critiques proposed that there was not enough underlying theory to substan-
tiate Kolb’s model. However, the literature indicates that such caution is largely
unwarranted. It is important to keep in mind that Kolb’s theorizing was based on
the foundations of Kurt Lewin's precedent-setting work on action research and
field theory in social science (Lewin, 1946, 1951). Just as Lewin's foundational
work has stood the test of time, so has the experiential learning theory as posited
by Kolb (1984).
Reframing the Experiential Learning Debate to Focus on Learning
Outcomes
Defining experiential learning as a single theory can certainly lead to debate. There-
fore, we focus on the agreement on the concepts surrounding experiential learning.
Meyer (2003) stated that “there is different knowledge to be gained through ac-
tive participation in, as opposed to passive reception of, learning” (Meyer, 2003:
353). Similarly, Kolb stated that learning is “the process whereby knowledge is
created through the transformation of experience” (Kolb, 1984: 41). And finally,
Hoover and Whitehead (1975) claimed that “experiential learning exists when a
personally responsible participant cognitively, affectively, and behaviorally pro-
cesses knowledge, skills, and/or attitudes in a learning situation by a high level of
active involvement” (Hoover & Whitehead, 1975: 25). Based on these three state-
ments, experiential learning occurs when students actively process an experience
that leads to new knowledge, skills, or insight.
Altering the way one thinks requires repeated cycles of theory, practice, feed-
back, and reflection (Rynes, Trank, Lawson, & Ilies, 2003). Experiential learning
activities allow for increased learning because knowledge is gained through active
participation (Meyer, 2003), and experience acts as a catalyst for dialectic inquiry
(Kayes, 2002). Experiential learning is so ubiquitous that Morrison and Brantner
(1992) claimed that 70% of all adult learning comes from experience.
Experiential learning stands in contrast to most classroom lectures that rely
heavily on semantic memory (Sprenger, 1999). One problem associated with se-
mantic memory is that it requires repetition to help move new information into
long-term storage. Semantic memory can be stimulated by associations, compar-
isons, and similarities (Sprenger, 1999). Experiential learning activities may sig-
nificantly assist in this process since students are given the opportunity to develop
their own associations between new material and previous knowledge.
Similarly, experiential learning exercises allow students time and opportu-
nity to interface with multiple learning dimensions while making comparisons of
old and new knowledge (Hoover, Giambatista, Sorenson, & Bommer, 2010). Ex-
periential learning activities may, therefore, allow students more opportunities to
process information, make connections, consider previously held emotional states
and skills, and opportunities to apply new knowledge. Based on this set of asser-
tions, we propose that when experiential learning activities are present, student
learning is enhanced by allowing the student to actively process an experience that
leads to new knowledge, skills, or insight.
Hypothesis 1. Experiential learning activities will have a positive effect on students' learning outcomes.
Moderator Variables
In the second part of our analysis, we focus on potential moderators of the
relationship between experiential learning activities and learning outcomes. Based
on previous research, and available data explicated within these studies, we were
able to evaluate four moderators: (1) type of learning outcome (understanding
social issues, developing personal insight, or cognitive development); (2) type of
assessment used to measure the outcome (objective vs. subjective); (3) whether or
not students received feedback during the activity; and (4) duration of the activity.
We discuss each moderator in the following paragraphs and provide hypotheses if
previous research has provided sufficient theoretical support for such a moderator.
Learning outcomes have been categorized in service learning studies as un-
derstanding a social issue, developing personal insight, and cognitive development
(Yorio & Ye, 2012). We propose using these learning outcomes since service
learning is a subset of experiential learning. Returning to our conceptualization,
experiential learning occurs when students actively process an experience that
leads to new knowledge, skills, or insight. It is, therefore, possible that the type of
learning outcome moderates the effectiveness of experiential learning.
Higher education has called for an increasing need for students to understand
social issues (Giacalone & Thompson, 2006; Pless, Maak, & Stahl, 2011). Yorio
and Ye (2012) conceptualized understanding social issues as a student’s frame of
reference, which guides decision making in terms of complex social issues like
cultural awareness, tolerance for diversity, understanding community needs, and
responsibility/commitment to engage in future activities for the community. Experiential
learning activities have been shown to be effective in providing students with
the necessary background and time to consider social issues, thereby affecting
their social attitudes. Similarly, Plante, Lackey, and Hwang (2009) demonstrated
how community-based learning immersion trips were successful in increasing
student compassion and empathy. Experiential learning exercises similar to these
may allow students the time and focus to gain new insight on social issues.
While an understanding of social issues helps students to develop an outward
focus on the world and their place in that world, personal insight, by contrast,
focuses on internal reflection and understanding. Self-awareness and self-insight
are common to higher education pedagogies, as witnessed by ubiquitous self-
assessment instruments in many university classes and texts. Education often al-
lows students to alter their perception of themselves by increasing personal aware-
ness, a sense of self-efficacy, compassion, empathy, leadership ability perception,
determination, and persistence. Empirical research conducted in medical educa-
tion has shown how experiential activities increase student self-efficacy (Doyle
et al., 2011) and personal reflection (Aukes, et al., 2008). Similarly, an experiential
learning activity based around gender inequity in the workplace reduced feelings
of reactance and promoted self-efficacy (Zawadzki, Danube, & Shields, 2012).
Other studies have demonstrated that experiential activities go beyond personal
insight and provide the basis for altering personal behavior, including reducing
aggressive behavior (Teglasi & Rothman, 2001), reducing stress (Robert-McComb
et al., 2015), and decreasing chronic combat-related posttraumatic stress disorder
conditions (Gelkopf, et al., 2013; Wolf & Mehl, 2011).
The final learning outcome that we will explore is cognitive development.
Almost six decades ago, Benjamin Bloom and his colleagues proposed a now-classic
hierarchical taxonomy of educational objectives. Bloom's taxonomy, which
has six levels (Bloom, 1954), states that at the lowest level students must develop
knowledge that includes recall of facts, methods, processes, patterns, structures,
or settings. As objectives move up the hierarchy, greater cognitive development is
needed for mastery. Comprehension builds upon knowledge whereby the student
understands what is being communicated and can make use of the material or idea
being communicated. In application, the learner uses abstractions in both particular
and concrete situations. In the fourth level, analysis occurs when the learner can
break down information into its constituent elements or parts so that a relative
hierarchy of ideas is made and relationships with other ideas are made explicit.
The final two levels are synthesis and evaluation, where students put together
elements and parts to form a whole and make judgments about the material for
a given purpose. Anderson and Krathwohl (2001) modified this hierarchy by
replacing Bloom’s original titles with descriptors focused on process. These new
stages are remembering, understanding, applying, analyzing, evaluating, and creating. The
total Anderson and Krathwohl (2001) model is completed by their addition of
content dimension identifiers; these are factual knowledge, conceptual knowledge,
procedural knowledge and metacognitive knowledge.
Recent research in the area of threshold concepts (see Meyer & Land, 2005;
1998) has recommended modifying Bloom’s taxonomy to acknowledge that stu-
dents must integrate conceptual knowledge, procedural knowledge, and factual
knowledge within a discipline to move to the higher cognitive processes of appli-
cation, analysis, synthesis, and evaluation (Burch, Burch, & Heller, 2015). Stu-
dents who have limited understandings or misunderstandings are left with limited
conceptions, which “lead(s) to limited application and significantly reduce(s) the
chance for the student to move any further up Bloom’s taxonomy” (Burch, Burch,
Bradley, & Heller, 2015: 492).
In practice, educators may not be explicitly guided by Bloom’s or more
recent taxonomies, but they presumably distinguish between direct, applied, and
other more advanced learning outcomes in designing pedagogy and activities.
Experiential learning activities provide students with more frequent opportunities
to integrate conceptual knowledge, procedural knowledge, and factual knowledge
within the setting of the discipline (Bradley, Burch, & Burch, 2015). Therefore,
experiential learning activities have the potential to allow students to develop more
complete understandings or conceptions that will lead to more advanced learning.
However, the strength of the relationship between experiential learning and
learning outcomes may vary based on the type of targeted learning outcome as
witnessed in three separate meta-analytic studies of service learning. Conway
et al. (2009) showed that students who were involved in service learning showed
larger mean difference outcomes using academic outcomes (d=.43), followed by
social outcomes (d=.28), and personal outcomes (d=.21), compared to students
who received traditional lecture. Similarly, Celio et al. (2011) reported an overall
effect of using service learning (d=.28), while academic achievement was higher
(d=.43) than developing social skills (d=.30) and gaining attitudes toward self
(d=.28). Finally, Yorio and Ye (2012) stated that service learning resulted in
the highest gain for cognitive development (d=.52), while understanding social
issues (d=.34) and personal insight (d=.28) were still positive, but lower.
Based on the discussion surrounding the types of learning outcomes and the
results of service learning studies, we propose that:
Hypothesis 2. Experiential learning activities will have a positive effect on learn-
ing outcomes, but the strength of the relationship will vary based
on the type of learning outcome (personal insight, understanding
social issues, and cognitive development).
Researchers have chosen to use a variety of measures to evaluate the level of
understanding social issues, developing personal insight, and increasing cognitive
development. In this study, we classified measures as objective or subjective based
on previous research (Koriat, Sheffer, & Ma’ayan, 2002). Objective measures in-
clude grade point average, number of correct answers on a test/quiz, evaluation
from a trained third party, or similar measures. Subjective measures include stu-
dent self-assessment of learning outcomes or qualitative assessments of learning
outcomes. Yorio and Ye (2012) found that subjective measures of learning had
a significantly lower effect on student learning than did objective measures for
cognitive development. This result was most evident for cognitive studies where
the mean effect size for objective measures was .88 while for subjective mea-
sures it was only .38. This shows that students may significantly underestimate the
amount of learning that has occurred. However, this effect was not significant for
understanding social issues (.33 for objective and .37 for subjective) and gaining
personal insight (.34 for objective and .37 for subjective).
This supports comments by other researchers that students may not be able
to effectively evaluate their learning or may respond in a manner that they think the
researcher wants them to respond (Rama, et al., 2000). Koriat, Sheffer, and Ma’ayan
(2002) found that students were slightly overconfident about the knowledge gained
on the first study-test cycle but underconfident from the second cycle on.
It is also possible that in some instances asking students to evaluate their
learning outcomes may be the best, or only, means of gathering data. Such sit-
uations may certainly occur in the areas of gaining personal insight. Veenhoven
(2007) suggests that subjectivity can be divided into subjective assessment and
subjective substance. Examples associated with well-being are diagnosing an ill-
ness with a medical test (objective assessment and objective substance) and asking
a patient if they are ill (subjective assessment and subjective substance). It is
expected that a clinical study using the first method of diagnosis would be preferred
to one using the subjective assessment and subjective substance approach. For the
purpose of this study, we focus only on subjective versus objective assessment but
acknowledge the role that subjective substance may also play in the effects of learning.
Finally, we found no source study that explicitly explored or hypothesized objective
versus subjective measurement, so an examination of this potential moderator is
an important contribution to the field in its own right.
Hypothesis 3. Measurement method affects the relationship between experien-
tial activities and learning such that learning outcomes assessed
through objective measures will be higher than experiential learn-
ing outcomes assessed through subjective measures.
Another moderator that potentially affects the relationship between experien-
tial learning activities and learning is feedback, especially from the instructor as a
facilitator and as an authority/expertise figure. Learning is a process of experience
that can be facilitated by feedback (Dewey, 1897:79), which occurs when students
make sense of new experiences with existing conceptualizations (Piaget, 1977).
In the classroom, the responsibility for delivering appropriate feedback fitting the
experiential learning activity at hand falls on the instructor. This forces educators
to continuously evaluate the learning process and to give adequate feedback to
the student to ensure desired learning outcomes are achieved. Postactivity stages
are, therefore, crucial to transferring experience to knowledge (Meyer, 2003), and
learning is best facilitated when instructors design, conduct, evaluate, and give
feedback (Gentry, 1990). This feedback becomes an important factor in learner
motivation based on the influence on learner confidence, satisfaction (Burke &
Moore, 2003; Keller, 1983), and motivation (Linnenbrink & Pintrich, 2002).
A concern for educators is to determine when to provide feedback. Yorke
(2003) states that feedback can be formal or informal and formative or summative.
The point is that feedback can be applied at different times during the experiential
exercise and for different purposes. During this study, we adopted the approach of
Bruner (1970, p. 120) that “learning depends on knowledge of results, at a time
when, and at a place where, the knowledge can be used for correction.” As such,
any feedback that is received either during the experiential exercise or immediately
after the exercise will increase students' learning, since people seldom learn from
their experience unless meaning is applied (Ekpenyong, 1999).
Hypothesis 4. The use of feedback from the instructor moderates the relationship
between experiential activities and learning such that activities
including feedback from the instructor will have increased learning
outcomes versus activities where students do not receive feedback
from the instructor.
The final factor for which we will provide a formal hypothesis is the activity
duration, which is measured by the total time the educator uses to conduct the
experiential exercise. Similar to our H3, we found no source study that explicitly
explored or hypothesized activity duration and outcomes, so an examination of this
potential moderator is an important contribution to the field in its own right. Learn-
ing is an uncertain and often imprecise process, where unintended outcomes often
present themselves (Burke & Sadler-Smith, 2006). Time may, therefore, play a role in
the development of knowledge, since a lack of time may not allow educators to
adequately address unintended outcomes. A second time consideration is that
to absorb certain types of knowledge, students must have opportunities to practice
responses and evaluate effects (Taras, et al., 2013). This is due in large part to the
premise that “all learning is relearning” (Kolb, 1984) and students benefit from
a process that draws out their beliefs and ideas about a topic. Experiential learn-
ing activities allow students time to examine, test, and integrate ideas. However,
Sprenger (1999) stated that the process may need to be repeated several times
before long-term memory is formed. This idea is supported by other researchers
who have stated that “time has the primary influence on learning” (Morrison &
Brantner, 1992: 936) and the empirical evidence that increased cultural intelli-
gence is positively related to the length of overseas work experience (Li, Mobley,
& Kelly, 2013). Based on these arguments, it is possible that:
Hypothesis 5. The amount of time (duration) of an experiential learning activity affects the relationship between experiential exercises and learning such that longer experiential exercises have a more positive effect on learning outcomes than shorter experiential learning exercises.
METHODS
Criteria for Inclusion
Four criteria were used to identify potential studies for inclusion: (1) the study
must be a quantitative empirical investigation of learning outcomes associated
with experiential learning, (2) the study must include a treatment group who re-
ceived an experiential learning activity and a control group that received traditional
education, (3) the study must be published in English, and (4) the study must have
been published between 1974, the first date associated with the often-cited definition
of experiential learning (Hoover, 1974), and 2017. This approach
allowed for the greatest inclusion of potential studies since no exceptions were
made based on age of participants, nation where the study was conducted, or the
discipline associated with the research.
Literature Search
Two reviewers independently conducted electronic searches of EBSCO/Host
databases (e.g., Academic Search Complete, Business Source Complete, Educa-
tion Research Complete, ERIC, PsycArticles, and PsycINFO) and Google Scholar
using the keywords “experiential learning,” “experiential,” and “service learning”
from 1974 through 2017. The resulting search identified 13,626 articles. These
articles were individually reviewed and 1,244 were determined to be empirical.
Full text copies of the 1,244 empirical studies were obtained and were further ex-
amined to ensure all remaining criteria were met. This search resulted in 89 usable
studies (12,060 individual respondents) that contained sufficient information to be
included in the meta-analysis. One journal article contained two studies, so the
count of articles is 88. The articles included in this study are listed in Table 1.
Metric for Expressing Effect Size
We employed Cohen's d (Cohen, 1988) to estimate and describe the effect of
experiential learning exercises on learning outcomes. This metric allows for the
reporting of the standardized mean difference (Cohen, 1988), and Cohen's d is a
common metric used in meta-analysis (Borenstein et al., 2009).
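As a point of reference for readers less familiar with the metric, the following sketch (in Python, with hypothetical group statistics) illustrates the standard pooled-standard-deviation form of Cohen's d; it is an illustration of the general formula, not the Comprehensive Meta-Analysis computation used in this study.

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    pooled_var = ((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# Hypothetical example: an experiential-learning section vs. a lecture-based control
d = cohens_d(mean_t=82.0, sd_t=9.0, n_t=40, mean_c=78.0, sd_c=9.5, n_c=40)
print(round(d, 2))  # ~0.43, between small and medium by Cohen's (1988) benchmarks
```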
Variable Coding
A coding sheet was constructed that included the definitions of all variables that
would be recorded. Variables were independently coded by two researchers, who
reviewed all discrepancies and recoded the results based on reaching a consensus.
Main effect size was coded by determining the learning outcome after the treatment.
This approach ensured that the greatest number of articles were included since
fewer than half of the articles reported a pre-/post-test difference. Articles were then
assessed to determine if the learning outcome was designed around understanding
a social issue (altering an individual’s frame of reference in terms of complex
social issues), increasing personal insight (changing an individual’s perception of
self), or increasing cognitive development (task development, skill development,
or academic achievement).
Table 1: Studies included in the meta-analysis (number of studies = 89).

Study  N  Learning Outcome  Service Learning  Assessment Method  Feedback  Time
Abdulwahed & Nagy (2009) 50 Cognitive No Objective Unknown 6 hours
Anderson & Woodhouse (1984) 59 Cognitive No Subjective Unknown 1 semester
Angelopoulou et al. (2015) 184 Cognitive No Objective Unknown 3 months
Aukes et al. (2008) 201 Personal No Subjective Yes 2 weeks
Axe (1988) 60 Cognitive No Objective Yes 1 semester
Battjes-Fries et al. (2017) 641 Personal No Objective Unknown 5 hours
Beling (2004) 38 Cognitive Yes Objective Unknown 32 hours
Bernacki & Jaeger (2008) 46 Cognitive Yes Objective Unknown 3 hours
Boss (1994) 71 Cognitive Yes Objective Unknown 20 hours
Bostick (1977) 116 Cognitive Yes Objective Unknown 1 semester
Branch et al. (2009) 76 Personal No Subjective Unknown 18 months
Brannan et al. (2008) 107 Cognitive No Objective Yes 2 hours
Brenenstuhl (1975) 66 Cognitive No Objective No 12 weeks
Bringle & Kremer (1993) 34 Personal Yes Objective Unknown 6 hours
Burns & Sherrell (1984) 34 Cognitive No Objective Unknown 1 hour
Burns, Storey, & Certo (1999) 36 Personal Yes Objective Unknown 2 days
Chen et al. (2012) 26 Cognitive Yes Objective Unknown 10 hours
Cram (1998) 32 Personal Yes Objective Yes 20 weeks
Diachun et al. (2006) 42 Cognitive No Objective Unknown 3 hours
Doyle et al. (2011) 33 Personal No Subjective Unknown 7 hours
Eskrootchi & Oskrochi (2010) 52 Cognitive No Objective Unknown Unknown
Fabiano et al. (2013) 88 Cognitive No Objective Yes 4 days
Faria & Whiteley (1990) 73 Cognitive No Objective Unknown 5 weeks
Feldman et al. (2006) 32 Cognitive Yes Objective No 20 hours
Fleck et al. (2017) 67 Cognitive Yes Cognitive Unknown 30 hours
Flowers et al. (2010) 79 Cognitive No Objective Unknown 1 month
Fritzsche (1974) 73 Cognitive No Objective Yes 1 semester
Gamlath (2007) 77 Cognitive No Objective Yes 1 semester
Gelkopf et al. (2013) 48 Personal No Subjective Yes 12 months
Gallini & Moely (2003) 313 Personal Yes Objective Unknown 6 hours
Gorman et al. (1994) 70 Cognitive Yes Objective Unknown 1 semester
Hamilton & Zeldin (1987) 88 Cognitive No Objective Unknown Unknown
Hansen & Hansen (2006) 67 Cognitive No Objective No 1 hour
Hébert & Hauf (2015) 130 Cognitive Yes Objective Yes 1 semester
Hergert & Hergert (1990) 482 Cognitive No Subjective Unknown 1 semester
Hewson et al. (2006) 48 Cognitive No Subjective Unknown 8 hours
Hoover et al. (2010) 483 Cognitive No Objective Unknown 4 semesters
Johnston-Goodstar et al. (2016) 46 Cognitive No Objective Unknown 1 year
Keller & Li (2007) 42 Cognitive No Objective Unknown 1 semester
Klein (1980) 26 Cognitive No Objective No 1 semester
Knapp & Stubblefield (2000) 44 Cognitive Yes Objective Unknown 1 semester
Leung et al. (2012) 103 Cognitive Yes Objective Unknown 20 hours
Li (2013) 40 Cognitive No Objective Unknown 1 semester
Moreno-Lopez et al. (2017) 110 Cognitive No Objective Unknown 15 hours
Lundy (2007) 119 Cognitive Yes Objective No 24 hours
Markus et al. (1993) 89 Cognitive Yes Objective Unknown 20 hours
Marshall et al. (2016) 32 Cognitive No Objective Unknown 4 days
McComb et al. (2004) 18 Personal No Objective Unknown 8 weeks
Mills et al. (2007) 127 Personal Yes Objective Unknown 10 days
Moely et al. (2002) 536 Personal Yes Objective Unknown 1 semester
Morgan et al. (2002) 187 Cognitive No Subjective Yes 2 weeks
Mpofu (2007) 130 Cognitive Yes Objective No 1 semester
Newmann & Rutter (1983) 314 Personal Yes Objective Unknown 5 months
Odom & Murphy (1992) 56 Cognitive No Objective No Unknown
Okon et al. (2004) 17 Cognitive No Objective Yes 1 hour
Osborne et al. (1998) 93 Cognitive Yes Objective Unknown 1 semester
Patton & Davis (1999) 308 Cognitive No Subjective No 16 weeks
Plante, Lackey, & Hwang (2009) 85, 39 Social No Subjective Yes 7 days
Ras & Rech (2009) 36 Cognitive No Objective Unknown 3 days
Reese (1997) 79 Social Yes Objective Unknown 20 hours
Reeve et al. (2004) 73 Personal No Objective Yes 5 months
Rosen et al. (2009) 39 Cognitive No Objective Unknown 6 hours
Rusu, Copaci, & Soos (2015) 18 Social Yes Objective Unknown 3 weeks
Saraswat et al. (2017) 18 Cognitive No Objective Yes 15 minutes
Sass & Coll (2015) 126 Cognitive Yes Objective Unknown 15 hours
Scott (1977) 86 Cognitive No Objective Unknown 1 semester
Seider et al. (2010) 399 Social Yes Objective Yes 150 hours
Siddique et al. (2013) 44 Cognitive No Objective No 1 class period
Siegel et al. (1997) 100 Cognitive No Objective Yes 90 min
Smith (1979) 83 Social No Subjective Yes 1 semester
Specht & Sandlin (1991) 46 Social No Objective Unknown 100 minutes
Stewart et al. (2015) 38 Cognitive No Subjective Unknown 10 sessions
Strage (2000) 477 Cognitive Yes Objective Unknown 15 hours
Tagawa & Imanaka (2010) 181 Cognitive No Objective Yes 18 hours
Teglasi & Rothman (2001) 59 Personal No Objective No 15 sessions
Ti et al. (2009) 210 Cognitive No Objective Yes 10 minutes
Tomolo et al. (2011) 86 Cognitive No Objective Unknown 4 weeks
Tower & Hash (2013) 92 Cognitive No Subjective Unknown 2 years
Trapp et al. (1995) 40 Cognitive No Subjective No 1 semester
Wheatley, Hornaday, & Hunt (1986) 230 Cognitive No Objective Yes 1 semester
Wolf & Mehl (2011) 157 Personal No Subjective Yes 3 hours
Wolfe & Byrne (1978) 45 Cognitive No Objective Unknown 1 semester
Wong et al. (2012) 180 Personal No Subjective Unknown 20 hours
Zawadzki et al. (2012) 190 Personal No Objective Yes 1 hour
Zelechoski et al. (2017) 270 Cognitive No Objective Unknown 1 semester
Zhang et al. (2016) 388 Personal No Objective Unknown 5 days
Zhang & Campbell (2012) 385 Cognitive No Subjective Yes 1 academic year
Zigmont et al. (2015) 342 Personal No Objective Unknown 6 weeks
Each study was further evaluated to determine how the learning outcome was assessed
(objective or subjective measure), whether feedback was provided during any part of
the activity, and the length of time (duration) of the activity.
RESULTS
Main Effects Analysis of Experiential Learning Outcomes
Our analysis computed the effects of experiential learning outcomes using Com-
prehensive Meta Analysis (Borenstein et al., 2009; metaanalysis.com), consistent
with many other meta-analytic studies (Allen, et al., 2012; Banks, Batchelor, &
McDaniel, 2010; Van Horn, Green, & Martinussen, 2009). The results in Table 2
show a point estimate for the standard difference in means for experiential learning
and the control group of d = .43 and d = .70 for the fixed effects models (FEMs)
and random effects models (REMs), respectively; both models are reported to verify
that findings are robust across models. Because individual studies varied in how
experiential learning was measured, and because moderating variables and additional
heterogeneity may exist, the results of the REM should be evaluated as more
appropriate in this setting than those of the FEM, since the REM assumes that the
"true effect size varies from study to study" (Borenstein et al., 2009, p. 6).
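To make the distinction between the two models concrete, the sketch below pools a handful of hypothetical study effects under both approaches. The random-effects weights add a between-study variance estimate, here computed with the common DerSimonian-Laird method; this is our illustrative assumption, since the paper's estimates come from the Comprehensive Meta-Analysis software rather than this code.

```python
def pool_effects(effects, variances):
    """Fixed- and random-effects pooled estimates for a list of study-level d values.

    Both models use inverse-variance weights; the random-effects model adds a
    DerSimonian-Laird estimate of between-study variance (tau^2) to each weight.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)

    # Cochran's Q around the fixed-effect estimate feeds the tau^2 estimate
    q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    w_star = [1.0 / (v + tau2) for v in variances]
    random_ = sum(wi * di for wi, di in zip(w_star, effects)) / sum(w_star)
    return fixed, random_, tau2

# Hypothetical per-study Cohen's d values and sampling variances
fixed, random_, tau2 = pool_effects([0.2, 0.5, 0.9, 0.4], [0.02, 0.05, 0.04, 0.03])
print(round(fixed, 2), round(random_, 2), round(tau2, 2))
```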
An evaluation of fixed and random model forest plots showed support for
the positive effect of experiential learning on learning, and the results of a one-
study-removed analysis showed a range of individual study point estimates from
.39 to .47 for the fixed model and .52 to .88 for the random model. Therefore,
no one study exerted undue influence on our results. Publication bias was tested
using funnel plot analysis of the standard deviations of the means for the samples
included in this study. Analysis of the funnel plot shows some publication bias
toward studies showing positive mean difference in favor of the performance of the
experiential learning groups as compared to the nonexperiential learning groups.
After imputing the missing studies, the standard difference in means decreased
slightly. This finding indicates that the overall effect of experiential learning may
be slightly overstated in this study, but the publication bias is not severe enough
to bring the direction of the effect of experiential learning practices in this meta-
analysis into question.
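The one-study-removed procedure described above can be sketched as follows (Python, hypothetical data, fixed-effect pooling for brevity): the analysis is simply re-run k times with each study omitted in turn, and the spread of the resulting estimates indicates whether any single study drives the pooled result.

```python
def fixed_effect(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate."""
    w = [1.0 / v for v in variances]
    return sum(wi * di for wi, di in zip(w, effects)) / sum(w)

def one_study_removed(effects, variances):
    """Re-pool the data k times, leaving each study out once."""
    estimates = []
    for i in range(len(effects)):
        e = effects[:i] + effects[i + 1:]
        v = variances[:i] + variances[i + 1:]
        estimates.append(fixed_effect(e, v))
    return min(estimates), max(estimates)

# Hypothetical per-study effects; a narrow range of leave-one-out estimates
# suggests that no single study is driving the pooled result.
lo, hi = one_study_removed([0.2, 0.5, 0.9, 0.4, 0.6], [0.02, 0.05, 0.04, 0.03, 0.02])
print(round(lo, 2), round(hi, 2))
```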
Hypothesis one, which proposed that experiential learning will have a positive
effect on students' learning, was supported across the 89 included studies. After
testing hypothesis one, a test for heterogeneity was conducted on this model using
the I² and Q-statistic, which detect the possible presence of moderating variables.
For our primary analysis associated with hypothesis one, the I² was .95 and the
Q-statistic of 1,725.02 was significant; thus, both scores substantiate the value of
investigating moderators.
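For readers who want the mechanics, a minimal sketch of the two heterogeneity statistics is given below using the standard formulas (Q as the weighted sum of squared deviations from the pooled estimate, and I² = (Q − df)/Q); the hypothetical data are ours, not values from the source studies.

```python
def heterogeneity(effects, variances):
    """Cochran's Q and Higgins' I-squared for a set of study-level effects."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * di for wi, di in zip(w, effects)) / sum(w)
    q = sum(wi * (di - pooled) ** 2 for wi, di in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0  # share of variation beyond chance
    return q, i2

# Hypothetical study-level data; a Q value far above its df and an I-squared near
# 95% (as in this meta-analysis) signal that moderator analyses are worthwhile.
q, i2 = heterogeneity([0.1, 0.5, 1.1, 0.4, 0.8], [0.02, 0.05, 0.04, 0.03, 0.02])
print(round(q, 2), round(i2, 2))
```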
Hypothesis two proposed that experiential learning activities will have a
positive effect on understanding social issues, developing personal insight, and
cognitive development, but that the strength of the relationship would vary. As
shown in Table 2, the point estimate of the mean difference was .57 for social
issues (both REM and FEM); .25 (REM) and .05 (FEM) for personal insight;
and .89 (REM) and .65 (FEM) for cognitive development.
Table 2: Results of experiential learning and control group differences.

Outcome            Support   N        K    REM   SE    FEM   SE    95% CI (Random)   df    Q          T²     I²
All Studies (H1)   S         12,060   89   .43   .03   .70   .09   .52 to .88        88    1,725.02   .67    94.89
Social (H2)        S         703      6    .57   .10   .57   .10   .38 to .76        5     2.06       .00    .00
Personal (H2)      NS        3,789    20   .25   .24   .05   .04   −.22 to .71       19    799.60     1.06   97.62
Cognitive (H2)     S         7,568    63   .89   .09   .65   .03   .73 to 1.06       62    680.58     .40    90.89
The 95% confidence interval for the REM was .38 to .76 for social issues, −.22 to .71
for personal insight, and .73 to 1.06 for cognitive development. Thus, hypothesis two
is supported for social and cognitive development, but not for personal insight.
Hypothesis three stated that objective outcome measures would outperform
subjective measures when evaluating the effects of experiential learning. Table 3
shows that the REM difference for objective measures was .79 while the REM
difference for subjective measures was .53. However, the p-value for the difference was
.08, and so hypothesis three was not supported.
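The subgroup p-values reported in Table 3 were produced by the meta-analysis software; one common way such a difference between two pooled subgroup estimates can be assessed, sketched below with hypothetical numbers, is a Z-test on the difference of the estimates relative to their combined standard error. This is an assumption about the general mechanics, not a reproduction of the software's exact procedure.

```python
import math
from statistics import NormalDist

def subgroup_difference(d1, se1, d2, se2):
    """Two-sided Z-test comparing two pooled subgroup effect sizes."""
    z = (d1 - d2) / math.sqrt(se1 ** 2 + se2 ** 2)
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
    return z, p

# Hypothetical subgroup estimates (not the values from Table 3)
z, p = subgroup_difference(d1=0.80, se1=0.12, d2=0.50, se2=0.11)
print(round(z, 2), round(p, 3))
```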
Hypothesis four proposed that experiential exercises that included feedback
would have higher learning outcomes than those that did not have feedback. The
results in Table 3 show that feedback is an important moderator, with a REM
difference of .97 for lessons with feedback; however, not using feedback still
resulted in a REM difference of .83. Based on the .07 p-value, this hypothesis was
not supported.
The final hypothesis (five) stated that the duration of the activity moderates
learning. Comprehensive Meta Regression software was used to test the strength
and direction of the effect of time on learning. The results of the metaregression
are presented in Table 4. Time resulted in a coefficient of .03 with a confidence
interval including zero and was not significant. Thus, hypothesis five was not
supported.
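Conceptually, a metaregression of this kind weighs each study's effect size by the inverse of its variance and regresses it on the moderator. The sketch below (Python with NumPy, hypothetical data) shows a simple weighted least-squares version of that idea; a full random-effects metaregression, as implemented in Comprehensive Meta Regression, would also incorporate a between-study variance term.

```python
import numpy as np

def meta_regression(effects, variances, moderator):
    """Weighted least-squares slope of effect size on one study-level moderator.

    Weights are inverse sampling variances; a random-effects metaregression
    would additionally fold a between-study variance term into the weights.
    """
    y = np.asarray(effects, dtype=float)
    x = np.column_stack([np.ones(len(y)), np.asarray(moderator, dtype=float)])
    w = np.diag(1.0 / np.asarray(variances, dtype=float))
    xtwx = x.T @ w @ x
    beta = np.linalg.solve(xtwx, x.T @ w @ y)
    se = np.sqrt(np.diag(np.linalg.inv(xtwx)))  # approximate coefficient standard errors
    return beta[1], se[1]

# Hypothetical studies: effect sizes, sampling variances, and duration in hours
slope, se = meta_regression([0.3, 0.6, 0.5, 0.8], [0.02, 0.05, 0.04, 0.03], [1, 5, 10, 20])
print(round(slope, 3), round(se, 3))
```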
Post Hoc Analysis
Six post hoc analyses were run to determine the influence of whether the experi-
ential learning was service learning, if it was a simulation, nationality of sample,
academic discipline, age, and gender. Type of exercise (service learning/not ser-
vice learning and simulation/not simulation), nationality of sample (United States
vs. non-United States) and academic discipline (business vs. non-business) were
tested using the same type of moderator analysis used to test hypotheses three and
four. The results of this analysis are presented in Table 5.
The first post hoc analysis took a deeper look at whether there was a difference
between service learning experiential exercises and nonservice learning. The REM
mean difference for those activities without a service learning component (REM
.83) was significantly higher (p<.01) than service learning exercises (REM .49).
Similarly, experiential learning activities that included a simulation (REM .96)
outperformed (p<.001) those exercises that did not include a simulation (REM
.70). Nationality of sample also moderated the relationship, with the mean
difference for the US group (.81) being significantly different (p < .05) from that of
the non-United States group (.41). Discipline was also a significant moderator, since
the difference between the mean differences for the business group (1.04) and the
nonbusiness group (.64) was significant (p < .001).
The final two post hoc analyses were tested using metaregression in the same
manner as hypothesis five. As illustrated in Table 6, age of group and percent of
female participants both produced small negative coefficients, −.04 and −.03,
respectively, indicating very low practical significance. The 95% confidence interval
did not support a difference based on gender, but it did show a negative relationship
between age and learning.
Table 3: Moderator analysis.

Moderator                        N       k    REM   SE    95% CI (REM)   p-Value Difference   df   Q          T²    I²
Assessment (H3): Not supported
  Subjective                     3,689   19   .53   .11   .31 to .75     .08                  18   158.60     .19   88.65
  Objective                      8,371   70   .79   .12   .56 to 1.02                         69   1,521.45   .90   95.47
Feedback (H4): Not supported
  Feedback                       2,555   21   .97   .21   .57 to 1.38    .07                  20   432.78     .81   95.38
  No Feedback                    1,014   8    .83   .24   .36 to 1.29                         7    56.71      .37   87.66
Table 4: Metaregression results.

Moderator                  N        K    Coefficient   SE    95% CI        df   Q          Q-sig   T²    I²
Time (H5): Not supported   11,704   84   .03           .07   −.11 to .17   82   1,631.72   .00     .67   94.97
DISCUSSION
While we identified and tested several moderators and found that many of them
were associated with differential learning outcomes, the two most important find-
ings of this meta-analysis are that, first, experiential activities “work;” they generate
better learning outcomes than control groups who do not receive experiential ac-
tivities. Although this has been the feeling of many researchers, there has not been
a definitive study that addresses the effects of all experiential learning. Our results
support the previous results found in meta-analytic studies of service learning,
but they go one step further to show that all experiential learning pedagogies are
beneficial means of educating students.
Second, each and every moderator analysis shared one important finding in
common, specifically that experiential activities’ efficacy was robust across almost
every moderator condition. While learning was relatively higher or lower in some
conditions or subgroups, learning was always positive and Cohen's d was almost
always at least of medium effect size. Cohen (1988) considered effect sizes of .5
medium and .8 large, and while these are rules of thumb, our analysis shows that
most effects reach the moderate level while many exceed the large threshold. Thus,
our moderators mainly distinguished between “good” and “even better” findings
as they relate to experiential activities and learning.
Discussion of Findings with Respect to Implications for Teaching
Regarding the three different types of learning outcomes (Hypothesis 2), we did
observe greater learning outcomes for cognitive and social issue measures than
for personal insight measures. While there are many possible explanations for this
result, it is quite possible that personal insights are more pedagogically distal, more
abstract, or generally more difficult for students to grasp, and that individual
differences may be a significant contributor to how students process the exercise.
Social issues are usually emphasized in business and society and
ethics/social responsibility classes; and although they may be difficult to cover,
there appears to be a growing consensus around which social issues are important
and how they should be addressed. Even so, experiential activities are beneficial
regarding personal insight outcomes (d=.25).
One intriguing finding in our study was that the difference between objective and
subjective measures was not statistically significant (p = .08) despite the larger
effect size for objective measures (d = .79) compared to subjective measures
(d = .53).
Table 5: Post hoc moderator analysis.

Moderator          N       k    REM    SE    95% CI (REM)   p-Value Difference   df   Q           T²     I²
Service Learning
  Yes              3,661   27   .49    .06   .36 to .62     <.01                 26   68.31       .06    61.94
  No               8,399   62   .83    .13   .56 to 1.08                         61   1,612.17    .93    96.22
Simulation
  Yes              8,399   33   .96    .13   .71 to 1.21    <.001                32   389.29      .46    91.78
  No               4,213   29   .70    .22   .24 to 1.12                         28   1,134.86    1.37   97.53
Nationality
  US               9,079   71   .81    .10   .61 to 1.00    <.05                 70   1,159.758   .60    93.96
  Non-US           2,981   18   .41    .22   −.01 to .83                         17   467.18      .77    96.36
Discipline
  Business         2,887   20   1.04   .17   .71 to 1.36    <.001                19   263.84      .48    92.80
  Non-business     9,173   69   .64    .11   .43 to .84                          68   1,134.90    .70    94.95
Table 6: Post hoc metaregression results.

Moderator        N       k    Coefficient   SE    95% CI           df   Q        Q-sig   T²    I²
Age              2,657   14   −.04          .01   −.06 to −.02     12   148.93   .00     .30   91.94
Percent Female   4,473   25   −.03          .79   −1.58 to 1.51    23   601.73   .00     .67   96.01
An argument could be made that, because of student overconfidence and lack of
self-insight in behavioral skills (Giambatista & Hoover, 2014), the opposite direction
could have been hypothesized. We think it likely that students
underestimate their self-assessed learning because higher order learning (e.g.,
analysis, synthesis, or creation) is a more subtle process than lower level learning
(e.g., recalling facts) where there are obviously right/wrong answers (see Bloom
1954 for more discussion). Indeed, nearly every lesson in higher education comes
with caveats, boundary conditions, situational limits, etc. This is very likely to
leave students uncertain of how much learning progress they are actually making.
Thus, we advise educators to develop activities and teaching moments that increase
transparency of the learning students are experiencing. Educators can, for example,
log student inputs on experiential activities from early in the semester and compare
them to notes on later activities to show students how the later inputs are more
sophisticated and demonstrate more direct and applied learning than derived from
earlier activities. Journaling in this instance may be one means of allowing students
to reflect on their previous views and see any changes in their understandings and
misunderstandings.
Our findings concerning feedback are also somewhat puzzling. The presence of
feedback is associated with a very large effect on learning (d = .97), but even
those studies without feedback (d = .83) performed well. Feedback facilitates
learning, and instructors should try to ensure a temporal and learning space for
feedback. However, all is not lost if such space does not occur, as student learning
was enhanced even in experiential activities not providing such feedback.
Learning outcomes did not vary as a function of activity duration. While it
seems intuitive to expect more learning from longer activities, it may be that an
activity’s fit to an appropriate duration matters more. It may also be that learning
gains in longer experiential activities come with an offsetting opportunity cost or
that time offers qualitatively different benefits for traditional (i.e., control groups)
learning that also tend to offset. We encourage future experiential learning scholars
to explore these issues in more depth so that the field can begin to develop a less
speculative and more conclusive understanding of the role, if any, that duration
plays in influencing learning outcomes. The idea of spaced learning argues that
some learning occurs in the spaces between learning sessions, and the abstract
conceptualization learning process (Kolb, 1984) associated with lecture-based
learning may benefit from longer time duration. Thus, we interpret our time finding
as one of equifinality, where time offers different types of marginal and offsetting
benefits in experiential environments versus traditional learning environments.
Our post hoc evaluation of moderators also contributes significantly to the
ongoing discussion of experiential learning. Although it was not initially expected,
there is a significant difference between experiential learning exercises that are
based around social issues (service learning) and those that do not contain such
a focus. This may be due to 74% of nonservice learning studies focusing on
cognitive outcomes while only 61% of service learning studies chose to evaluate
cognitive outcomes. These results do not suggest a move away from service learn-
ing approaches; they simply show the robust nature of using experiential learning
techniques both in and out of the classroom and support our decision to evaluate
all experiential learning exercises instead of just service learning.
Similar to the previous results, our post hoc analysis showed that the 20 studies
that were conducted in business courses significantly outperformed those in
nonbusiness disciplines. Again, 95% of the business studies chose to evaluate cog-
nitive measures, while only 62% of the nonbusiness studies chose this approach.
Business educators can certainly take advantage of the use of experiential learning
approaches since many of the business topics can be evaluated using cognitive
measures.
Implications for Research and Limitations
In the discussion section, we have thus far focused on interpreting our findings
with an eye toward implications for teaching. However, we also believe this study
has important implications for research, particularly regarding limitations of extant
studies. First, the proportion of usable studies with experimental and control groups
compared to the number of empirical studies (89/1,244) was worrisome, though
understandable. It is likely that many studies were conducted out of convenient
access to data rather than part of a proactive research program. We urge scholars
to take a programmatic approach that is designed to address questions of import
to higher education and include control groups whenever possible.
Second, even in a meta-analysis there may be artifacts that bias overall
findings. We were unable to rule out the possibility of instructor biasing effects. If
we (reasonably) assume that most published research in the field emanates from
faculty who are committed to experiential learning, it is possible that they were less
committed and enthusiastic about teaching control groups in a traditional, lecture-
based fashion. Thus, the differential learning effects that we observed across the
studies could be biased by instructor engagement. Frankly, we do not believe this to
threaten our basic findings because the effect size was high and because, across 89
studies, we would not expect a consistent and large gap in instructor commitment
across control groups. One possible workaround for future scholars in the field
would be to engage in programmatic research that involves multiple instructors
and including colleagues who are engaged in traditional, lecture-based teaching
in research design. The problem would then be a possible confound between the
learning impact of instructor differences and pedagogical differences.
Third, we urge scholars in the field interested in experiential learning to
consider several recommendations in future research stemming from our work
gathering and interpreting data. Scholars should take care to explicate characteris-
tics of the research setting and design such as (a) discussing faculty background,
(b) clarifying exactly how key variables are defined and operationalized, (c) in-
cluding design features and variables of clear import to the field (for example,
higher levels of learning such as creativity), and (d) generally engaging in a more ex-
plicit discussion of study aspects relating to the variables in this study. We were
unable to classify some studies on some variables, particularly feedback, and this
is attributable at least in part to deficiencies in how these studies were conducted
and written. Other possible moderators of interest could not even be tested because
of source study limitations. Finally, the field needs more longitudinal research
addressing learning retention over time.
CONCLUSION
The current meta-analysis was conducted to fill a long-standing gap in the literature
on experiential learning, and it definitively and quantitatively shows the effects of
all experiential learning on students' learning outcomes. Examination of 89 studies
published over a 43-year span showed that students experienced superior learning
outcomes when experiential pedagogies were employed. Further, these outcomes
were almost half a standard deviation higher (d = .43) than with traditional delivery
techniques, and these effects were robust across every moderator we were able to
study, though they of course varied somewhat in magnitude.
We view the results of this meta-analysis as a clear affirmation of the efficacy
of experiential learning in delivering learning outcomes. We further conclude that
experiential learning’s efficacy is robust across varied learning outcomes relevant to
higher education. We were unable to identify a single context across the empirical
studies where experiential learning did not produce a positive effect on learning.
Because of these findings, we believe that while research needs to continue and
improve its basis in theory and design, educators can be confident that experiential
learning in general facilitates learning.
REFERENCES
Abdulwahed, M., & Nagy, Z. K. (2009). Applying Kolb’s experiential learning
cycle for laboratory education. Journal of Engineering Education,98(3),
283–294.
Allen, D., & Young, M. (1997). From tour guide to teacher: Deepening
cross-cultural competence through international experience-based education.
Journal of Management Education,21(2), 168–189.
Allen, T. D., Johnson, R. C., Saboe, K. N., Cho, E., Dumani, S., & Evans, S.
(2012). Dispositional variables and work-family conflict: A meta-analysis.
Journal of Vocational Behavior,80, 17–26.
Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching,
and assessing: A revision of bloom’s taxonomy of educational objectives.
Boston, MA: Allyn & Bacon.
Anderson, P. H. & Lawton, L. (1997). Demonstrating the learning effectiveness
of simulations: Where we are and where we need to go. Developments in
Business Simulation and Experiential Learning,24, 68–73.
Anderson, P. H., & Woodhouse, R. H. (1984). The perceived relationship between
pedagogies and attaining objectives in the business policy course. Develop-
ments in Business Simulation & Experiential Learning,11, 152–156.
Andrews, C. (2007). Service learning: Applications and research in business. Jour-
nal of Education for Business,83, 19–26.
Angelopoulou, M. V., Kavvadia, K., Taoufik, K., & Oulis, C. J. (2015). Comparative
clinical study testing the effectiveness of school based oral health education
using experiential learning or traditional lecturing in 10 year-old children.
BMC Oral Health,15, 1–7.
Armstrong, S. J., & Mahmud, A. (2008). Experiential learning and the acquisi-
tion of managerial tacit knowledge. Academy of Management Learning &
Education,7(2), 189–208.
Association to Advance Collegiate Schools of Business (AACSB). (2013). Eligibility
procedures and accreditation standards for business accreditation. Tampa, FL.
Aukes, L. C., Geertsma, J., Cohen-Schotanus, J., & Zwierstra, P. (2008). The effect
of enhanced experiential learning on the personal reflection of undergraduate
medical students. Medical Education Online,13(1), 1–10.
Axe, B. S. (1988). Should students play games in labor relations? Developments
in Business Simulation & Experiential Exercises,15, 165–169.
Banks, G. C., Batchelor, J. H., & McDaniel, M. A. (2010). Smarter people are (a bit)
more symmetrical: A meta-analysis of the relationship between intelligence
and fluctuating asymmetry, Intelligence,38, 393–401.
Battjes-Fries, M. C. E., Haveman-Nies, A., Zeinstra, G. G., Dongen, E. J. I.,
Meester, H. J., Top-Pullen, R., Veer, P., & Graaf, K. (2017). Effectiveness
of taste lessons with and without additional learning activities on children’s
willingness to taste vegetables. Appetite,109, 201–208.
Beling, J. (2004). Impact of service learning on physical therapist students’ knowl-
edge of and attitudes toward older adults and on their critical thinking ability.
Journal of Physical Therapy Education,18, 13–21.
Bernacki, M. L., & Jaeger, E. (2008). Exploring the impact of service-learning on
moral development and moral orientation. Michigan Journal of Community
Service Learning,14, 5–15.
Bloom, B. S. (1954). Taxonomy of educational objectives handbook I: Cognitive
domain. New York, NY: D. McKay & Co., Inc.
Borenstein, M., Hedges, L.V., Higgins, J. P., & Rothstein, H. R. (2009). Introduc-
tion to meta-analysis. West Sussex, UK: John Wiley & Sons, Ltd.
Boss, J. (1994). The effect of community service work on the moral development
of college ethics students. Journal of Moral Education,23, 183–197.
Bostick, L. A. (1977). Volunteers in the criminal justice system: Impact of the
experience on their attitudes and behavior. Doctoral dissertation, The Ohio
State University.
Boyatzis, R. E., Cowen, S. S., & Kolb, D. A. (1995). Innovation in professional
education. San Francisco, CA: Jossey-Bass.
Bradley, T. P., Burch, G. F., & Burch, J. J. (2015). Increasing knowledge by
leaps and bounds: Using experiential learning to address threshold concepts.
Organization Management Journal,12, 87–101.
Branch, W. T., Frankel, R., Gracey, C. F., Haidet, P. M., Weissmann, P. F., Cantey,
P., Mitchell, G. A., & Thomas, S. (2009). A good clinician and a caring
person: longitudinal faculty development and the enhancement of the hu-
man dimensions of care. Academic Medicine: Journal of The Association of
American Medical Colleges,84(1), 117–125.
Brannan, J. D., White, A., & Bezanson, J. L. (2008). Simulator effects on cognitive
skills and confidence levels. Journal of Nursing Education,47(11), 495–500.
Brenenstuhl, D. C. (1975). Experiential study of performance in a basic man-
agement course. Simulation Games and Experiential Exercises in Action,2,
83–91.
Bringle, R. G., & Kremer, J. F. (1993). Evaluation of an intergenerational service-
learning project for undergraduates. Educational Gerontology,19, 407–416.
Bruner, J. S. (1970). Some theories on instruction. In Stones, E. (ed.), Readings in
educational psychology. London, UK: Methuen, 112–124.
Bunker, K. A., & Webb, A. D. (1992). Learning how to learn from experience: Im-
pact of stress and coping. Greensboro, NC: Center For Creative Leadership.
Burch, G. F., Burch, J. J., & Heller, N. A. (2015). An empirical investigation of
the Conception Focused Curriculum: The importance of introducing under-
graduate business statistics students to the ‘Real World.’ Decision Sciences
Journal of Innovative Education,13, 485–512.
Burch, G. F., Burch, J. J., Bradley, T. P., & Heller, N. A. (2015). Identifying and
overcoming threshold concepts and conceptions: Introducing a conception-
focused curriculum to course design. Journal of Management Education,39,
476–496.
Burke, L. A., & Moore, J. E. (2003). A perennial dilemma in OB education: Engag-
ing the traditional student. Academy of Management Learning & Education,
2, 37–52.
Burke, L. A., & Sadler-Smith, E. (2006). Instructor intuition in the educational
setting. Academy of Management Learning & Education,2, 169–181.
Burns, A. C., & Sherrell, D. L. (1984). A path analytic study of the effects of
alternative pedagogies. Developments in Business Simulation & Experiential
Learning,11, 115–119.
Burns, M., Storey, K., & Certo, N. J. (1999). Effect of service learning on attitudes
towards students with severe disabilities. Education and Training in Mental
Retardation and Developmental Disabilities,34(1), 58–65.
Celio, C. I., Durlak, J., & Dymnicki, A. (2011). A meta-analysis of the impact
of service-learning on students. Journal of Experiential Education,34(2),
164–181.
Chen, H. C., McAdams-Jones, D., Tay, D. L., & Packer, J. M. (2012). The impact
of service-learning on students’ cultural competence. Teaching and Learning
in Nursing,7(2), 67–73.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.).
Glencoe, IL: Free Press.
Coker, J. S., Heiser, E., Taylor, L., & Book, C. (2017). Impacts of experien-
tial learning depth and breadth on student outcomes. Journal of Experiential
Education,40, 5–23.
Conway, J. M., Amel, E. L., & Gerwien, D. P. (2009). Teaching and learning in
the social context: A meta-analysis of service learning’s effects on academic,
personal, social, and citizenship outcomes. Teaching of Psychology,36, 233–
245.
Cram, S. B. (1998). The Impact of Service-Learning on Moral Development and
Self-Esteem of Community College Ethics Students. Doctoral dissertation,
University of Iowa.
Dean, K. L., & Jolly, J. P. (2012). Student identity, disengagement, and learning.
Academy of Management Learning & Education,11(2), 228–243.
Dewey, J. (1897). The significance of the problem of knowledge. Chicago, IL:
University of Chicago Press.
Dewey, J. (1938). Experience and education. New York, NY: Touchstone.
Diachun, L. L., Dumbrell, A. C., Byrne, K., & Esbaugh, J. (2006). But does it stick?
Evaluating the durability of improved knowledge following an undergraduate
experiential geriatrics learning session. Journal Of The American Geriatrics
Society,54(4), 696–701.
Doyle, D., Copeland, H. L., Bush, D., Stein, L., & Thompson, S. (2011). A
course for nurses to handle difficult communication situations. A randomized
controlled trial of impact of self-efficacy and performance. Patient Education
and Counseling,82(1), 100–109.
Duncan, G. J., & Dunifon, R. (2012). Soft skills and long-run labor market success.
Research in Labor Economics,35, 313–339.
Ekpenyong, L. E. (1999). A reformulation of the theory of experiential learning ap-
propriate for instruction in formal business education. Journal of Vocational
Education and Training,51(3), 449–471.
Eskrootchi, R., & Oskrochi, G. R. (2010). A study of the efficacy of project-based
learning integrated with computer based simulation - STELLA. Journal of
Educational Technology & Society,13(1), 236–245.
Fabiano, G. A., Vujnovic, R. K., Waschbusch, D. A., Yu, J., Mashtare, T., Pariseau,
M. E., Pelham, W. E., Parham, B. R., & Smalls, K. J. (2013). A comparison
of workshop training versus intensive, experiential training for improving
behavior support skills in early educators. Early Childhood Research Quar-
terly,28(2), 450–460.
Faria, A. J., & Whiteley, T. R. (1990). An empirical evaluation of the pedagogical
value of playing a simulation game in a principles of marketing course.
Developments in Business Simulation & Experiential Exercises,17, 53–57.
Feldman, A. M., Moss, T., Chin, D., Marie, M., Rai, C., & Graham, R. (2006).
The impact of partnership-centered, community based learning on first-year
students’ academic research papers. Michigan Journal of Community Service
Learning,13, 16–29.
Feldman, D. C. (1988). Managing careers in organizations. Glenview, IL: Scott,
Foresman.
Fleck, B., Hussey, H. D., & Rutledge-Ellison, L. (2017). Linking class and commu-
nity: An investigation of service learning. Teaching of Psychology,44(3),
232–239.
Fleishman, E. A., & Mumford, M. D. (1989). Individual attributes and training
performance. In I. L. Goldstein (Ed.), Training and development in organi-
zations. San Francisco, CA: Jossey-Bass, 376–416.
Flowers, S. K., Vanderbush, R. E., Hastings, J. K., & West, D. (2010). Web-based
multimedia vignettes in advanced community pharmacy practice experi-
ences. American Journal of Pharmaceutical Education,74(3), 1–5.
Forrest, S. P. & Peterson, T. O. (2006). It’s called andragogy. Academy of Man-
agement Learning & Education,5(1), 113–122.
Freedman, R. D., & Stumpf, S. A. (1980). Learning style theory: Less than meets
the eye. Academy of Management Review,5(3): 445–447.
Fritzsche, D. J. (1974). The lecture vs. the game. Simulations, Games, and Expe-
riential Learning Techniques,1, 1–6.
Gallini, S. M., & Moely, B. E. (2003). Service-learning and engagement, academic
challenge, and retention. Michigan Journal of Community Service Learning,
10, 5–14.
Gamlath, S. L. (2007). Outcomes and observations of an extended accounting
board game. Developments in Business Simulation & Experiential Learning,
34, 132–137.
Ganesh, G., & Qin, S. (2009). Using simulations in the undergraduate marketing
capstone case course. Marketing Education Review,19(1), 7–16.
Gelkopf, M., Hasson-Ohayon, I., Bikman, M., & Kravetz, S. (2013). Nature adven-
ture rehabilitation for combat-related posttraumatic chronic stress disorder:
A randomized control trial. Psychiatry,209(3), 485–493.
Gentry, J. W. (1990). What is experiential learning? Guide to business gaming and
experiential learning. Dubuque, IA: Nichols Publishing Company, 9–20.
Gentry, J. W., Commuri, S., Burns, A. C., & Dickinson, J. R. (1998). The second
component to experiential learning: A look back at how ABSEL has handled
conceptual and operational definitions of learning. Developments in Business
Simulation and Experiential Learning,25, 62–68.
Giambatista, R. C. & Hoover, J. D. (2014). An exploration of overconfidence in ex-
periential learning of behavioral skills among MBA students. Developments
in Business Simulation and Experiential Learning,41, 268–278.
Giancalone, R. A., & Thompson, K. R. (2006). Business ethics and social responsi-
bility education: Shifting the Worldview. Academy of Management Learning
& Education,5, 266–277.
Godfrey, P. C., Illies, L. M., & Berry, G. (2005). Creating breadth in business
education through service-learning. Academy of Management Learning &
Education,4(3), 309–323.
Gorman, M., Duffy, J., & Heffernan, M. (1994). Service experience and the moral
development of college students. Religious Education,89, 422–432.
Hajshirmohammadi, A. (2017). Incorporating experiential learning in engineering
courses. IEEE Communications Magazine,55(11), 166–169.
Hall, D. T. (1991). Twenty questions: Research needed to advance the field of
careers. In R. F. Morrison & J. Adams (Eds.), Contemporary career devel-
opment issues. New York, NY: Praeger, 219–233.
Hamilton, S. F., & Zeldin, R. S. (1987). Learning civics in the community. Cur-
riculum Inquiry,4, 407–420.
Hansen, K., & Hansen, R. S. (2006). Employment interview preparation: Assessing
the writing-to-learn approach. Developments in Business Simulation and
Experiential Learning,33, 140–148.
Hébert, A., & Hauf, P. (2015). Students learning through service learning: Ef-
fects on academic development, civic responsibility, interpersonal skills and
practical skills. Active Learning in Higher Education,16(1), 37–49.
Hergert, M., & Hergert, R. (1990). Factors affecting student perceptions of learning
in a business policy game. Developments in Business Simulation & Experi-
ential Exercises,17, 92–96.
Hewson, M. G., Copeland, H. L., Mascha, E., Arrigain, S., Topol, E., & Fox,
J. E. B. (2006). Integrative medicine: implementation and evaluation of a
professional development program using experiential learning and concep-
tual change teaching approaches. Patient Education & Counseling,62(1),
5–12.
Higgins, J. P., Thompson, S. G., Deeks, J. J., & Altman, D. G. (2003). Mea-
suring inconsistency in meta-analysis. British Medical Journal,327, 557–
560.
Holman, D., Pavlica, K., & Thorpe, R. (1997). Rethinking Kolb’s theory of ex-
periential learning in management education: The contribution of Social
Constructionism and Activity Theory. Management Learning,28(2), 135–
148.
Hoover, J. D. (1974). Experiential learning: Conceptualization and definition.
Simulations, Games, and Experiential Learning Techniques,1, 31–35.
Hoover, J. D., & Whitehead, C. J. (1975). An experiential-cognitive methodology
in the first course in management: Some preliminary results. Simulation
Games and Experiential Learning in Action,2, 25–30.
Hoover, J., Giambatista, R., Sorenson, R., & Bommer, W. (2010). Assessing the ef-
fectiveness of whole person learning pedagogy in skill acquisition. Academy
of Management Learning & Education,9(2), 192–203.
Hopkins, R. (1993). David Kolb’s learning machine. Journal of Phenomenological
Psychology,24, 46–62.
Huang, T. C., Chen, C. C., & Chou, Y. W. (2016). Animating eco-education: To
see, feel, and discover in an augmented reality-based experiential learning
environment. Computers & Education,96, 72–82.
Johnston-Goodstar, K., Piescher, K., & LaLiberte, T. (2016). Critical experiential
learning in the Native American community for Title IV-E students. A pilot
evaluation. Journal of Public Child Welfare,10(3), 310–326.
Kamath, S., Agrawal, J., & Krickx, G. (2008). Implementing experiential ac-
tion learning in international management education: The Global Business
Strategic (GLOBUSTRAT) consulting program. Journal of Teaching in In-
ternational Business,19(4), 403–449.
Kayes, D. C. (2002). Experiential learning and its critics: Preserving the role of
experience in management learning and education. Academy of Management
Learning & Education,1(2), 137–149.
Keller, J., & Li, R. (2007). The use of multimedia learning tools to facilitate
online learning of business statistics. Developments in Business Simulation
and Experiential Learning,34, 51–56.
Keller, J. M. (1983). Motivational design of instruction. In C. M. Reigeluth (ed.),
Instructional design theories and models: An overview of their current status.
Hillsdale, NJ: Lawrence Erlbaum Associates, 386–483.
Kerr, C. (2001). The uses of the university (5th ed.). Cambridge, MA: Harvard
University Press.
Klein, R. D. (1980). Can business games effectively teach business concepts?
Developments in Business Simulation and Experiential Learning,7, 128–
131.
Knapp, J. L., & Stubblefield, P. (2000). Changing students’ perceptions of ag-
ing: the impact of an intergenerational service learning course. Educational
Gerontology,26, 611–621.
Kolb, D. A. (1984). Experiential learning. Upper Saddle River, NJ: Prentice-Hall.
Koriat, A., Sheffer, L., & Ma’ayan, H. (2002). Comparing objective and subjective
learning curves: Judgments of learning exhibit increased underconfidence
with practice. Journal of Experimental Psychology: General,131(2), 147–
162.
Leung, A. Y. M., Chan, S. S. C., Kwan, C. W., Cheung, M. K. T., Leung, S. S. K.,
& Fong, D. Y. T. (2012). Service learning in medical and nursing training: A
randomized controlled trial. Advances in Health Sciences Education,17(4),
529–545.
Lewin, K. (1946) Action research and minority problems. Journal of Social Issues,
2(4), 34–46.
Lewin, K. (1951). Field theory in social science: Selected theoretical papers. In D.
Cartwright (ed.). New York, NY: Harper and Row.
Li, M., Mobley, W., & Kelly, A. (2013). When do global leaders learn best to
develop cultural intelligence? An investigation of the moderating role of
experiential learning style. Academy of Management Learning & Education,
12(1), 32–50.
Li, Q. (2013). Digital game building as assessment: A study of secondary stu-
dents’ experience. Developments in Business Simulation and Experiential
Exercises,40, 74–78.
Linnenbrink, E. A., & Pintrich, P. R. (2002). Motivation as an enabler for academic
success. School Psychology Review,31(3), 313–327.
Lundy, B. L. (2007). Service learning in life-span developmental psychology:
Higher exam scores and increased empathy. Teaching of Psychology,34,
23–27.
Markus, G. B., Howard, J. P. F., & King, D. C. (1993). Integrating community
service and classroom instruction enhances learning: Results from an exper-
iment. Educational Evaluation and Policy Analysis,15, 410–419.
Marshall, M. M., Carrano, A. L., & Dannels, W. A. (2016). Adapting experien-
tial learning to develop problem solving skills in deaf and hard-of-hearing
engineering students. Journal of Deaf Studies and Deaf Education,21(4),
403–415.
McCall, M. W., Lombardo, M. M., & Morrison, A. M. (1988). The lessons of
experience. Lexington, MA: Lexington Books.
McCarthy, P. R., & McCarthy, H. M. (2006). When case studies are not enough:
Integrating experiential learning into business curricula. Journal of Education
for Business,81(4), 201–204.
Meyer, J. H. F., & Land, R. (2003). Threshold concepts and troublesome knowl-
edge: Linkages to ways of thinking and practicing within the disciplines
(ETL Project: Occasional Report 4). Edinburgh, Scotland: University of
Edinburgh.
Meyer, J. H. F., & Land, R. (2005). Threshold concepts and troublesome knowledge
(2): Epistemological considerations and a conceptual framework for teaching
and learning. Higher Education,49, 255–289.
Meyer, J. P. (2003). Four territories of experience: A developmental action inquiry
approach to outdoor-adventure experiential learning. Academy of Manage-
ment Learning & Education,2(4), 352–363.
Miettinen, R. (1998). About the legacy of experiential learning. Lifelong Learning
in Europe,3, 165–171.
Mills, B. A., Bersmina, R. B., & Plante, T. G. (2007). The impact of college student
immersion service learning trips on coping with stress and vocational identity.
The Journal for Civic Commitment,9, 1–8.
Mitchell, L. B., Skinner, L. B., & White, B. J. (2010). Essential soft skills for suc-
cess in the twenty-first century workforce as perceived by business educators.
Journal of Research in Business Education,52(1), 43–53.
Moely, B. E., McFarland, M., Miron, D., Mercer, S. & Ilustre, V. (2002). Changes in
college students’ attitudes and intentions for civic involvement as a function
of service-learning experiences. Michigan Journal of Community Service
Learning,9, 18–26.
Moreno-López, I., Ramos-Sellman, A., Miranda-Aldaco, C., & Quinto, M. T. G.
(2017). Transforming ways of enhancing foreign language acquisition in
the Spanish classroom: Experiential learning approaches. Foreign Language
Annals,50, 398–409.
Morgan, P. J., Cleave-Hogg, D., McIlroy, J., & Devitt, J. H. (2002). Simulation
technology: a comparison of experiential and visual learning for undergrad-
uate medical students. Anesthesiology,96(1), 10–16.
Morris, L. V. (2016). Experiential Learning for all. Innovative Higher Education,
41, 103–104.
Morrison, R. F., & Brantner, T. M. (1992). What enhances or inhibits learning a
new job? A basic career issue. Journal of Applied Psychology,77(6), 926–
940.
Morrison, R. F., & Hoch, R. R. (1986). Career building: Learning from cumulative
work experience. In D. T. Hall & Associates (ed.). Career development in
organizations. San Francisco, CA: Jossey-Bass, 236–273.
Mpofu, E. (2007). Service-learning effects on the academic learning of rehabili-
tation services students. Michigan Journal of Community Service Learning,
14, 46–52.
Navarro, P. (2008). The MBA core curricula of top-ranked US business schools: A
study in failure? Academy of Management Learning & Education,7, 108–
123.
Newman, F. M., & Rutter, R. A. (1983). The Effects of High School Commu-
nity Service Programs on Students’ Social Development. Washington, DC:
National Institute of Education.
Odom, M. C., & Murphy, D. S. (1992). Expert systems versus traditional meth-
ods for teaching accounting issues. Developments in Business Simulation &
Experiential Exercises,19, 131–135.
Okon, T. R., Evans, J. M., Gomez, C. F., & Blackhall, L. J. (2004). Palliative
educational outcome with implementation of PEACE tool integrated clinical
pathway. Journal of Palliative Medicine,7(2), 279–295.
Osborne, R. E., Hammerich, S., & Hensley, C. (1998). Student effects of service-
learning: Tracking changes across a semester. Michigan Journal of Commu-
nity Service Learning,5, 5–13.
Patton, G. H., & Davis, D. C. (1999). Developing participant satisfaction models
of experiential exercises in business education. Developments in Business
Simulation & Experiential Learning,26, 165–169.
Pearce, G., & Jackson, J. (2009). Experiencing the product life cycle management
highs and lows through dramatic simulation. Journal of Marketing Educa-
tion,31(3), 212–218.
Piaget, J. (1977). The development of thought: Equilibration of cognitive structures.
New York, NY: Viking.
Plante, T. G., Lackey, K., & Hwang, J. Y. (2009). The impact of immersion trips on
development of compassion among college students. Journal of Experiential
Learning,32(1), 28–43.
Pless, N. M., Maak, T., & Stahl, G. K. (2011). Developing responsible global lead-
ers through international service learning programs: The Ulysses Experience.
Academy of Management Learning & Education,10, 237–260.
Rama, D. V., Ravenscroft, S. P., Wolcott, S. K., & Zlotkowski, E. (2000). Service-
learning outcomes: Guidelines for educators and researchers. Issues in Ac-
counting Education,15, 657–692.
Ras, E., & Rech, J. (2009). Using Wikis to support the Net Generation in improving
knowledge acquisition in capstone projects. Journal of Systems & Software,
82(4), 553–562.
Reese, J. (1997). The impact of school based community service on ninth grade
students’ self-esteem and sense of civic inclusion. Doctoral dissertation, Rut-
gers, The State University of New Jersey.
Reeve, J. R., Gull, S. E., Johnson, M. H., Hunter, S., & Streather, M. (2004). A
preliminary study on the use of experiential learning to support women’s
choices about infant feeding. European Journal Of Obstetrics, Gynecology,
And Reproductive Biology,113(2), 199–203.
Reynolds, D. (2007). Restraining Golem and harnessing Pygmalion in the class-
room: A laboratory study of managerial expectations and task design.
Academy of Management Learning & Education,6, 475–483.
Reynolds, M. (1999). Critical reflection and management education: Rehabilitating
less hierarchical approaches. Journal of Management Education,23, 537–
553.
Robert-McComb, J. J., Cisneros, A., Tacon, A., Panike, R., Norman, R., Qian, X.
P., & McGlone, J. (2015). The effects of mindfulness-based movement on
parameters of stress. International Journal of Yoga Therapy,25(1), 79–88.
Rosen, L., Zucker, D., Brody, D., Engelhard, D., & Manor, O. (2009). The ef-
fect of a handwashing intervention on preschool educator beliefs, attitudes,
knowledge and self-efficacy. Health Education Research,24(4), 686–698.
Rusu, A. S., Copaci, I. A., & Soos, A. (2015). The impact of service-learning
on improving students’ teacher training: Testing the efficiency of a tutoring
program in increasing future teachers’ civic attitudes, skills, and self-efficacy.
Procedia – Social and Behavioral Sciences,203, 75–83.
Rynes, S. L., Trank, C. Q., Lawson, A. M., & Ilies, R. (2003). Behavioral
coursework in business education: Growing evidence of a legitimacy cri-
sis. Academy of Management Learning & Education,2, 269–283.
Saraswat, A., Bach, J., Watson, W. D., Elliott, J. O., & Dominguez, E. P. (2017).
A pilot study examining experiential learning vs didactic education of
abdominal compartment syndrome. The American Journal of Surgery,
214(2), 358–364.
Sass, M. S., & Coll, K. (2015). The effect of service learning on community college
students. Community College Journal of Research and Practice,39(3), 280–
288.
Scott, R. S. (1977). An experimental testing of teaching methodologies in mar-
keting. New Horizons in Simulation Games and Experiential Learning,4,
67–71.
Seider, S. C., Gillmor, S. C., & Rabinowicz, S. A. (2010). Complicating college
students’ conception of the American dream through community service
learning. Michigan Journal of Community Service Learning,17(1), 5–19.
Siddique, Z., Ling, C., Roberson, P., Xu, Y., & Geng, X. (2013). Facilitating higher-
order learning through computer games. Journal of Mechanical Design,
135(12), 121004–121013.
Siegel, P. H., Omer, K., & Agrawal, S. P. (1997). Video simulation of an audit:
an experiment in experiential learning theory. Accounting Education,6(3),
217–230.
Specht, L. B., & Sandlin, P. K. (1991). The differential effects of experiential
learning activities and traditional lecture classes in. Simulation & Gaming,
22(2), 196–210.
Sprenger, M. (1999). Learning & memory: The brain in action. Alexandria, VA:
ASCD.
Starkey, K., & Tempest, S. (2009). The winter of our discontent: The design chal-
lenge for business schools. Academy of Management Learning & Education,
8(4), 576–586.
Stewart, A. C., Williams, J., Smith-Gratto, K., Black, S. S., & Kane, B. T. (2011).
Examining the impact of pedagogy on student application of learning: Ac-
quiring, sharing, and using knowledge for organizational decision making.
Decision Sciences Journal of Innovative Education,9(1), 3–26.
Stewart, T., & Wubbena, Z. C. (2015). Systematic review of service-learning in
medical education: 1998 to 2012. Teaching and Learning in Medicine,27(2),
146–156.
Strage, A. A. (2000). Service-learning: Enhancing student learning outcomes in a
college-level lecture course. Michigan Journal of Community Service Learn-
ing,7, 5–13.
Tagawa, M., & Imanaka, H. (2010). Reflection and self-directed and group learning
improve OSCE scores. Clinical Teacher,7(4), 266–270.
Taras, V., Caprar, D. V., Rottig, D., Sarala, R. M., Zakaria, N., Zhao, F., Jimenez,
A., . . . Zengyu-Huang, V. (2013). A global classroom: Evaluating the
effectiveness of global virtual collaboration as a teaching tool in management
education. Academy of Management Learning & Education,12, 414–435.
Teglasi, H., & Rothman, L. (2001). STORIES a classroom-based program to reduce
aggressive behavior. Journal of School Psychology,39(1), 71–94.
Ti, L. K., C., F. G., Tan, G. M., Tan, W. T., Tan, J. M. J., Shen, L., & Goy, R.
W. L. (2009). Experiential learning improves the learning and retention of
endotracheal intubation. Medical Education,43(7), 654–660.
Tomkins, L., & Ulus, E. (2016). Oh, was that “experiential learning”? Spaces,
synergies and surprises with Kolb’s learning cycle. Management Learning,
47(2), 158–178.
Tomolo, A. M., Lawrence, R. H., Watts, B., Augustine, S., Aron, D. C., & Singh, M.
K. (2011). Pilot study evaluating a practice-based learning and improvement
curriculum focusing on the development of system-level quality improve-
ment skills. Journal Of Graduate Medical Education,3(1), 49–58.
Tower, L. E., & Hash, K. M. (2013). ‘Hearing the real stories about the issues
at hand’: Politically active elders engage bachelor in social work (BSW)
students in influencing social policy. Social Work Education,32(7), 920–
932.
Trank, C. Q., & Rynes, S. L. (2003). Who moved our cheese? Reclaiming pro-
fessionalism in business education. Academy of Management Learning &
Education,2, 189–205.
Trapp, J. N., Koontz, S. A., Peel, D. S., & Ward, C. E. (1995). Evaluating the effec-
tiveness of role playing simulation and other methods of teaching managerial
skills. Developments in Business Simulation & Experiential Exercises,22,
116–123.
Van Horn, P. S., Green, K. E., & Martinussen, M. (2009). Survey response rates
and survey administration in counseling and clinical psychology. Educational
and Psychological Measurement,69, 389–403.
Veenhoven, R. (2007). Subjective measures of well-being. In: McGillivray M. (ed.).
Human well-being. Studies in development economics and policy. London,
UK: Palgrave Macmillan, 214–239.
Vince, R. (1998). Behind and beyond Kolb’s learning cycle. Journal of Manage-
ment Education,22, 304–319.
Waddock, S., & Lozano, J. M. (2013). Developing more holistic management
education: Lessons learned from two programs. Academy of Management
Learning & Education,12(2), 265–284.
Washbush, J., & Gosenpud, J. (1994). Simulation performance and learning revis-
ited. Developments in Business Simulation and Experiential Learning,21,
83–86.
Wheatley, W. J., Hornaday, R. W., & Hunt, T. G. (1986). Enhancing strategic
goal-setting skills in the business policy course. Developments in Business
Simulation & Experiential Exercises,13, 22–27.
Wolf, M., & Mehl, K. (2011). Experiential learning in psychotherapy: Ropes
course exposures as an adjunct to inpatient treatment. Clinical Psychology
& Psychotherapy,18(1), 60–74.
Wolfe, D. E., & Byrne, E. T. (1978). Programmatic experience-based learning
in an MBA program. Exploring Experiential Learning: Simulations and
Experiential Learning Exercises,5, 16–22.
Wong, M. C. S., Lau, T. C. M., & Lee, A. (2012). The impact of leadership pro-
gramme on self-esteem and self-efficacy in school: A randomized controlled
trial. PLoS ONE,7(12), 1–6.
Yorio, P. L. & Ye, F. (2012). A meta-analysis on the effects of service learning
on the social, personal and cognitive outcomes of learning. Academy of
Management Learning & Education,11(1), 9–27.
Yorke, M. (2003). Formative assessment in higher education: Moves towards
theory and the enhancement of pedagogic practice. Higher Education,45,
477–501.
Zawadzki, M. J., Danube, C. L., & Shields, S. A. (2012). How to talk about gender
inequality in the workplace: Using WAGES as an experiential learning tool
to reduce reactance and promote self-efficacy. Sex Roles,11–12, 605–616.
Zelechoski, A. D., Romaine, C. L. R., & Wolbransky, M. (2017). Teaching psy-
chology and law: An empirical evaluation of experiential learning. Teaching
of Psychology,44(3), 222–231.
Zhang, C. Q., Gangyan, S., Yanping, D., Lyu, Y., Keatley, D. A., & Chan, D. K. C.
(2016). The effects of mindfulness training on beginners’ skill acquisition in
dart throwing: A randomized control trial. Psychology of Sport & Exercise,
22, 279–285.
Zhang, D., & Campbell, T. (2012). An exploration of the potential impact of the
integrated experiential learning curriculum in Beijing, China. International
Journal of Science Education,34(7), 1093–1123.
Zigmont, J. J., Wade, A., Edwards, T., Hayes, K., Mitchell, J., & Oocumma,
N. (2015). Utilization of experiential learning, and the learning outcomes
model reduces RN orientation time by more than 35%. Clinical Simulation
in Nursing,11(2), 79–94.
Gerald (Jerry) Burch is an Assistant Professor at Texas A&M University-Commerce.
He has published research in the areas of emotional labor, entrepreneurship,
operations management, leadership, and management education.
Robert Giambatista is an Associate Professor in the Kania School of Manage-
ment at the University of Scranton. His scholarly interests include experiential
learning and behavioral skill acquisition in management education, leadership,
and team effectiveness. His teaching interests include organizational behavior,
management principles, negotiation, and ethics and social responsibility.
John H. Batchelor is an Associate Professor in the Management/MIS Department
at the University of West Florida. His research interests include entrepreneurship,
leadership, emotion, and meta-analysis.
Jana Burch is a creativity consultant with an EdD from Tarleton State University.
She has provided education and creativity training in both academic and commer-
cial settings. Her research interests are creativity, student engagement/retention,
and entrepreneurship.
Duane Hoover is a Professor of Practice at the Rawls College of Business at
Texas Tech University. His research interests include organizational behavior,
management education, and negotiation.
Nate Heller is an Associate Professor at Tarleton State University. He has con-
ducted research in management education, marketing, and leadership.