The Interdisciplinary Journal of Problem-based Learning • volume 4, no. 1 (Spring 2010), 57–82
Using Questioning to Facilitate Discussion of Science Teaching
Problems in Teacher Professional Development
Meilan Zhang,1 Mary Lundeberg,1 Tom J. McConnell,2 Matthew J. Koehler,1
and Jan Eberhardt1
Abstract
Previous research has shown that questioning is a key strategy that facilitators use to
promote discussion in Problem-Based Learning (PBL). Yet, there is a lack of detailed
understanding of what questions facilitators ask and how those questions affect
discussion. In this study we examined different types of questions that experienced
facilitators asked to promote discussion of teaching problems in professional develop-
ment for science teachers. We videotaped six PBL sessions facilitated by three pairs of
experienced facilitators. Data analysis showed that facilitators asked a set of questions
to initiate and advance PBL discourse, including questions to solicit ideas, to reframe
ideas, to clarify ideas, to push for elaboration, to check for interpretation, and to con-
nect to teachers’ classroom practice. This study has implications for the development
of PBL facilitators.
A sociocultural view of learning places great emphasis on the role of language and dis-
cussion in the process of knowledge construction (Dillon, 1994; Lemke, 1990). Discussion
is a key feature of Problem-Based Learning (PBL) (Barrows, 1988; 1996). In PBL group
discussion, collaborative knowledge construction is achieved through activating learn-
ers’ prior knowledge, identifying knowledge deficits, questioning each other, reasoning
with evidence, and reconciling multiple perspectives (Dolmans & Schmidt, 2006; Hmelo-
Silver, 2004). However, such discussion rarely occurs spontaneously. Problems and cases,
however well designed, do not teach themselves (Shulman, 1996). Facilitators play an
essential role in structuring and guiding PBL discussion (Dolmans et al., 2002; Savery,
1. Michigan State University
2. Ball State University
2006). The important role of facilitation has been documented in a few detailed analyses
of PBL tutorial processes (Glenn, Koschmann, & Conlee, 1999; Hmelo-Silver & Barrows,
2006; 2008; Palincsar, 1999).
Our previous research has shown that experienced facilitators used a variety of
strategies during PBL group discussions (Zhang, Lundeberg, McConnell, Koehler, & Eber-
hardt, 2009). In particular, we found that questioning was the strategy most frequently used
by all facilitators in all PBL sessions. Just as “Teacher questions are frequent, pervasive,
and universal phenomena” (Roth, 1996, p. 710), facilitators often use questioning to
engage participants in discussion. Given the prominence of questioning, in this study
we aimed to understand how experienced facilitators used questioning to guide PBL
group discussions.
Research on questioning has mainly occurred in K-12 classroom settings focusing on
the types of questions asked by teachers (e.g., Chin, 2007; van Zee, Iwasyk, Kurose, Simpson,
& Wild, 2001), with only a few studies that examined questions asked by PBL facilitators in
medical education (e.g., Hmelo-Silver & Barrows, 2008). Student-centered, inquiry-based
classroom teaching shares important principles with PBL because both emphasize that
learning is actively constructed by learners and students should take responsibility for
their learning. Therefore, understanding of effective teacher questioning has important
implications for PBL facilitation. In the following section, we briefly review the findings
on teacher questioning in classroom instruction, and then turn to studies on questioning
in PBL facilitation.
Using questioning to guide student thinking in classrooms
Teacher questioning is a key component of classroom discourse. According to Graesser
and Person (1994), teachers asked roughly 30-120 questions per hour, depending on the
activity types (e.g., whole-group discussion or routine seat work). Questions are not uni-
versally effective, however. Early research identified a typical classroom discourse pattern
that contains a three-part exchange: Initiation-Response-Evaluation (IRE) (Cazden, 1986;
Mehan, 1979). That is, a teacher initiates a question that typically aims to ask students to
recite what has been taught, and then a student responds to the question, which is fol-
lowed by an evaluation from the teacher. As prevalent as the IRE pattern is in classroom
discourse, it is generally considered ineffective in leading to meaningful discussion, thus
providing limited learning opportunities (Lemke, 1990).
Later, a number of studies explored the characteristics of teacher questioning that
had potential to foster productive discussion (Nystrand, Wu, Gamoran, Zeiser, & Long,
2003; van Zee et al., 2001; van Zee & Minstrell, 1997; Wells & Arauz, 2006). First, van Zee
and Minstrell (1997) described a particular kind of questioning, used by an experienced
high school physics teacher to stimulate student thinking, that they called reflective toss,
in which the teacher tried to “catch” the meaning of a student’s idea in the previous turns
and “throw” a question back to students to elicit further thinking. Unlike IRE, a reflective
toss consists of a different three-part exchange: a student statement, teacher question,
and additional student statements.
In another study of her own teaching in undergraduate courses and four other K-12
science teachers’ practice, van Zee et al. (2001) reported that during science discussion,
teachers asked questions to elicit student ideas, to diagnose and refine those ideas, to
clarify meaning, to consider multiple perspectives, and to monitor discussion and student
thinking.
Other studies explored the conditions under which questioning might lead to pro-
ductive discussion, or dialogic discourse. Researchers who studied classroom questioning
typically operationalized dialogic discourse by the number of participants involved and the
duration of sustained discussion. For example, Nystrand et al. (2003) defined productive
discussion as “free exchange of information among at least three students and the teacher
that lasted at least a half-minute during a classroom instructional episode” (p. 174).
Drawing upon 872 class observations in 112 eighth- and ninth-grade English and
social studies classes in 16 schools, Nystrand et al. (2003) identified three important
conditions that were likely to lead to dialogic discourse: 1) authentic questions asked by
teachers that did not have a predetermined answer, 2) uptake of students’ ideas (i.e., a
teacher’s question incorporated students’ contributions in previous utterances), and 3)
student questions. This study suggested the importance of teachers asking authentic
questions and building on student ideas.
Another longitudinal study on classroom discourse in elementary and middle school
science classes revealed similar insights (Wells & Arauz, 2006). Wells and Arauz studied
whole-class discussions in the classrooms of nine teachers who participated in a project
that spanned seven years. They distinguished two types of teacher questioning: Known
information questions that aimed at getting students to display what was supposed to
be known, and negotiatory questions that led to open-ended discussion. They found
that over an extended period of time of adopting an inquiry-based approach to teaching,
teachers were more likely to ask negotiatory questions than known information ques-
tions, and their follow-up moves after initial questions were more likely to show uptake
of students’ ideas. Such negotiatory questions and uptake often led to student-initiated
discussion and meaningful classroom dialogues.
In sum, two important implications can be drawn from the classroom research on
questioning. First, open-ended questions, such as authentic questions (Nystrand et al., 2003)
and negotiatory questions (Wells & Arauz, 2006), that focus on eliciting and extending
student ideas tend to be more conducive to productive discussion than close-ended,
known information questions that seek predetermined correct answers. Second, although
researchers have named the notion of contingency (Boyd & Rubin, 2006) differently, such as reflective toss (van
Zee & Minstrell, 1997), uptake (Nystrand et al., 2003), and contingent and nested queries
(Roth, 1996), they generally agree that questions that are contingent on previous student
utterances have great potential to promote rich discussions. Correspondingly, research
on questioning in PBL facilitation should not only examine the types of questions being
asked, but also the contingency of the questions on previous utterances.
In addition, much of what has been learned about questioning in classrooms is based
on studies of the questioning practice of experienced teachers. For example, the study of van
Zee and Minstrell (1997) focused on an experienced high school physics teacher; Boyd and
Rubin (2006) focused on an experienced fourth- and fifth-grade ELL teacher; Roth (1996)
studied an expert elementary teacher. Other studies focused on several experienced teach-
ers (e.g., van Zee et al., 2001; Chin, 2007). Similarly, to improve PBL facilitation, the field can
benefit from studies that examine the questioning practice of experienced facilitators.
Using questioning to facilitate discussion in problem-based learning
There have been relatively few detailed analyses of PBL facilitation. One such example
is a study by Hmelo-Silver and Barrows (2008) that analyzed two PBL group meetings guided
by an experienced facilitator, in which five second-year medical students discussed a clinical
problem concerning pernicious anemia. The authors differentiated the questions and state-
ments made by the students and the facilitator. Three types of questions raised in discus-
sion were categorized: short-answer questions, long-answer questions, and task-oriented
and meta questions. They found that the facilitator asked a total of 343 questions
in the two PBL meetings. Specifically, the facilitator asked short-answer questions (11%) to
focus student attention, long-answer questions (13%) to push for clarifications and elabora-
tions, and meta questions (75%) to evaluate hypotheses, check understanding, and monitor
group dynamics.
The studies by Hmelo-Silver and Barrows (2006; 2008) made an important contribu-
tion to understanding the complexity of effective facilitation. However, the framework
that Hmelo-Silver and Barrows (2008) used to characterize the facilitator’s questioning
originated mainly from the question taxonomy developed by Graesser and Person (1994),
with the addition of task-oriented and meta questions. Graesser and Person’s taxonomy was
developed to classify the types of questions asked during one-on-one tutoring sessions,
in which the student tutors had only a modest amount of tutoring experience. We argue
that small group discussion is different from conversation in one-on-one tutoring, and
questioning by experienced facilitators is different from questioning by inexperienced
student tutors. Therefore, new research is needed to develop a questioning framework
that is sensitive to the context of PBL.
In addition, in their studies, the students had used PBL curricula for two years, so
they were experienced with the PBL process. Yet, it remains unclear how to facilitate
learners who are new to PBL. Moreover, most PBL studies, including the few that focused
on facilitators’ questioning, were conducted in medical education, which is not surpris-
ing considering that PBL originated in medical schools. However, as the use of PBL
expands into other fields, more studies on PBL facilitation in other contexts
are clearly needed.
In this study, we examined questioning in PBL facilitation in the context of teacher
professional development (PD). We designed a professional development model using
the PBL approach to improve teachers’ content knowledge and pedagogical content
knowledge. This PD model included a two-week summer workshop and a year-long ac-
tion research project. In summer, the first week focused on developing teachers’ content
knowledge by using the PBL process to solve science problems in areas such as physics,
earth science and biology. In the second week, teachers used the PBL process to analyze
teaching problems developed by the PD designers. During the school year, teachers
conducted an action research project to investigate a problem from their own classroom
practice. This PD model, with some variations, had been implemented over four years with four
cohorts of teachers. Drawing on data collected from the second year, this study focused
on the second week in summer when teachers analyzed teaching problems guided by
facilitators. We asked two research questions: What types of questions did experienced
facilitators ask to promote discussion of science teaching problems in professional devel-
opment? How were the questions contingent on teachers’ ideas in previous utterances?
Methods
Participants
Participants were six facilitators who paired up to facilitate two groups of teachers using
the same teaching problem, as shown in table 1. One served as the lead facilitator with the
other as assistant. The lead facilitator assumed major responsibility for guiding the group
discussion, while the assistant facilitator mainly helped to record teachers’ ideas on charts
and sometimes asked questions or made statements. On average, the lead facilitators used
about 32% of speaking turns in the group meetings, and assistant facilitators used about
8%. A total of 35 K-12 science teachers participated in the study, including 14 elementary
teachers, 13 middle school teachers, and 8 high school teachers. Of the 35 teachers, 12
had 1-3 years of teaching experience, 10 had 4-10 years, and 13 had 11 or more years.
Except for one assistant facilitator (Facilitator 4), who was a doctoral student with
five years of teaching experience, the facilitators were experienced science teachers or
teacher educators, with teaching experience ranging from 15 to 34 years. Of more than 20
facilitators who were involved in the PD, these five facilitators were the leading designers
and implementers. In the first year of the PD project, all facilitators received training from
PBL experts from the medical school at a large Midwestern university. They also observed
PBL meetings in the medical school. Prior to the summer workshop, they practiced facili-
tation of content and pedagogical problems during weekly facilitator meetings. Because
of the prominent role that the lead facilitators played in guiding the group discussions,
we describe in detail their backgrounds in the following paragraph. Overall, given their
extensive experience in science teaching, leading small group discussions, using PBL or
inquiry-based teaching, and facilitating teacher professional development, we considered
them to be experienced facilitators.
Facilitator 1 held a doctoral degree in education and had 25 years of teaching ex-
perience, including 7 years at the high school level and 18 years at the college level. She
had led small group discussions in multiple professional development projects. She had
also used PBL in her preservice teacher education courses for two years. Facilitator 2 held
a doctoral degree in biochemistry and had 15 years of teaching experience, including 2
years at the middle school level and 13 years at a large Midwestern university. She had
also facilitated multiple professional development sessions and had implemented PBL
in her science methods courses for preservice teachers for two years. Facilitator 3 held
a master’s degree in curriculum and instruction and had 34 years of teaching experience,
including 29 years at the high school level and 5 years at a large Midwestern university.
She had used an approach similar to PBL in her teaching for more than 20 years and had
led multiple professional development workshops. She was also director of a state-wide
nonprofit organization that involved more than 550 middle schools and high schools in
its science tournament events.
Problems
Three instructional problems related to science teaching were used in this study, as shown
in the appendix. In a problem called Circuits,1 the central issue was how to help students
move from vague ideas to scientific understanding of electric circuits. In a problem called
Falling object, the teacher in the scenario was struggling to help her students handle dis-
crepant data from experiments of dropping objects to the floor (Mikeska & Stanaway, 2006).
In a problem called Weather map,1 the main issue was how to design group tasks that
could promote collaboration among students.

Table 1. Facilitators, problems, and groups

Lead facilitator   Assistant facilitator   Problem          Day 1                   Day 2
Facilitator 1      Facilitator 4           Circuits         Group 1 (9 teachers)    Group 2 (10 teachers)
Facilitator 2      Facilitator 5           Falling object   Group 2 (10 teachers)   Group 1 (9 teachers)
Facilitator 3      Facilitator 6           Weather map      Group 3 (8 teachers)    Group 4 (8 teachers)
Facilitator 7*     Facilitator 8*          Evolution        Group 4 (8 teachers)    Group 3 (8 teachers)

* We did not study Facilitators 7 and 8’s questioning practice due to the poor quality of video recordings of their group meetings.
The group meeting started with reading the text-based problem scenario. Then the
facilitators guided teachers to analyze the problem, discussing what was known (facts),
what was unknown and needed to be learned (learning issues), and what accounted for the
problematic phenomena (hypotheses). During discussion the assistant facilitator recorded
teachers’ ideas on charts. Next, teachers watched about 10 minutes of video vignettes
concerning the classroom practice of the teacher in the problem scenario. The video clips
were selected from published resources, from which facilitators designed relevant teach-
ing problems. Teachers continued to discuss the problem after video watching, adding
new facts, learning issues, or hypotheses. They then conducted individual research for
about half an hour on some learning issues using online resources and books. Finally,
they shared what they learned with the group and proposed solutions for the problem
under consideration.
Data sources and analysis
The major data sources for this study were videotapes of the six PBL group meetings. Each
group meeting lasted about 2-3 hours, resulting in a total of about 15 hours of videotapes.
The videos and associated transcripts were entered into a database in Transana software,
which allows synchronization between videos and transcripts. Analysis was based on the
written transcripts, with the video used to examine specific segments. The transcripts
were prepared and entered into a spreadsheet for analysis (Meyer & Avery, 2009). The
entire turn of each speaker appeared in one row of the spreadsheet with a label for the
speaker. A change of speaker generated a new speaking turn. The unit of analysis was the
facilitators’ speaking turn. Coding was applied in the same row but a different column of
the spreadsheet.
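As an illustration of this tabulation step only (the article does not report the software used to tally the codes), a minimal Python sketch is shown below. It assumes a hypothetical CSV export of the coding spreadsheet with columns named session, speaker, and code; these names, and the file name, are assumptions rather than details from the study.

    import csv
    from collections import Counter, defaultdict

    # Hypothetical export of the coding spreadsheet: one row per speaking turn,
    # with a "session" column (e.g., "Circuits 1"), a "speaker" column, and a
    # "code" column holding the question type assigned to a facilitator turn
    # (left empty for turns that contained no coded question).
    def tally_question_types(path):
        counts = defaultdict(Counter)  # session -> Counter of question types
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["code"]:
                    counts[row["session"]][row["code"]] += 1
        return counts

    def as_percentages(counter):
        # Convert raw frequencies into percentages of that session's questions.
        total = sum(counter.values())
        return {code: round(100.0 * freq / total, 1) for code, freq in counter.items()}

    if __name__ == "__main__":
        by_session = tally_question_types("coded_turns.csv")  # hypothetical file name
        for session, counter in sorted(by_session.items()):
            print(session, dict(counter), as_percentages(counter))

Counts and percentages of the kind reported later in table 3 could be produced by a tabulation of this sort.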
Coding of facilitators’ questioning strategies was an iterative process. The first author
generated some initial categories of questioning when she observed some of the group
meetings during the summer PD. After the videos of the six group discussions were tran-
scribed, the first author reviewed the video and read the transcripts several times to become
familiar with the content of the videos. Interaction analysis (Jordan & Henderson, 1995)
suggested that to facilitate video analysis, it is useful to identify the temporal structure
of events in the video. Accordingly, the transcript of each group meeting was parsed
into major phases including getting started, problem identification, problem analysis,
and sharing research findings. The stage of teachers conducting individual research on
learning issues was not videotaped. We identified facilitators’ questions in each phase.
Nonetheless, problem analysis was the major phase in each group meeting and consumed
the majority of facilitators’ speaking turns.
In labeling questions in the transcripts, we included all facilitators’ utterances that
contained a grammatical form of questions, for example, beginning with interrogative
words such as what, when, where, why, who, how, do/does, and are/am/is, or ending with
a rising intonation, such as “other ideas?” Sometimes a question took the form of a state-
ment, such as “tell me more about it.” All facilitators’ questions were identified and coded
except for procedural questions, such as asking a teacher to read the problem scenario
aloud, to repeat what he or she had just said, or to assign learning issues for individual
research. We also did not code instances in which facilitators called the name of a teacher
who had already indicated an intention to talk, simply to grant him or her a speaking turn.
These questions helped to maintain the process but carried little pedagogical meaning and
occurred only a few times in one group discussion. In total we coded 435 facilitators’
questions in the six discussions.

Table 2. Coding scheme for different types of questioning (T = Teacher; F = Facilitator)

More frequent
  Soliciting ideas: Solicit ideas from the whole group on problems, facts, hypotheses, learning issues, research findings, and recommendations. (F: “Any facts, observations or learning issues?” F: “Any other ideas?”)
  Reframing ideas: Reframe a teacher’s idea into a learning issue or hypothesis, or occasionally a problem or a fact; mainly occurred in the problem analysis phase. (F: “Is that a learning issue?” F: “I am hearing an if/then statement. Because?”)
  Clarifying ideas: Ask teachers to clarify an idea that was unclear to the facilitator. (T: “So, as the person plan the inquiry, then present it, then initiate...” F: “What does it mean by present?”)
  Pushing for elaboration: Ask teachers to elaborate on an idea. (F: “I’d like to push your thoughts.” F: “I am not sure I understand that. Can you elaborate that?”)
  Checking for interpretation: Ask a teacher to confirm whether the facilitator’s interpretation of his or her idea was accurate. (F: “Is that what you are saying?” F: “Did I restate that OK?”)

Less frequent
  Calling on individuals: Invite responses from an individual teacher, to engage less vocal teachers in the discussion. (F: “Mindy, what do you think about this?”)
  Connecting to practice: Ask teachers to share experience on how to handle problems in their practice. (F: “How do you handle that in the classroom? What do you do?”)
  Tossing back: Throw a question that was addressed to a facilitator back to the group. (F: “Can anyone answer that question?”)
Through multiple viewings and readings, the first author developed an initial coding
scheme to characterize the types of questions asked by the facilitators. The scheme was
discussed in the project research meetings, in which other researchers provided feedback
on the scheme. The first author used the refined coding scheme, as shown in table 2, to
code all of the facilitators’ questions identified. Another researcher who was familiar with
the coding scheme independently coded 17% of the entire dataset, and the inter-rater reli-
ability was 91%. Disagreement was resolved through discussion and video viewing.
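The article does not state how the 91% figure was computed; assuming it is simple percent agreement on the double-coded subset, a minimal sketch of that calculation is shown below. Cohen’s kappa is included only as a stricter, chance-corrected alternative, not a statistic the authors report, and the labels in the toy example are hypothetical.

    from collections import Counter

    def percent_agreement(coder_a, coder_b):
        # Share of double-coded questions given the same label by both coders.
        matches = sum(a == b for a, b in zip(coder_a, coder_b))
        return 100.0 * matches / len(coder_a)

    def cohens_kappa(coder_a, coder_b):
        # Chance-corrected agreement; stricter than raw percent agreement.
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical labels for five double-coded facilitator questions.
    first_coder = ["soliciting", "reframing", "clarifying", "soliciting", "elaboration"]
    second_coder = ["soliciting", "reframing", "clarifying", "reframing", "elaboration"]
    print(percent_agreement(first_coder, second_coder))  # 80.0 for this toy example
    print(cohens_kappa(first_coder, second_coder))       # about 0.74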
We identified five types of questions that occurred relatively frequently (greater
than 10% of total questions) and three types that occurred less frequently (less than
2%). In naming the questions, we considered both the context of this study and relevant
literature. For example, we found facilitators frequently asked one type of question in
the phases of problem identification, problem analysis, and sharing research findings to
solicit ideas from the group. Van Zee et al. (2001) described a similar type of question that
teachers asked to elicit students’ experiences. Thus we labeled this type of questioning
“soliciting ideas.”
Coding for contingency was relatively straightforward and was conducted by the first
author only. Similar to what Boyd and Rubin (2006) reported, the process of determining
a question’s contingency “was rarely difficult” (p. 152), as the contingency of a facilitator’s
question was often inherent in the type of question. For example, the questions that facili-
tators asked teachers to clarify or elaborate on ideas were always contingent on teachers’
previous utterances. Through contingency analysis, we found that facilitators often asked
a sequence of contingent questions to access and advance teachers’ ideas.
In addition, at the end of the summer PD, a questionnaire was administered to 35
teachers to measure how well the PD was implemented, in which facilitation was one of
the PD components that was evaluated (Science and Mathematics Program Improve-
ment, 2006). From the survey, we identi ed items related to the PD objectives and the
performance of facilitators. We present descriptive data such as mean scores and standard
deviations based on teachers’ ratings on those items on a 5-point Likert scale, along with
examples of teachers’ comments. Teachers’ evaluation was one way to triangulate the
findings on facilitators’ questioning practice, as the ultimate goal of facilitators’ discourse
moves was to achieve the PD objectives.
Results
Questioning was frequently used by the facilitators in each of the six PBL sessions we
studied. The facilitators asked questions to access, probe, and deepen teacher thinking.
Specifically, we found the following types of questioning commonly used in all sessions:
1) soliciting ideas, 2) reframing ideas, 3) clarifying ideas, 4) pushing for elaboration, and 5)
checking for interpretation. As shown in table 3 and figure 1, these five types of question-
ing accounted for more than 90% of total questions asked by the facilitators.

Table 3. The percentages of questioning strategies used by facilitators in the six PBL sessions (Freq. = frequency; % = percentage of questions in total)

Strategy                      Circuits 1      Circuits 2      Falling object 1   Falling object 2   Weather map 1   Weather map 2
                              Freq.   %       Freq.   %       Freq.   %          Freq.   %          Freq.   %       Freq.   %
More frequent
Soliciting ideas              28    27.7%     20    27.4%     22    36.7%        29    33.3%        23    36.5%     12    23.5%
Reframing ideas               28    27.7%     20    27.4%      5     8.3%         7     8.0%        11    17.5%     19    37.3%
Clarifying ideas              12    11.9%     16    21.9%     14    23.3%        14    16.1%        18    28.6%     11    21.6%
Pushing for elaboration       16    15.8%      8    11.0%      9    15.0%        15    17.2%         1     1.6%      4     7.8%
Checking for interpretation    8     7.9%      9    12.3%      8    13.3%        17    19.5%         6     9.5%      4     7.8%
Less frequent
Calling on individuals         4     4.0%      0     0.0%      2     3.3%         1     1.1%         1     1.6%      0     0.0%
Connecting to practice         5     5.0%      0     0.0%      0     0.0%         4     4.6%         0     0.0%      0     0.0%
Tossing back                   0     0.0%      0     0.0%      0     0.0%         0     0.0%         3     4.8%      1     2.0%
Total                        101             73              60                 87                 63              51

Figure 1. Average percentage of questioning types used by facilitators in the six PBL sessions. [Bar chart; average percentages: soliciting ideas 30.9%, reframing ideas 21.0%, clarifying ideas 20.6%, pushing for elaboration 11.4%, checking for interpretation 11.7%, calling on individuals 1.7%, connecting to practice 1.6%, tossing back 1.1%.]
In addition, the following types of questioning either appeared less frequently or were
used only by some facilitators in some sessions: 6) calling on individuals, 7) connecting to
practice, and 8) tossing back. Although these questions were infrequently asked overall,
they still had important pedagogical value in guiding the PBL discussion. Facilitators asked
these questions to engage less vocal teachers, to connect to classroom practice, and to
maintain a role as facilitators rather than content experts.
Figure 1 shows the average percentage of the different questioning types. Figure 2 shows
the percentage of contingent questions and noncontingent questions. About two thirds
of facilitators’ questions were contingent questions. Specifically, of the eight types of ques-
tions, soliciting ideas (from the group) and calling on individuals were not contingent on
teachers’ previous utterances. Questions of reframing ideas, clarifying ideas, pushing for
elaboration, checking for interpretation, connecting to practice, and tossing back were
contingent upon teachers’ previous utterances.

Figure 2. Average percentage of contingent and non-contingent questioning types used by facilitators in the six PBL sessions. [Bar chart: non-contingent questions 32.5%, contingent questions 67.5%. Note. Noncontingent questions included soliciting ideas and calling on individuals. Contingent questions included reframing ideas, clarifying ideas, pushing for elaboration, checking for interpretation, connecting to practice, and tossing back.]

In addition, the content of the questions of soliciting ideas and reframing ideas was
unique to PBL discourse. In other types of discourse, facilitators or teachers could also
solicit and reframe ideas, but not necessarily ideas on problem analysis and reasoning.
Also, reframing ideas mainly occurred in the problem analysis phase, while the other questions
could occur in all phases. Table 4 summarizes the characteristics of the different questions in
terms of type, frequency, contingency, uniqueness to PBL discourse, and phase.

Table 4. Summary of questioning strategies used by facilitators in the six PBL sessions

Type                          Frequency     Contingency       Uniqueness to PBL discourse   Phase
Soliciting ideas              Frequent      Non-contingent    Unique                        All*
Reframing ideas               Frequent      Contingent        Unique                        Problem analysis
Clarifying ideas              Frequent      Contingent        General                       All
Pushing for elaboration       Frequent      Contingent        General                       All
Checking for interpretation   Frequent      Contingent        General                       All
Calling on individuals        Infrequent    Non-contingent    General                       All
Connecting to practice        Infrequent    Contingent        General                       All
Tossing back                  Infrequent    Contingent        General                       All

* All refers to the phases of problem identification, problem analysis, and sharing research findings. We did not study the individual research phase.
Next, we explain each type of question and provide illustrative examples. Previous
research defined productive discussion as involving three or more participants and lasting
half a minute or longer (Nystrand et al., 2003). Given space limitations, we provide one
such episode for the major types of questions to show that the facilitators’ questions had
potential to stimulate productive discussion. Pseudonyms were used for teachers.
Soliciting ideas
Questioning to solicit ideas was frequently used by the facilitators at each PBL phase to
start, continue, or redirect a discussion. However, facilitators did not just solicit ideas in
general, but ideas that reflected characteristics of PBL, that is, ideas on problems, facts,
learning issues, hypotheses, research findings, and recommendations. It was used at the
beginning of a PBL session to start a discussion by asking teachers to restate the problem,
to identify facts, or to generate learning issues and hypotheses. Soliciting new ideas was
also used to continue discussion. In general, this type of question was not contingent on
teachers’ previous utterances. It often signaled a transition to a new discussion topic.
First, at the beginning of a PBL session, usually after the group read the text-based
problem scenario, facilitators asked the group to clarify the problem, as shown in the fol-
lowing example. Because PBL often starts with an ill-structured problem, clarification of the
problem at the outset is critical, which directly influences the subsequent discussion.
Facilitator 2: Let’s restate the problem first. Let’s agree on what the problem is. What
is Nancy (the case teacher) asking us? [Falling object 1]
Second, during the problem analysis phase, facilitators asked questions to solicit
teachers’ ideas about facts, learning issues, and hypotheses in relation to the problem, as
shown below. These questions often started a discussion effectively.
Facilitator 1: Are there some learning issues that you’d like to address or some hy-
potheses? [Circuits 2]
Third, when discussion appeared to have reached an impasse, facilitators often initi-
ated another call for ideas on facts, learning issues, or hypotheses, commonly expressed
as “What else do we know?” or “Any other ideas?” Depending on the group dynamics,
facilitators could ask these questions several times to continue the discussion.
Finally, in the sharing research phase, facilitators asked questions to solicit findings
about the learning issues that teachers studied. Facilitators often named a specific learning
issue (e.g., checking students’ understanding) and asked the group for responses.
Facilitator 3: Anybody else came across anything that deals with helping me check
my students’ understanding or anything else? [Weather map 2]
In sum, questions of soliciting ideas were used in each PBL phase to initiate and con-
tinue discussion. These questions helped to access teacher thinking and make it explicit,
which laid the groundwork for further clarification and elaboration. Facilitators also used
this type of question to structure a session and align the discussion with PBL discourse,
which was characterized by problem identification, problem analysis, identifying knowl-
edge deficits, and reasoning through hypothesizing.
Reframing ideas
Reframing ideas referred to the questions that facilitators asked to frame a teacher’s idea
as a learning issue or a hypothesis, and occasionally a fact, or a problem. Because teachers
in this study were new to PBL, they were unfamiliar with the process of forming learning
issues or developing hypotheses. As a result, teachers themselves rarely spontaneously
posed a learning issue or hypothesis. Typically, facilitators recognized an idea that emerged
in discussion as a potential learning issue or hypothesis and asked teachers to confirm
or restate. The reframing questions were often expressed as: Is that a learning issue (or
hypothesis)? Or, how could you rephrase it as a learning issue (or hypothesis)? This type
of question almost exclusively occurred in the problem analysis phase.
One such example was from Circuits 1. After Karen raised an idea of how to help
students develop an understanding from doing the circuits activities, Facilitator 1 asked
if it could be a learning issue. Nina then further explained the idea.
Karen: Well, they had all of these activities so that, with the activities, how do you
then make it an understanding of the current, of the circuits?
Facilitator 1: So, is that a learning issue?
Nina: You want to know how to apply those together if you want to connect them.
A critical practice in the PBL discussion was to develop hypotheses, which included
three parts of pedagogical reasoning: if (a condition or solution is present), then (expected
results will follow), because (reasons that the condition or solution will lead to the results).
Teachers often had difficulty articulating a complete hypothesis, particularly the “because”
part. Therefore, facilitators often pushed teachers to explain why a condition or solution
might lead to the expected results. One such example was from Circuits 1, in which Facilita-
tor 1 reframed the teachers’ idea into a hypothesis (turn 6) and pushed teachers to complete
the “because” part of a hypothesis. The practice of articulating “because” helped teachers
deepen their original thinking.
Episode 1 (2.5 minutes): “If…, then…, because…?” (Circuits 1)
1. George: I have another question. I’m sorry, but do we see hands when she [the
teacher in the video] asked the questions or did she just call on people?
2. Bob: She just called on people.
3. George: And that’s what I think the difference too, because she was making
them be responsible. Ok, let the certain person answer all the questions or
whatever.
4. Kate: She gave them the opportunity to respond.
5. George: Yeah, because they knew that they had to be on their toes or whatever.
6. Facilitator 1: So, that’s really another hypothesis, isn’t it? I heard an “if” and
“then.” I heard you say “if teachers call on students, then that makes them
more responsible for their learning and being on task.” Because?
7. George: They’ll be prepared to be on their toes.
8. Karen: How do we give students time to formulate thinking about the learning
issue?
…
9. Kate: Well, she gave them time to think and I’m making an assumption here
that she might go back to them and say “Ok, were you right or were you
wrong?” So students are given the opportunity to test their own hypothesis
and come to their own conclusions, so they gain a greater understanding.
10. Facilitator 1: Because?
11. Kate: Because they had to find out the answer themselves.
12. George: Well, if she doesn’t tell them the answer.
13. Bella: Because they’ve taken more responsibility.
14. Kate: And because they, yeah, they’ve taken a greater control in their learning.
15. Facilitator 1: That works nice. A greater control in their learning. Write that
down.
Developing hypotheses and generating learning issues are critical skills in PBL, and
these skills take time to develop. Because the teachers were inexperienced with the PBL
approach, it was essential for the facilitators to model the process at the early stage.
Clarifying ideas
Questioning to clarify ideas was frequently used in each PBL session. The questions ranged
from clarifying the meaning of specific words to clarifying a speaker’s whole utterance. This type
of question was often expressed as “What does it mean by…?” “Let’s be more specific,”
or “I am not clear what you meant, could you tell me more?” The clarifying question was
usually followed by teachers’ clarification. The contingency of a clarifying question was
inherent because facilitators had to refer to a teacher’s idea in previous utterances to ask
such a question.
The following excerpt from Weather map 2 illustrated how clarifying questions fos-
tered meaningful discussion in problem analysis and eventually led to the development
of an important learning issue. In this episode, Facilitator 3 played the role of the teacher
in the video. First, a teacher, Kevin, made an observation that in the problem scenario
there was no direction for group work. Two other teachers, Julie and Leslie, agreed with
the idea and added to it. However, their statements were somewhat vague, so Facilitator
3 asked them to clarify their meaning (turn 7). Then Julie explained that although four
students worked together as a group, their task was not collaborative in nature. Another
teacher, Kara, paraphrased the idea that it was an individualized task which occurred in a
group setting. Next, the group developed a learning issue of how to design a group task
that could promote meaningful collaboration.
Episode 2 (1.5 minutes): Individual task in group work (Weather map 2)
1. Kevin: One thing I see in our problem too, is she asked them to get in a group
and work, but there’s no rubric, no directions on how to work, there’s no
leader, who’s going to do the task, or there’s just no clear directions on how
they are supposed to work together in a group.
2. Facilitator 6: Kevin, phrase that as a problem.
3. Kevin: Problem, no directions for group.
4. Julie: Well, that is what I was thinking too, that they are working as a group but
really individualize on an aspect.
5. Leslie: It looks like it could easily be an individual thing. And now you’ve got
four kids sitting together.
6. Julie: Right.
7. Facilitator 3: See, now, let’s focus on that a little bit, because I am not really clear
by what you meant by that. Could you tell me more?
8. Julie: For me to look at this, I mean, I think that if one student can do it on their
own without having to work with the three other students.
9. Facilitator 3: So my design of my [group] task isn’t really [collaborative].
10. Julie: Right. Did each person have a different role and things like that? Did
they have to gather different information?
11. Leslie: Or do you want them to [inaudible]?
12. Julie: Right. Some of the kids—I know that in my class some want to do it on
their own because they want to get, you know they’re going to put in all their
work and they want to make certain they have it perfect, and, so,
13. Facilitator 6: Those are good thoughts, I’d like to capture them too, so tell me
what to write.
14. Kara: It’s an individualized task but in a group setting.
It was very common that teachers’ initial ideas tended to be vague and incomplete.
Clarifying questions helped teachers articulate their thinking and develop complete and
specific learning issues and hypotheses. Yet, facilitators should be mindful in deciding
which ideas to focus on, and to what extent to push for clari cation.
Pushing for elaboration
Unlike clarifying questions, which focused on helping teachers articulate their vague ideas,
questions for elaboration aimed to deepen or expand teachers’ original ideas, which were
often clearly stated already. This type of question had great potential for stimulating rich
discussion. The following excerpt was such an example, in which Facilitator 1 pushed teach-
ers to elaborate on an idea that emerged in discussion. The original idea was about how
the teacher in the problem scenario grouped students. Because grouping was a common
problem facing teachers in their practice, the facilitator pushed teachers to elaborate on
their ideas. Through elaboration, teachers shared multiple perspectives on dealing with
the grouping problem.
Episode 3 (2.5 minutes): How did the teacher group students? (Circuits 2)
1. Lily: I wonder if she grouped students intentionally because I noticed one
boy didn’t get it and the other boy was explaining, and I wonder if she had
students working together to help each other and then she was also coming
in to help.
2. Facilitator 1: So is that a learning issue or is that, is that,
3. Lily: Oh, I think that is “what do you need to know.” I think it would be
interesting to know if that’s part of the grouping.
4. Sarah: She needs to carry it further, like what they say in the parallel they don't
share wires but they do share the same battery. Where is the “why” there?
“Well, why would that make a difference?” “But why, why do you think that’s,”
…
5. Facilitator 1: I’d like to push your thought.
6. Lily: Which part of our thought?
7. Facilitator 1: The last thought that you had, how does she group her students?
8. Lily: Yeah.
9. Facilitator 1: Is that really a learning issue that is worth some research? How is it
you group students to affect relationships?
10. Lily: Yeah, because I think you group kids, but they help other kids. You know
that kids aren’t going to learn everything from you. Some are going to learn
from other kids and they are going to learn by doing and,
11. Facilitator 1: Right.
12. Lily: And then there comes that issue where kids choose their groups for them
and then do you want them at the same level? I mean it is,
13. Ariel: Sometimes you do want them at the same level like their reading group.
14. Lily: Yeah, but there are other times you want them to be, to help each other
out. That was an interesting interaction [in the video] where one boy was
explaining it, and I wonder if the other groups were like that.
… [More discussion on this topic was omitted due to space constraints.]
Checking for interpretation
This type of question was used after a facilitator restated a teacher’s idea to confirm
whether her interpretation was accurate. As shown in the following examples, these
confirming questions were typically stated as: “Is that what you are thinking?” or “Is that
what you said?”
Facilitator 1: Yes, that is powerful. So what we don’t know helps drive us to learn. Is
that what I heard you just say? [Circuits 1]
Facilitator 5: So, if there is some discrepancy, then do a whole group [discussion]
again. Is that what you are thinking? [Falling object 1]
The checking for interpretation questions seemed to contribute to two goals. First, in
combination with a paraphrase, it helped to clarify meaning to other teachers. Second, it
showed respect to the teachers by asking them to confirm whether the restatement was
accurate, which might enhance the teachers’ sense of ownership for the original idea.
In addition to the questions above, facilitators sometimes called on individual teach-
ers to engage less vocal members in discussion (e.g., “So, Mike, did you have things that
you wanted to get up here?”) or to seek help from strong members (e.g., “George, you
want to help out?”). Also, occasionally, facilitators connected discussion to teachers’ class-
room practice by asking: How do you handle this problem in your own teaching? These
questions helped to make the discussion relevant to teachers. Typically, the facilitators
recognized an instructional issue that emerged in discussion as important and asked the
teachers how they handled it in their classroom. In addition, teachers sometimes asked a
facilitator a content question. However, the facilitator did not answer it herself but asked
the group to respond, which we referred to as tossing back. Tossing back was a straight-
forward but useful technique that facilitators could use when they did not want to be
seen as “content experts.” Research showed that such an expert role had a detrimental
effect on group discussion (Kaufmann & Holmes, 1998). Participants were less willing to
contribute ideas when they considered their facilitator to be an expert.
Teachers’ evaluation of the group discussion and facilitation
According to the evaluation survey, the teachers perceived the major objectives of the PBL
sessions were well met, as shown in table 5. Specifically, the teachers reported that the PD
increased their reasoning skills, encouraged them to think deeply about their teaching,
and improved their ability to identify issues and challenges in science teaching and apply
appropriate solutions. They also felt prepared to use PBL as a tool to analyze their own sci-
ence teaching practices. In addition, teachers highly valued the performance of the facilitators.
They reported that the facilitators were effective in organizing sessions so that they were
actively involved and effective in communicating ideas and issues.

Table 5. Teachers’ evaluation of learning and facilitation (N = 35)

Increasing reasoning skills and encouraging teachers to think deeply about their teaching.(1) Mean 4.7, SD .64. Comments: “I examined each piece of my teaching style and assumptions closely.” “I thought more deeply at this conference than any other PD I have done!”

Identifying issues and challenges in science teaching and applying appropriate solutions.(1) Mean 4.4, SD .78. Comments: “It really makes you look at your strategies and how you can improve them.” “Requires thinking, explaining, listening, and researching. A good situation.”

Increasing participants’ abilities to analyze and refine science teaching practices using teaching problems.(1) Mean 4.4, SD .84. Comments: “We had to think a lot about our current practice.” “I think I am better able to analyze than when I started.”

Applying problem-based learning and science knowledge to teaching problems.(1) Mean 4.2, SD .86. Comments: “Develops deeper understanding.” “This is a work in progress. I am working on it and thinking how.” “Too much information in too little time.”

Providing a professional learning environment for teachers to acquire new knowledge of instructional practice.(1) Mean 4.7, SD .66. Comments: “Awesome!! I wish the PDs at my district were half as good.” “Excellent at this. I think there is a learning community, a fun outlook, and facilitators that are knowledgeable and helpful.”

Preparing participants to use problem-based learning as a tool to analyze their science teaching.(1) Mean 4.1, SD .99. Comments: “I feel very confident. The facilitators walked us through the steps to analyze.” “I am fired up to go to my classroom, video myself and analyze my strengths and weaknesses.”

Facilitators were effective in communicating ideas and issues.(2) Mean 4.7, SD .68. Comments: “I interacted with most of the facilitators and found them extremely positive, knowledgeable, and friendly.” “The facilitators were exceptional people with great ideas and empowered my own confidence in teaching science.”

Facilitators were effective in organizing sessions so that I was actively involved.(2) Mean 4.7, SD .68. Comments: “Everything was so well-organized. No time was wasted.” “We were all busy and valued.”

(1) Teachers were asked to rate to what degree each of the PD objectives was accomplished (1 = not achieved at all; 5 = very well achieved). (2) Teachers were asked to rate to what degree they agreed with each statement (1 = strongly disagree; 5 = strongly agree).
Discussion
In this study we described a set of questions that PBL facilitators asked to promote dis-
cussion of teaching problems in professional development for science teachers. Through
multiple examples, we illustrated the context under which each type of question func-
tioned. In sum, facilitator questioning played a vital role in getting discussion started and
advancing discussion. Next, we discuss the findings around three themes.
First, facilitators structured the discussion in line with PBL discourse that centered on
problem analysis and pedagogical reasoning through soliciting and reframing ideas on
problems, facts, learning issues, hypotheses, findings, and recommendations. Such questions
made the discussion in this study distinct from video-supported case discussion in other PD
contexts (e.g., Borko, Jacobs, Eiteljorg, & Pittman, 2008; Sherin & van Es, 2009). For example, in
the study of Sherin and van Es (2009), the facilitators primarily focused on sharpening teachers’
“professional vision”—the ability to notice student thinking. Such discussion was not problem-
driven and did not necessarily involve identifying knowledge gaps (learning issues).
This study provides an interesting comparison to the study of Hmelo-Silver and Bar-
rows (2008), in which the medical students were experienced PBL learners. Teachers in
this study were new to PBL. For most of them, it was the first time that they encountered
the PBL approach. Thus, they were unfamiliar with the PBL process, such as generating
learning issues or developing hypotheses. As a result, facilitators needed to help them
reframe their ideas into learning issues and hypotheses. Hmelo-Silver and Barrows (2008)
characterized the facilitator’s questions of defining learning issues as self-directed learning.
They found about 5% of the facilitator’s questions were related to self-directed learning,
while in this study we found about 21% of facilitators’ questions were reframing questions.
The findings of these two studies suggest that facilitators’ guidance should be flexible
and take into account learners’ prior knowledge and the stage of a PBL group. As shown
in this study, perhaps facilitators need to provide considerable guidance to learners who
have limited prior experience with PBL at an early stage of a PBL group. After learners
gain more experience and are capable of conducting problem analysis and reasoning
independently, facilitators can gradually fade their scaffolding. As shown in the study of
Hmelo-Silver and Barrows (2008), experienced PBL students were more able to develop
hypotheses and generate learning issues without the facilitator’s prompting. Students
were also more able to sustain their discussion without the facilitator’s involvement.
Second, facilitators in this study helped teachers develop ideas through reframing,
clarification, and elaboration. Facilitators asked contingent questions that were built on teach-
ers’ original ideas. The majority (67%) of facilitators’ questions in this study were contingent
on teachers’ ideas. Such contingent questions helped teachers articulate vague thoughts,
consider alternative perspectives, and engage in pedagogical reasoning. Classroom research
on questioning showed that contingency is an important feature of teacher questions that
lead to productive classroom discussion (Boyd & Rubin, 2006; Nystrand et al., 2003). This
study showed that, like experienced teachers leading classroom discussions, PBL facilita-
tors also asked contingent questions that were built on participants’ responses.
Regarding the finding that facilitators asked different types of questions, one natu-
rally wonders, “Which question is more effective?” However, this study suggests that dif-
ferent types of questions have different functions. For example, questions of soliciting
ideas helped to get discussion started. Questions of reframing ideas helped to structure
the discussion in line with PBL discourse. Questions of clarifying and elaborating ideas
helped to improve weak and incomplete ideas. Thus, perhaps a more productive ques-
tion about questioning is, “How should questions work together to advance ideas?” This
study suggests a sequence of questions that facilitators can ask to access, probe, clarify
and deepen teacher thinking. In short, facilitators can access teachers’ initial thinking by
asking questions to solicit ideas, clarify thinking by asking them to articulate, and deepen
thinking by asking them to explain or elaborate.
Third, facilitators in this study rarely evaluated teachers’ responses as right or wrong,
perhaps because the problem was fairly ill-structured, and facilitators’ questions were
open-ended and did not necessarily have a correct answer. Instead, they often interpreted
teachers’ ideas and asked them to confirm the accuracy of their interpretations. This
follow-up move to participants’ responses to their previous questions helped to break
the IRE sequence found prevalent in classroom discourse (Lemke, 1990; Mehan, 1979).
Such a follow-up move, together with other questioning strategies discussed earlier, had
potential to promote dialogic discourse advocated by classroom discourse researchers
(Wells & Arauz, 2006), as shown in the episodes we provided in the results section.
Some limitations of this study should be noted, however. First, this study is essentially
a descriptive qualitative study based on a small sample size using a convenience sampling
approach (Patton, 1990). We did not first evaluate each facilitator’s questioning practice and
then choose them to be the subjects of the study. We described what types of questions
were asked under what contexts and provided illustrative examples. We did not assume
every question was useful or even necessary. Although we presented teachers’ evaluation
as one way to triangulate the findings on facilitators’ questioning practice, we acknowledge
the limited nature of such self-report data. It is possible that teachers did not improve their
reasoning skills even though they perceived that they did. Additional research is needed to determine to
what extent the findings of this exploratory study apply to other contexts.
Nonetheless, few studies on PBL facilitation have been conducted outside medical education. This study contributes to the field by revealing a variety of questioning strategies that had the potential to promote discussion among science teachers who were new to the PBL approach. In addition, the context of teacher professional development imposed tighter time constraints on implementing PBL than most medical education programs, in which medical students typically have an extended period (e.g., a semester) to study a PBL curriculum. In this study, teachers had to finish the entire PBL process, including problem identification, problem analysis, individual research, and problem solving, within three hours. In this sense, PBL facilitation in professional development is even more challenging than in typical medical settings. This study illustrates what is possible in PBL facilitation through questioning under tight time constraints.
Because we examined questioning only at an early stage of a PBL group, future research should investigate how facilitators gradually fade their scaffolding when
a group becomes more experienced with the PBL process. Future research should also
examine how group members acquire and internalize the questioning strategies used by
facilitators and develop their own questioning skills.
Acknowledgments
This material is based upon work supported in part by the National Science Foundation under special project number ESI-0353406 as part of the Teacher Professional Continuum program. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of any of the supporting institutions.
Note
1. The Circuits and Weather Map problems were created by the facilitators in the PBL Project for Teachers, 2006.
References
Barrows, H. S. (1988). The tutorial process. Springfield, IL: Southern Illinois University School of Medicine.
Barrows, H. S. (1996). Problem-based learning in medicine and beyond. In L. Wilkerson & W. H. Gijselaers (Eds.), Bringing problem-based learning to higher education: Theory and practice (New Directions for Teaching and Learning, No. 68, pp. 3-13). San Francisco: Jossey-Bass.
Borko, H., Jacobs, J., Eiteljorg, E., & Pittman, M. E. (2008). Video as a tool for fostering productive
discussions in mathematics professional development. Teaching and Teacher Education,
24(2), 417-436.
Boyd, M., & Rubin, D. (2006). How contingent questioning promotes extended student talk: A
function of display questions. Journal of Literacy Research, 38(2), 141-169.
Cazden, C. (1986). Classroom discourse. In M. C. Wittrock (Ed.), Handbook of research on teach-
ing (3rd ed., pp. 432-463). New York: Macmillan.
Chin, C. (2007). Teacher questioning in science classrooms: Approaches that stimulate produc-
tive thinking. Journal of Research in Science Teaching, 44(6), 815-843.
Dillon, J. T. (1994). Using discussion in classrooms. Buckingham: Open University Press.
Dolmans, D. H. J. M., Gijselaers, W. H., Moust, J. H. C., De Grave, W. S., Wolfhagen, I. H. A. P., & Van der Vleuten, C. P. M. (2002). Trends in research on the tutor in problem-based learning: Conclusions and implications for educational practice and research. Medical Teacher, 24(2), 173.
Dolmans, D. H. J. M., & Schmidt, H. G. (2006). What do we know about cognitive and moti-
vational effects of small group tutorials in problem-based learning? Advances in Health
Sciences Education, 11(4), 321-336.
Glenn, P. J., Koschmann, T., & Conlee, M. (1999). Theory presentation and assessment in a
problem-based learning group. Discourse Processes, 27(2), 119-133.
Graesser, A. C., & Person, N. K. (1994). Question asking during tutoring. American Educational
Research Journal, 31(1), 104-137.
Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educa-
tional Psychology Review, 16(3), 235-266.
Hmelo-Silver, C. E., & Barrows, H. S. (2006). Goals and strategies of a problem-based learning
facilitator. Interdisciplinary Journal of Problem-based Learning, 1(1), 21-39.
Hmelo-Silver, C. E., & Barrows, H. S. (2008). Facilitating collaborative knowledge building.
Cognition and Instruction, 26(1), 48-94.
Jordan, B., & Henderson, A. (1995). Interaction analysis: Foundations and practice. Journal of
the Learning Sciences, 4(1), 39-103.
Kaufman, D. M., & Holmes, D. B. (1998). The relationship of tutors' content expertise to interven-
tions and perceptions in a PBL medical curriculum. Medical Education, 32, 255-261.
Lemke, J. L. (1990). Talking science: Language, learning and values. Norwood, NJ: Ablex.
Mehan, H. (1979). Learning lessons: Social organization in the classroom. Cambridge, MA: Har-
vard University Press.
Meyer, D. Z., & Avery, L. M. (2009). Excel as a qualitative data analysis tool. Field Methods, 21(1),
91-112.
Mikeska, J., & Stanaway, J. (2006). How Things Move. The PBL Project for Teachers. East Lansing,
MI: Michigan State University.
Nystrand, M., Wu, L. L., Gamoran, A., Zeiser, S., & Long, D. A. (2003). Questions in time: Investigat-
ing the structure and dynamics of unfolding classroom discourse. Discourse Processes, 35(2),
135-198.
Palincsar, A. S. (1999). Applying a sociocultural lens to the work of a transition community.
Discourse Processes, 27(2), 161-171.
Patton, M. Q. (1990). Qualitative evaluation and research methods. Newbury Park, CA: Sage.
Roth, W.-M. (1996). Teacher questioning in an open-inquiry learning environment: Interac-
tions of context, content, and student responses. Journal of Research in Science Teaching,
33(7), 709-736.
Savery, J. R. (2006). Overview of problem-based learning: Definitions and distinctions. The Interdisciplinary Journal of Problem-based Learning, 1(1), 9-20.
Science and Mathematics Program Improvement (2006). The PBL Project for Teachers, Focus on Prac-
tice End-of-Session evaluation questionnaire. Kalamazoo, MI: Western Michigan University.
Sherin, M. G., & van Es, E. A. (2009). Effects of video club participation on teachers' professional
vision. Journal of Teacher Education, 60(1), 20-37.
Shulman, J. H. (1996). Tender feelings, hidden thoughts: Confronting bias, innocence and
racism through case discussion. In J. A. Colbert, P. Desberg, & K. Trimble (Eds.), The case
for education: Contemporary approaches for using case methods (pp. 137-158). Needham
Heights, MA: Allyn & Bacon.
van Zee, E. H., Iwasyk, M., Kurose, A., Simpson, D., & Wild, J. (2001). Student and teacher ques-
tioning during conversations about science. Journal of Research in Science Teaching,
38(2), 159-190.
van Zee, E. H., & Minstrell, J. (1997). Using questioning to guide student thinking. Journal of
the Learning Sciences, 6(2), 227-269.
Wells, G., & Arauz, R. M. (2006). Dialogue in the classroom. Journal of the Learning Sciences,
15(3), 379-428.
Zhang, M., Lundeberg, M., McConnell, T. J., Koehler, M. J., & Eberhardt, J. (2009, April). Strate-
gic facilitation in problem-based professional development to promote science teachers'
pedagogical learning. Paper presented at the annual meeting of American Educational
Research Association, San Diego, CA.
Meilan Zhang is Research Associate, Division of Science and Math Education, Michigan State University
Mary Lundeberg is Professor in the Department of Teacher Education, Michigan State University
Tom J. McConnell is Assistant Professor in the Biology Department, Ball State University
Matthew J. Koehler is Associate Professor in the Department of Educational Psychology and Educa-
tional Technology, Michigan State University
Jan Eberhardt is Assistant Director of the Division of Science and Math Education, Michigan State University
Appendix
Three teaching problems
Circuits and Pathways
Context: The principle of electricity was the focus for my group of 30 fourth grade
students in a public elementary school in Castro Valley, California during the month of
March. I began the unit with a questionnaire asking students, “Where in your house do
you find electricity? How do you use it? What might happen if your flashlight stops work-
ing?” I started by having the students learn about things that were more familiar to them
and then moved to more complex ideas. First, the students made posters of ways that
they use electricity in their lives. Then, students experimented with a variety of materials
and focused on one challenge: lighting a bulb using a battery, bulb, and wire. They also
used a battery, wires, and motor to make the drive shaft on the motor turn in a clockwise
and counterclockwise fashion. After that, they moved to learning about and constructing
series and parallel circuits. My goal was for students to come away with an understand-
ing of some of the basic principles of electricity, including how circuits work, how circuits
do not work, and something about the flow of current, as well as to have the experience of
designing and carrying out their own experiments.
Objective: Students will be able to construct a simple electric circuit that provides a
pathway so that energy can move between a source (battery) and an object (bulb and/or bell).
Students will be able to identify and describe how various types of electrical circuits (i.e., series
and parallel) provide a means of transferring and using electrical energy to produce light.
Teaching dilemma: I think that it’s important for students to take responsibility for their
own learning and to learn to think critically, to learn to question and to become excited about
learning and excited about what they see happening in the world. When they’re able to
have their hands on the materials and when they’re able to speak with one another, they’re
in control. After the students had an opportunity to create parallel and series circuits, they
noticed that the bulbs in the parallel circuit were brighter than the bulbs in the series circuit.
Asking the students to explain their thinking led to a variety of ideas for this observation.
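For reference, the standard physics behind this observation can be sketched briefly; this is an editorial aside, assuming an ideal battery of voltage V and two identical bulbs modeled as resistors of resistance R, which real classroom materials only approximate.

% Series: the bulbs share the battery voltage, so each carries current I = V/(2R).
P_{\text{series bulb}} = \left(\frac{V}{2R}\right)^{2} R = \frac{V^{2}}{4R}
% Parallel: each bulb sees the full battery voltage V.
P_{\text{parallel bulb}} = \frac{V^{2}}{R}
% Under these assumptions each parallel bulb dissipates four times the power of a
% series bulb, which is why the parallel bulbs appear brighter.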
Focus question: How might a teacher move his or her students from vague ideas to
a more scientific understanding?
Product: A recommendation for how this teacher might move her students to a more
scientific understanding of electricity.
How Things Move (Falling objects)
Context: As a first-grade teacher, I try to embed process skills into our science investigations as often as possible. Making observations and drawing conclusions are two essential components of my county's science curriculum. I felt that the students would not be
challenged by the county objectives that they be able to “give examples to demonstrate
that things fall to the ground unless something holds them up” and to “describe the dif-
ferent ways that objects move (e.g. straight, round and round, fast and slow)” (Curriculum
Framework, MCPS, 2001). I wanted them to go further than these basic objectives, to use
their abilities to make observations and draw conclusions to delve deeper by asking "why"
questions. I consulted the science specialist for ideas, and he suggested a classic experi-
ment: Drop a flat piece of paper and a book at the same time from the same height, and then drop a crumpled piece of paper and a book at the same time from the same height, and compare the difference in results. That seemed perfect.
The students were comfortable both working in small groups with hands-on materials
and sharing their thoughts and results in a whole-class setting, so I planned some of each
for the activity. I assumed that they would first observe the book hitting the ground before the flat piece of paper, and then the crumpled paper and the book hitting the ground at the same time. I could then ask them to explain the difference in results, and I was interested
to see how they would do. But the lesson did not work out quite as I planned.
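As an editorial aside on the outcome the teacher anticipated, the following is a minimal sketch assuming quadratic air drag, with air density \rho, drag coefficient C_d, frontal area A, and paper mass m; real classroom drops are, of course, messier.

% Drag force on a falling object and the resulting terminal speed:
F_{\text{drag}} = \tfrac{1}{2}\,\rho\, C_{d}\, A\, v^{2}, \qquad
v_{\text{terminal}} = \sqrt{\frac{2\, m\, g}{\rho\, C_{d}\, A}}
% A flat sheet pairs a small mass m with a large area A, so it reaches a low
% terminal speed almost immediately and drifts down well after the book lands.
% Crumpling keeps m the same but sharply reduces A, so drag stays far below the
% weight mg over a short drop and the paper accelerates at nearly g, landing at
% about the same time as the book.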
Objective: Students will be able to communicate findings from group observations and investigations to the class and the teacher and provide supporting evidence when forming conclusions. Students will be able to describe different ways that objects move
and give examples to demonstrate that things fall to the ground unless something holds
them up.
Teaching dilemma: On the first day, the students made predictions for what they thought would happen when you drop a book and a piece of paper at the same time from the same height. On the second day, the students used the materials to complete the first part of their investigation in small groups (book and flat piece of paper) and then discussed what happened. However, the students returned with a variety of results from the same investigation. Most surprisingly, it seemed as though the students were accepting these findings with little controversy. How can the students make predictions that seem very logical and reasonable one day and then return with observed phenomena that seem to completely refute their sensible ideas the next day? How did they not
question these observations? It seemed as though their sense-making was completely
cast aside as they shared their observations with the group.
Focus question: What interactions (student to student, student to teacher, and teacher
to student) might you set up to resolve discrepant data?
Product: A recommendation of how this teacher might facilitate the conversation
and learning experience to help students notice and resolve discrepant data.
Weather Map
Context: Eighth graders in the state of Michigan are required to take a state assess-
ment test for science and social studies every October based on state standards and
benchmarks. The meteorology section of the science test requires students to analyze
weather maps. I teach in a small rural school district, and one of the challenges we face is the influx of transient students. This was a group of students who found learning very challenging. As a result, I felt I needed to keep the frustration level low. To do this, I reviewed much of what had been taught in previous classes on a daily basis.
In this lesson, students were finishing a unit on meteorology. The two-day culminating
activity was designed to allow students an opportunity to apply what they had learned
by creating a weather map. Students used an information sheet of weather data from a
variety of U.S. cities and a printed satellite weather map from a popular cable weather
channel to complete a weather map.
Objective: Students, working as a team of meteorologists, will be able to create a national weather map, correctly placing weather station models at the appropriate cities, as well as warm and cold fronts, precipitation, and high- and low-pressure centers. They
will be able to forecast the weather for the twelve cities for the following day.
Teaching dilemma: I use many different resources when I teach a unit. For my meteorology unit, I got information from the National Oceanic and Atmospheric Administration, cable weather stations, teacher resources, and other Internet resources. Unfortunately, not all the information is exactly the same, so students are often expected to interpret the similarities among the different resources. I believe this encourages them to be more flexible and better problem solvers when trying to accomplish a task. Prior to starting work on this activity, the students were directed to work as a group on
their weather symbol diagrams. However, as I observed the students, they were interact-
ing only occasionally by talking quietly or sharing maps.
Focus question: How could this task have included structures that might have stimu-
lated more collaboration among students?
Product: A recommendation of how this teacher might structure this activity to
promote more collaboration among students.