Distance Education, Vol. 23, No. 1, 2002
Cracking the Resource Nut With Distributed
Problem-Based Learning in Secondary Teacher
Education
Constance A. Steinkuehler & Sharon J. Derry
University of Wisconsin-Madison, Wisconsin, USA
Cindy E. Hmelo-Silver
Rutgers University, New Brunswick, USA
Matt Delmarcelle
University of Wisconsin-Madison, Wisconsin, USA
ABSTRACT In this article, we focus on the features and functions of the STEP pbl system that enable
us to support novice tutors and thereby address the human resource challenge that implementing a PBL
course in a typical undergraduate setting poses. We describe the activities that students in our course
engage in and present preliminary findings from our first trial of the system. We then describe our
strategies for distributing the functions of the tutor based on the first trial and previous course
implementations. We conclude with a description of the research methodology we are using to shepherd
our site development efforts.
Implementing problem-based learning (PBL) in a traditional undergraduate setting is really a
problem of resources. At many universities, large undergraduate courses continue to be held
in vast lecture halls typically equipped with an elevated stage in front and desks bolted to an
inclined floor. In a good semester, there is one teaching assistant (TA) to roughly every 30
students. There are not enough rooms available during scheduled class time for uninterrupted
small-group work, not enough computers available to enable everyone to conduct research
simultaneously, and not enough time in the typical undergraduate’s schedule to allow groups
to meet frequently and conveniently outside of class. In larger courses, TAs with minimal
training often serve as tutors and, given the high turnover rate endemic to such positions, each
semester the instructor, who may also have limited experience with PBL, contends with the
prospect of starting over with a completely new staff. In smaller courses, a single roaming
instructor struggles to distribute his or her time and attention equally across all groups at once. If
PBL is to serve effectively as the primary vehicle for learning in such undergraduate settings,
the problem of resources must be addressed.
In the spring of 2000 we redesigned how preservice teachers in our program engage with
the Learning Sciences [1]. We took a traditional, lecture-style undergraduate course in
educational psychology and restructured it into a PBL experience (pbl [2]) with the expectation
that students who took our course would construct useful knowledge about the Learning
Sciences to guide their subsequent instructional decisions and design. Ongoing analysis of data
collected during implementation of the redesigned course indicates that our expectations about
what students might gain through such activities were well founded; however, the pervasive
resource challenges we, like others (Kirkwood, 1998), faced in accomplishing them raise the
issue of sustainability (Derry & STEP Project Group, 2000). Despite the success of our
curricular design, if we were to continue using pbl as the primary vehicle through which
preservice teachers engage in the content of the domain, then we had to develop a solution to
this pervasive problem of resource constraints.
Restructuring and distributing the pbl activities via the Web is a best-fit solution to the
resource problem. By putting pbl activities online (www.eSTEPweb.org), we are able to avert
the physical and temporal constraints that previously served as potential barriers to our goals
of continuing pbl instruction in our course and creating a viable national model for preservice
teacher education. We can now provide students ample space (albeit virtual) for collaborative
work, reduce the necessity to coordinate schedules, and give students greater freedom in
choosing where they work from and when (“anytime/anywhere” interaction, Benbunan-Fich &
Hiltz, 1999). Most importantly, however, our system enables us to address the human resource
challenge—How do you implement pbl without a full staff of experienced tutors to facilitate
students’ collaborative work? Our strategy: You distribute the tutor’s functions across the
system, the students and the staff. Table 1 outlines our approach.
In this article, we focus on the features and functions of the STEP pbl system which enable
us to support TAs in their work as pbl tutors, and thereby address the human resource
challenge that implementing this type of course poses. First, we outline the activities that
students in our course engage in and present preliminary findings from our first trial of the
system. We then describe some of our strategies for distributing the functions of the tutor
based on these data from the first trial, informal discussion with the tutor, and previous
implementations of our course. We conclude this paper with a description of the research
methodology we are using to shepherd our site development efforts.
Overview of Students’ Activities
Students in our course learn to apply the Learning Sciences to teaching through collaborative
problem solving that involves the study of videocases [3] of actual classroom instruction. Each
videocase presents the story of a particular piece of instruction, either model instruction to be
emulated and/or adapted (e.g., the successful use of cognitive modeling procedures) or popular
instruction in need of redesign (e.g., the proverbial “chalk and talk” techniques). The problem
that students face is to adapt or redesign the instruction based on Learning Sciences research.
In order to accomplish this, students conduct an individual preliminary analysis of the
videocase and then meet with their pbl group online to share and negotiate their ideas, generate
learning issues (Barrows, 1985), conduct research, and then reason through their preliminary
ideas in light of what they investigate. Once the group work is completed, each student
composes his or her own final solution proposal, compares it to an expert analysis [4], and then
reflects back on the products and processes so generated (see Fig. 1 for more detail). This
process is supported by the Knowledge Web (STEP Project Group, 2000), a richly interlinked
network of Learning Science concepts connected to each videocase (cf. Spiro et al., 1991), and
the Student Module, a series of interactive Web pages and tools that scaffold students through
the pbl process [5].

TABLE 1. Distribution of the tutor’s functions across modules within the STEP pbl system, students, and staff.
Each function below is shared among some combination of the online system (the Student Module
environment and tools, the discussion environment, and the Tutor Module), the tutoring TA, and the
students (individually and as a group):
- To guide students through an appropriate sequence of phases
- To insure adequate attention to each phase
- To serve a metacognitive function for the group
  - To encourage students to make their knowledge and reasoning public
  - To probe students’ knowledge and reasoning
- To monitor the interpersonal dynamics
  - To insure equitable participation and interaction
  - To promote reflection on group process
- To monitor the intrapersonal dynamics
  - To assist students’ self-directed study when necessary
  - To encourage reflection on process and products
- To make educational diagnoses

Note. This list of tutor’s functions is adapted from “Facilitating the group process,” a resource designed for PBL tutors by the Department of Medical
Education, Southern Illinois University (1997). These regulatory functions are modeled and then gradually relinquished to the students themselves
as their familiarity and competence with the pbl process increases.

FIG. 1. Outline of students’ activities in the STEP pbl system.
These activities are designed to develop students’ reasoning and problem-solving skills in
addition to content knowledge. Our goal is more than simply transmission of the latest findings
in the Learning Sciences; our expectation is that students in our course will develop an ability
to use current theory and research on cognition to guide instructional decisions and design. By
situating instructional design and decision making in the context of collaboration rather than
isolated independent practice, our activities afford preservice teachers the opportunity to
engage in sustained collaborative work of the type we want practicing teachers to engage in.
The problems in our pbl design reflect the unique nature of the teaching profession. We
incorporate both “ideal” and “not-so-ideal” instruction in our repertoire of
problems so that preservice teachers taking our course are exposed not only to model
instruction that illustrates ideals of reform—ideals that are not always well illustrated in the
schools where our students observe and teach—but also to the kind of instructional problems
they will likely face once they enter the field. In so doing, we aim to improve students’
abilities to analyze classroom instruction (whether it be sound or shaky) on the basis of
research and then to design instruction on the basis of such analyses.
Results From the First Trial
Two groups of five preservice teachers, enrolled in an educational psychology course at a
major Eastern university, volunteered to participate in our first trial of the STEP pbl system
at a distance. Students were presented with a redesign problem containing a videocase story
of a traditional, from-the-textbook, largely lecture-based instructional unit taught by a popular
teacher in a Midwestern public high school science class. Though the instructor had taught the
unit several times, the attending assessment materials indicated that students did not grasp the
content. The challenge that the preservice teachers faced was the following: “You are part of
an online professional development community of which the teacher in the videocase is a
member; he has asked your community for advice on how to improve his lesson. Advise him
on how to proceed and justify your group’s redesign proposal using Learning Science
concepts.” Students took approximately three weeks to complete the activities outlined in the
previous section. One of the authors with extensive experience in pbl facilitation served as
tutor for both groups.
Given that this was our first trial of the system beyond in-house user testing, our primary
interest was to obtain an overall picture of the feasibility of our system design and to solicit
suggestions from an expert tutor for functions and resources we might include in our system
to scaffold and enhance the tutoring process online. More in-depth analysis will be presented
elsewhere; here, we present students’ ratings of, and comments on, the system. Throughout the
remainder of the paper, we outline the strategies (including system features, tools, materials
and resources) we are developing to scaffold tutor performance, based on conversations with
the tutor, previous implementations of our course, and extensive online journal notes the tutor
kept throughout the trial.
Student ratings of various features of the system (see Fig. 2), collected at the end of the pbl
activities, were positive in regard to the nature of the activity and constructive in terms of the
technical design. At the time of the first trial, much of our system was in its early “prototype”
stage and we were in the process of dealing with various technical issues (i.e., insuring
consistent access under low-bandwidth connections, and integrating the security systems to
reduce the number of passwords necessary from three to one) that were not yet resolved.
Average group ratings ranged from 3.2 (with 3 “fair”) to 4.4 (with 4 “good” and 5 “excellent”)
on a Likert scale of 1–5, with no significant differences between groups. Combined with an
examination of what users actually did (and did not do) and comments made during and after
the trial, however, these data did suggest specific modifications that could be made to improve
overall usability; these changes are discussed below.
FIG. 2. Mean Likert-scale responses (from 1 minimum to 5 maximum) to the pbl system
feedback survey.
Distributing the Tutor’s Functions
In traditional PBL, tutors play an important role in determining what and how students learn
throughout their activities. As illustrated in Table 1, tutors are responsible for monitoring the
flow of each student’s activities, playing a metacognitive function for the group by probing
students’ knowledge and reasoning, monitoring both interpersonal (e.g., the distribution of
participation) and intrapersonal (e.g., the level of engagement of each individual student)
dynamics of each group, and making educational diagnoses in terms of both product
(knowledge) and process (critical thinking). Accomplishing these responsibilities is a challenge
for the most seasoned tutors; for new TAs with little training and no experience teaching in
a PBL context, it is a tall order indeed.
The STEP pbl system we are developing is designed to provide TAs with the assistance,
scaffolding and support necessary for successfully tutoring multiple online groups at once. Our
strategy for accomplishing this includes, on the one hand, partially distributing the tutor’s
responsibilities across the system and the students themselves and, on the other hand,
providing tutors with a set of online tools and resources that can scaffold their tutoring
performance, as well as a working environment that affords “in situ” use of the tools.
Guiding Students Through the Appropriate Sequence of Phases
In a face-to-face pbl setting, the tutor is responsible for regulating the sequence of all pbl
activities; in our online course, the three-session design of the Student Module enables
individual students to guide themselves through parts of the pbl activity. A sequence of
interactive Web pages steers each student through two of three sessions of activities: “Session
One: Individual Pre-Analysis,” in which students explicate their initial situation model (Derry,
1996) of the videocase using the “Individual Whiteboard” (see Fig. 1); and “Session Three:
Individual Final Analysis & Reflection,” in which students write their individual final solution
proposals to the problem, compare and contrast their arguments with an expert’s, and reflect
back on their work. The central session, “Session Two: Group Investigation,” is where the
tutor facilitates group work, guiding each group of students through a collaborative process
designed to help students use Learning Sciences concepts to construct a group situation model
and reconstruct their individual situation models (from Session One), based on what they
discover through investigation and online discussion with their peers.
A bank of online tools and resources scaffolds tutors as they, in turn, scaffold the students
through the collaborative process. This bank of online resources outlines a suggested sequence
of group activities based on what has worked in the past; each activity listed is linked to
additional information regarding the purpose of the given activity, an elaboration of what the
activity entails, and tips for when (and when not) to step in. Accessed via the Tutor Module—a
“digital dashboard” of sorts through which the tutor accesses and interacts with the students
and the system—this bank of resources provides novice tutors with practical strategies for how
to guide students’ collaborative work “from the side” without being intrusive, a challenge for
TAs who are accustomed to a more directive instructional role.
Previously, the outline of students’ collaborative activities was available only to tutors
because we wanted to avoid over-proceduralizing the group process. During the first trial,
however, both student and tutor comments indicated that more explicit explanations were
required. Students expressed confusion about the activities and their purpose. For example, one
student expressed confusion regarding the purpose of starting with their own ideas rather than
searching for the “right” answer in the research: “I don’t feel it’s right to post something on
the whiteboard until we get some really core research done … until then, I can’t say anything
but my opinion” (Karen, Group 2). In the words of the tutor, greater structure was called for:
I am still frustrated with the parallel play aspect of the activity [Group 2’s tendency to work
independently in parallel rather than collaborate]. I think that the first few times students do a
problem like this they will need a lot of structure in the task, in terms of milestones and
required numbers of notes … (Tutor, Online Tutor Journal)
In response, we are now creating more concrete Session Two explanations and suggestions for
the students that will mirror the descriptions provided to the tutor in the Tutor Module and are
developing a system that will enable tutors to set explicit posting requirements, expected group
milestones, and deadlines for accomplishing them. Once these criteria are set by the tutor, the
information will be relayed automatically to the students via the Student Module interface.
Although this may seem paradoxical since pbl is a student-centered approach, we expect that
communicating the structure and expectations more clearly to students in the beginning will
afford them greater autonomy in the long run; once students become familiar with the general
procedures, they can make their own informed decisions on how to structure their group
activities and these scaffolds can be faded.
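As a rough illustration of the milestone-setting feature described above, the sketch below shows one way tutor-defined posting requirements and deadlines could be represented and rendered as reminders for the Student Module interface. The field names, dates, and the rendering helper are our own illustrative assumptions, not the actual STEP pbl implementation.

```python
# Sketch only: hypothetical field names for tutor-set milestones, posting
# requirements, and deadlines, plus a helper that renders them as the kind of
# plain-language reminders a Student Module interface might display.
from dataclasses import dataclass
from datetime import date


@dataclass
class Milestone:
    description: str   # e.g., "Post initial solution ideas to the discussion board"
    min_postings: int  # explicit posting requirement for each student
    deadline: date     # date by which the milestone should be met


def milestones_for_students(milestones):
    """Render tutor-set milestones as reminders relayed to students."""
    return [
        f"By {m.deadline:%b %d}: {m.description} (at least {m.min_postings} posting(s) per student)"
        for m in milestones
    ]


plan = [
    Milestone("Share and discuss your individual pre-analysis ideas", 2, date(2002, 3, 4)),
    Milestone("Agree on learning issues and divide up the research", 1, date(2002, 3, 8)),
    Milestone("Post consensus solution ideas to the Group Whiteboard", 1, date(2002, 3, 13)),
]
for line in milestones_for_students(plan):
    print(line)
```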
Insuring Adequate Attention to Each Phase
The three-session design of the pbl activity does more than structure students through
activities; it also insures that students pay adequate attention to the preliminary and follow-up
phases contained therein. By dynamically tracing what each student has completed and then
providing access to the next activity or session in the series based on this trace, the Student
Module monitors students’ progress by limiting forward access based on adequate completion
of prior tasks. By placing students’ collaborative work between two other sessions of
individual activities, we are able to ensure both adequate preparation prior to discussion and
adequate reflection and follow-up by individuals after the collaboration has occurred. Session
One, which precedes discussion, orients students to the kinds of “mindset” that pbl activities
require, familiarizes them with the overall learning objectives and how each activity helps
them meet those objectives, and engages them in thinking deeply about the problem before
discussing it with their peers. Session Three, completed after the discussion, ensures adequate
follow-up on what was learned individually, providing each student with the opportunity to
articulate his or her own individual final solution to the problem—a key product for assessment
purposes, since teacher certification is based on individual, not group, performance—and then
reflect back on how his or her initial ideas changed by investigating Learning Science concepts
and discussing their importance for instructional design.
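The progress-gating idea described above is simple enough to sketch: the module records which activities a student has completed and unlocks the next session only when the prior one is done. The session names and the completion store below are illustrative assumptions, not the system's actual code.

```python
# Sketch only: the Student Module's forward-access gating, with assumed session names.
SESSION_ORDER = [
    "session_1_individual_preanalysis",
    "session_2_group_investigation",
    "session_3_final_analysis_and_reflection",
]


def next_accessible_session(completed):
    """Return the first session not yet completed; later sessions stay locked
    until every earlier one has been adequately finished."""
    for session in SESSION_ORDER:
        if session not in completed:
            return session
    return None  # all sessions finished


# A student who has finished only the individual pre-analysis:
print(next_accessible_session({"session_1_individual_preanalysis"}))
# -> session_2_group_investigation
```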
Placing the students’ collaborative work between two individual sessions enables us to
distribute the scaffolding functions for which tutors are traditionally responsible, not just over
the system, but over the students as well. Adequately preparing individual group members
prior to group discussions enables them to monitor their own progress, in terms of both content
and process, in collaboration with the tutor once the group work begins. By orienting each
member to the mindset and basic structure of pbl, groups can take responsibility for their own
facilitation to some extent, given preliminary assistance from the system (Session One) and a
monitoring TA (overseeing several groups). The system modifications described above should
facilitate this process; by clearly communicating the specific milestones that each group is
expected to accomplish, and their deadlines, students can be held accountable for their own
progress.
Encouraging Students to Make Their Knowledge and Reasoning Public
During their collaborative work, students make their knowledge and reasoning public through
a combination of an asynchronous discussion environment and strategically designed online
“Group Whiteboard,” which structures the group product. Group members share and negotiate
their pre-analysis ideas on the online threaded discussion board and then post their consensual
results to the Group Whiteboard (see Fig. 1). Assuming that students comply with the pbl
injunction that “silence is assent,” the text-based nature of these two collaborative spaces
translates each student’s reasoning about the discussed issues into a public document.
By moving the group’s negotiation from a synchronous face-to-face environment to an
asynchronous online one, we are able to transform the discussion from a temporal unfolding
of talk to a cascade of inscriptions that, quite literally, “artifacts” the developmental trajectory
of each discussion topic over time (cf. Bailey & Luetkehans, 1998). Threaded discussions offer
distinct advantages over synchronous ones, fostering more serious and lengthy interactions
(Bonk et al., 1998), more reflective responses (Davidson-Shivers et al., 2000), increased group
interaction (Eastmond, 1992), and more equitable communication patterns (Harasim, 1990).
Comparisons with face-to-face discussion show no differences in terms of relational communication
(Walther & Burgoon, 1992), group cohesiveness, or quality of group products (Burke &
Chidambaram, 1995). In fact, asynchronous discussion has been shown actually to enhance the
quantity and quality of the solutions in case-based instruction (Benbunan-Fich & Hiltz, 1999).
This is not to say, however, that threaded discussion is a panacea for collaborative work.
Used alone, the hierarchical organization inherent in such tools can obscure the main thrust
and development of the group’s reasoning rather than elucidate it, making consensus difficult
to reach (Hiltz et al., 1986). Because new posts are added sequentially over time, the content
of each thread can become more and more diffuse, leading to “a sense of information overload
and confusion about the intellectual focus of the community” (Hewitt, 1997, p. 2). In order to
prevent this outcome, we combine such discussion with a shared workspace for recording the
group’s consensus argument, the Group Whiteboard.
In essence, the Group Whiteboard is a more elaborated version of the two columns of the
Individual Whiteboard that students completed individually during Session One (i.e., “what
should be done” and “why it should be done”). Using this shared tool, each group records their
consensus solution ideas and how the results of their pooled research into the Learning
Sciences bear on each idea. By providing the group space in which to cite both con rming and
discon rming evidence (for a classic discussion of “con rmation bias” see Wason & Johnson-
Laird, 1972) as well as the source of their claims, this tool enables students literally to see how
and where the Learning Science concepts they investigate bear on their solution ideas. In this
manner, the Group Whiteboard makes the group thinking visible for members and the tutor
alike.
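A minimal sketch of the kind of record a Group Whiteboard entry might capture, following the description above: a consensus solution idea, its rationale, and the confirming and disconfirming evidence the group cites, with sources. The field names and example content are assumptions for illustration, not the system's actual data model.

```python
# Sketch only: assumed field names for a Group Whiteboard entry, mirroring the two
# columns of the Individual Whiteboard plus cited confirming/disconfirming evidence.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Evidence:
    summary: str    # what the cited research says
    source: str     # where the claim comes from (e.g., a Knowledge Web page)
    supports: bool  # True = confirming, False = disconfirming


@dataclass
class WhiteboardEntry:
    what_should_be_done: str    # the group's consensus solution idea
    why_it_should_be_done: str  # the rationale, in Learning Science terms
    evidence: List[Evidence] = field(default_factory=list)


entry = WhiteboardEntry(
    what_should_be_done="Open the unit by eliciting students' ideas about static electricity",
    why_it_should_be_done="Connecting new material to prior knowledge supports schema revision",
    evidence=[
        Evidence("Activating prior knowledge aids retention of new material",
                 "Knowledge Web: prior knowledge", supports=True),
        Evidence("Eliciting misconceptions without addressing them can reinforce them",
                 "Knowledge Web: misconceptions", supports=False),
    ],
)
print(entry.what_should_be_done, "->", len(entry.evidence), "pieces of evidence cited")
```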
Based on the first trial, however, we cannot yet determine whether this combination of
threaded discussion and online Group Whiteboard has the cognitive affordances (Gibson,
1979, p. 127) we, in theory, predict. Unanticipated complications arose during students’
Session Two activities that resulted from simple design flaws. Students didn’t know which tool
was for which purpose: “Are we supposed to be talking here and putting our research there
or the other way around? I am sorry but I am quite confused” (Karen, Group 2). As a result,
they did not use the two spaces as we intended; rather than deliberating in the discussion space
and then posting the results to the group product, participants treated both spaces as identical,
conducting discussions and posting proposals in both. As a result, the two spaces coalesced in
unproductive ways—on the one hand, bifurcating topical conversations as they developed; on
the other, eliminating the ability for groups to develop a distinct product representing the fruits
of their labor for all to see.
This amalgamation of the two collaborative spaces appears to have resulted from a basic
design flaw: the Session Two interface, as designed and implemented during the first
trial, made the Group Whiteboard more salient than the discussion space, with the former
embedded within the interface and the latter located in an external, second window. Our
directions did not distinguish between the purposes of the two spaces sufficiently and students
simply used the first space they encountered to collaborate as instructed. Toward solving this,
we have now built a new threaded discussion tool, integrated directly into the Student Module
interface, that better coordinates with the Group Whiteboard, and we have clarified our
directions for using both tools. We suspend judgment on this design until our next trial.
Probing Students’ Knowledge and Reasoning
Video data collected from the spring 2000 implementation of our course indicates that, in
face-to-face pbl, probing students’ knowledge and reasoning “on the fly” places considerable
burden on a novice tutor (Derry et al., 2001). In the online environment, our combination of
asynchronous discussion and Group Whiteboard plays a critical role in scaffolding tutors in
serving a metacognitive function for each group and distributing such regulatory processes
across the group members as well. The discussion space transforms student deliberations into
a cascade of inscriptions that can be perused, reviewed, and considered in context, enabling
both the tutor and the students to take a more reflective stance toward each individual posting
and the trajectory of the group work as a whole. In addition, the Group Whiteboard enables
students literally to see how and where the Learning Science concepts they investigate bear on
their solution ideas, thereby increasing their own metacognitive awareness of what they are
learning and whether/how it prompts revision of their initial beliefs about teaching, learning
and instructional design.
In metacognitive terms, the Group Whiteboard is vital; it structures not only the group
product but the group process as well (cf., Suthers, 2001). First, it makes the constituent
elements of the group argument (i.e., claims, pros and cons, evidence) salient and therefore
more likely to be attended to, negotiated and elaborated upon. Second, it makes the
relationships between these elements explicit, providing a framework within which group
members can negotiate the import of the results of their investigation. Finally, it makes the
gaps or absences within the argument conspicuous, hence a topic for discussion in their own
right. Individuals within the group must organize their activities via reference to the group
product; as a result, both individual and group activities get coordinated by the Group
Whiteboard structure we carefully designed. In this manner, the Group Whiteboard fosters
metacognition; it makes the line of reasoning inherent in the group argument explicit, hence
a topic for consideration.
Are these structural features sufficient? Probably not. Pbl, in its best moments, is a form of
cognitive apprenticeship (Collins et al., 1989) in which the tutor explicitly displays the
otherwise tacit cognitive strategies used by experts in the domain. As Hmelo and Guzdial
(1996) argue, one of the key roles of the tutor is to model the thought processes and kinds of
questions in which students should engage. Questioning strategies are first demonstrated by the
tutor and then progressively faded as students internalize and use them on their own. How,
then, does our online system support tutors in this process?
Our strategy is to provide model questions to the tutors. The bank of tutoring resources
described earlier provides example expert “conversational moves” that the tutor can use to
probe students’ knowledge and reasoning. These materials provide tutors tangible ways to
guide the group discussion in conjunction with information about the purpose of each so that
they themselves can decide which “moves” to make and when. We found that many of the
postings the expert tutor made to both Group 1 and Group 2 during the first trial were similar
if not identical. Based on these turns, we’ve created example postings for each activity phase.
Conversational moves such as “Lots of good ideas in here—how do you think that he might
get at students’ prior knowledge of static electricity? What do you mean by information
overload?” or “That’s an interesting idea. Why do you think that? Is there any evidence that
supports it?” or “Does everybody agree with this definition of constructivism?” can be copied
and pasted into the discussion board or edited at will. Eventually group members internalize
these conversational strategies and the tutor’s scaffolding can be reduced, but by giving tutors
a set of explicit example expert questions that have been productive for group thinking in the
past, we hope to provide support that less experienced staff members can lean on.
Insuring Equitable Participation and Interaction
Technologies “do not simply cross space and time; they also can cross hierarchical and
departmental barriers” (Sproull & Kiesler, 1991, p. ix). For this to happen in collaborative
settings, however, you must insure equitable participation and interaction among all group
members. Orchestrating each online asynchronous discussion in order to insure all voices are
heard is difficult; tutors in our course monitor several groups simultaneously, thus making
accurate diagnosis of the patterns of interaction difficult. In order to assist, we are developing
a diagnostic tool accessible via the Tutor Module: an “Interaction Matrix.”
An Interaction Matrix representing the distribution of discussion board postings within each
group provides the tutor with a snapshot of the level of engagement (i.e., who has/has not
posted, how many postings have been made, and by whom) and its “center of gravity” (for a
simple overview of this method, see Wortham, 1999). Our current thinking is that each
participant in the discussion will be represented as a vector containing the number of replies
he or she has made to every other participant, yielding a matrix representation of the
interaction occurring within the group. Unequal interaction, such as one person’s postings
receiving the majority of the responses, is designated by higher numbers within the matrix.
Information gleaned from the student profile—biographical information each student enters at
first login such as major, gender, year in school, native language, etc.—is used to highlight
potential sources of within-group status differences, enabling the tutor to see whether the
distribution of talk divides along status lines. Using this tool, the tutor can better monitor the
interpersonal dynamics within the group to insure group responsibility and equal participation
of all members and, when necessary, to promote reflection on group process when issues arise.
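The Interaction Matrix itself is straightforward to sketch: each participant becomes a row vector counting the replies he or she has made to every other participant. The post representation below (author, replied-to author) and the member names are illustrative assumptions rather than the system's actual data model.

```python
# Sketch only: an Interaction Matrix built from (author, replied_to) pairs drawn from
# one group's discussion board; matrix[a][b] counts the replies a has made to b.
def interaction_matrix(posts, members):
    matrix = {a: {b: 0 for b in members} for a in members}
    for author, replied_to in posts:
        if author in matrix and replied_to in matrix[author]:
            matrix[author][replied_to] += 1
    return matrix


members = ["student_1", "student_2", "student_3", "student_4", "student_5"]
posts = [
    ("student_1", None),          # a top-level post credits no one with a reply
    ("student_2", "student_1"),
    ("student_3", "student_1"),
    ("student_1", "student_2"),
    ("student_4", "student_1"),
]
m = interaction_matrix(posts, members)

# One person's postings drawing most of the responses shows up as a "hot" column:
replies_received = {b: sum(m[a][b] for a in members) for b in members}
print(replies_received)  # student_1 receives 3 of the 4 replies
```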
Promoting Reflection on Collaborative Learning and Group Process
The sole function of the third and final session of the Student Module is to help students come
to understand how their own arguments about teaching, learning, and instructional redesign have
been revised as a result of their group work. During this session, students are asked to reflect
on the products and processes resulting from their online collaboration, including the extent to
which the discussion led them to elaborate and revise their initial ideas. But while our original
design included a tool for helping individuals reflect on their learning from the group, it did
not include a group reflection tool that might guide students through joint reflection on the
group’s communal work.
Our current revisions are correcting this: Examination of students’ activities during the first
trial revealed that an excellent opportunity for learning had been missed. Peer evaluation of the
group collaborative process is critical if students are to improve their own collaborative and
negotiation skills over time. Toward this end, we are now developing a group feedback form
that both group members and the tutor can complete at the end of their collaboration and then
post to the discussion board. Using this form, each individual will be able to provide
constructive comments on their fellow group members, the tutor, and their own contributions
to the group work.
Assisting Students’ Self-Directed Study
The Student Module provides students with access to Learning Science content materials
contained in the Knowledge Web (STEP Project Group, 2000), a network of densely
interlinked concept pages connected to each videocase. Findings from the first trial indicate that
students who made use of this resource found it extremely useful; however, we underestimated
the learning curve required to acclimate to the site’s navigational complexity.
Some students simply struggled, feeling overwhelmed and lost in space:
Next time, I would use the Knowledge Web more. I had trouble using it and … got frustrated
and reverted to other sources. I would use it a lot more as other people in my group found it
very useful. (Connie, Group 2)
Although such problems are partly an issue of improving Knowledge Web navigation, tutors
must be able to aid students who are struggling with their self-directed investigation. In the
STEP pbl system, a “Use of Resources” report tool in the Tutor Module provides the tutor a
quick snapshot of how group members are faring in their online research. Our system’s ability
to provide a “trace” of each student’s online activities is now being extended into the
Knowledge Web. By recording the pages each student accesses in sequence and then
preprocessing this list in terms of each page’s “number of links” distance from the assigned
videocase, we can provide the tutor access to reports on whether group members successfully
accessed content materials, their strategies for research, the depth and breadth of the group’s
investigation, and whether critical concepts for the case or specific resource suggestions were
found. Such information can provide the tutor, should they need it, a general sense of how
students are faring during their investigation through an admittedly complex hypertext space.
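The “number of links” preprocessing amounts to a shortest-path computation over the Knowledge Web's link structure, anchored at the page attached to the assigned videocase. The sketch below uses a hypothetical graph and browsing trace to show the idea; page names are invented for illustration.

```python
# Sketch only: shortest link-distance from the assigned videocase's page to every
# Knowledge Web page a student visited; the graph and trace here are hypothetical.
from collections import deque


def link_distances(links, start):
    """Breadth-first search over a {page: [linked pages]} graph; returns {page: distance}."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in dist:
                dist[nxt] = dist[page] + 1
                queue.append(nxt)
    return dist


knowledge_web = {
    "videocase_overview": ["prior_knowledge", "cognitive_load"],
    "prior_knowledge": ["schema_theory"],
    "cognitive_load": ["working_memory"],
}
student_trace = ["videocase_overview", "cognitive_load", "working_memory", "schema_theory"]

dist = link_distances(knowledge_web, "videocase_overview")
# Annotate the trace with each page's distance from the videocase:
print([(page, dist.get(page)) for page in student_trace])
# [('videocase_overview', 0), ('cognitive_load', 1), ('working_memory', 2), ('schema_theory', 2)]
```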
Making Educational Diagnoses
Finally, and most importantly, we are designing a system that enables the instructional staff
(tutors and the supervising course instructor) to assess both individual and group conceptual
change. If we take student learning via online collaboration seriously, then to some extent we
must measure the degree to which it prompts individual conceptual change. If students
complete their group work with the same understanding of cognition and instruction that they
originally brought to the activity, then the collaborative learning activity, on some level, failed;
we may have engaged students in joint problem solving, but we failed to engage them in
significant belief revision, which (as we’ve hopefully made clear by now) is one of our
course’s primary goals. In order to measure, then, the extent to which the collaborative
activities are productive learning mechanisms as we intend, we must capture the evolution of
individual cognition over time (Derry & Hmelo, 2000).
The Group Whiteboard documents the evolution of the group’s shared thinking by making
visible (to students as well as the tutor) change in the group’s argument over time; the real
challenge, however, is documenting the cognitive development of each participating member.
Our strategic three-session design enables us to document such change. Over the course of the
three sessions of activities, students generate a cascade of inscriptions that capture their current
thinking about the problem, the Learning Science concepts, and the relationships among them.
This individual trace includes: (a) the Individual Whiteboard completed during Session One;
(b) the individual’s contributions to the discussion board and Group Whiteboard completed
during Session Two; and (c) their final solution proposal completed during Session Three.
Together, these inscriptions provide a trace of the evolution of the individual’s thinking over
the course of his or her pbl activities. Examination of this trace allows staff, researchers and
students themselves to assess what conceptual change took place at the individual level as a
result of the discussion. We are currently considering methods such as latent semantic analysis
(Landauer et al., 1998) for reducing and processing these data for easier interpretation.
Whether individual cognitive development occurs, and of what nature, is an important measure
of the potency of the given online collaboration as a learning activity in itself.
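As a hedged illustration of how latent semantic analysis might be applied to such a trace, the sketch below projects three invented stand-in texts (pre-analysis, group contributions, final proposal) into a low-dimensional semantic space and compares them pairwise. It is a sketch of the general technique, not the authors' analysis pipeline, and the texts are fabricated for illustration only.

```python
# Sketch only: invented stand-in texts for the three inscriptions in one student's
# trace, projected into a 2-dimensional semantic space with TF-IDF + truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

trace = [
    # Session One pre-analysis (hypothetical)
    "Students seem bored; the teacher should add more fun activities and demonstrations.",
    # Session Two contributions to the discussion board and Group Whiteboard (hypothetical)
    "The group discussed prior knowledge, cognitive load, and anchoring instruction in problems.",
    # Session Three final solution proposal (hypothetical)
    "Redesign the unit around a driving problem that elicits prior knowledge and reduces cognitive load.",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(trace)
semantic = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Higher similarity between the final proposal and the Session Two text, relative to the
# Session One text, would suggest the student's thinking moved toward the concepts the
# group investigated.
print(cosine_similarity(semantic))
```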
Accumulating and Distributing Wisdom and Practice
One of the beauties of online systems is their capacity to generate an ever-thickening history
of use. In this article, we have tried to show how preliminary analysis of the data from our
first trial of the system has helped us identify tutor needs and implement a suite of online
materials and tools to scaffold and assist novice tutors. With every such trial on our system
we gain one more layer of description: What the students and tutors did and whether it was
successful, the kinds of problems they encountered and the ways they moved beyond those
challenges, unanticipated issues that arose … and this does not include the yet-to-be-harvested
data from the  rst trial that we have not explored. For better or worse, our monitoring and
assessment systems seem to database everything, from Jane Doe’s connection speed to whether
Group Q took Labor Day weekend off. The trick is putting all these data in the service of
future research and practice.
Our site development strategy is very simple: Accumulate wisdom and practical skill
through repeated trials and then distribute it across resources, tools, and artifacts. The online
resources we are building are artifacts of our own accumulated wisdom; they provide
“newcomers” (Lave, 1991), whether they are students or staff, with access to the experiential
knowledge our team has developed over time. This article is full of examples, yet there are
others we simply haven’t the space to detail: Worked examples of each “product” for the
students, a register of frequent misconceptions students bring to each problem and the common
difficulties students have had while working with various cases in the past, and a collection
of problem-specific research suggestions that tutors can share with students who get stuck.
These resources represent the accretion of practical knowledge and skill over time; using the
rich “trace” of student and staff activities that our database generates in combination with tools
such as the online “Tutor Journal” where tutors record their observations on each group, we
transform system use into system knowledge, thereby sharing experiences and growing
wisdom with future staff. By providing a way for each tutor to archive their current
experiences, reactions, and suggestions, we are able to continue building tutor wisdom into the
system and provide an ever-thickening history of each problem for future tutors to consult.
I am glad to have the journal linked here. I hope that my armchair quick view of the students
is helpful and provides some context for understanding these folks as individuals situated in my
pbl ed. psych. class. (Tutor, Online Tutor Journal)
Concluding Comments
Perhaps this paper might better be entitled “problem-based designing in secondary teacher
education.” Our work on the STEP pbl system is motivated by a real-world problem we have
encountered in our research: How do you implement a pbl-based course in an undergraduate
setting with limited tutor training capabilities and a meager instructional staff? Our strategy,
broadly stated, is to carefully design a set of resources and tools that enable us to distribute
the complex cognitive and pedagogical processing that tutoring pbl requires to the system, the
course staff, and the students (both individually and in groups). Working out precisely how to
accomplish this will require repeated design cycles of prototyping, testing, and revising our
initial ideas. Throughout this process, there will be glitches, snags and (technological) hurdles,
as this article demonstrates, yet the objective that motivates such trials and tribulations—developing
preservice teachers’ abilities to use current Learning Science research to guide instructional
decisions and design—is already partially realized.
One of my students accidentally showed up at class today—a student who had great difficulty
getting on. He stayed up most of the night finishing up and said that he really liked doing this
online—he said that it is one of the hardest things he’s ever done but one of the best, that he
felt like he learned an awful lot, despite assorted technical problems (like losing what he had
been working on in the whiteboards when he moved to the knowledge web). So for what this
is worth, at least one student who had a very difficult time figuring out what to do found this
a really worthwhile experience. (Tutor, Online Tutor Journal)
Acknowledgements
This work was supported by the National Institute for Science Education, a cooperative
agreement between the National Science Foundation and the University of Wisconsin-
Madison, and by a grant from the Joyce Foundation. Sharon J. Derry is principal investigator.
The ideas expressed herein are not endorsed by and may not be representative of positions
endorsed by the sponsoring agencies. The authors are grateful to David Woods, Chris
Fassnacht and Marcelle Siegel for their helpful comments on the design of our system and
useful feedback on earlier versions of this paper.
NOTES
[1] We use the term “Learning Sciences” to refer to all fields of systematic, empirical study
of cognition and education, including work conducted from the full range of theoretical
perspectives from Symbolic Processing Theory to Situated Cognition and Sociocultural
Theory.
[2] We use “PBL” to refer to the problem-based learning technique originally designed by
Barrows (1985); we use “pbl” (all lowercase letters) to refer to the modified version of
problem-based learning used by STEP. We maintain this distinction throughout our work
in order to acknowledge the fact that we employ online asynchronous discussions while
Cameron, Barrows, and Crooks (1999) specify that such discussions should always occur
synchronously. Our use of asynchronous rather than synchronous environments was a
deliberate design decision; the rationale behind this decision is discussed later in the paper.
[3] Videocases are the centerpiece of the STEP system. Each videocase includes not only the
video footage itself but also a written transcript of its contents and a collection of
supplementary “Inquiry Materials” that provide a richer picture of its context (e.g.,
demographic information, interviews with the teacher, examples of student work, item
analysis of assessments).
[4] Our system will eventually incorporate several different “expert analyses” for each
videocase. We feel that multiple expert analyses would better represent the range of
theoretical perspectives one might productively take in thinking about cognition and
instruction. There is no single correct solution to any of the problems we use in our course;
our bank of expert analyses will reflect this.
[5] Students’ pbl activities vary depending on problem type (i.e., whether they are redesigning
or adapting the instruction depicted in the videocase); therefore the interactive Web pages
and tools that scaffold students through them vary correspondingly. For ease of presen-
tation, however, we focus this discussion on the system that supports students’ redesign
activities.
REFERENCES
Bailey, M. L., & Luetkehans, L. (1998). Ten great tips for facilitating virtual learning teams.
Paper presented at the Fourteenth Annual Conference on Distance Teaching & Learning,
Madison, WI.
Barrows, H. S. (1985). How to design a problem-based curriculum for the preclinical years.
New York: Springer.
Benbunan-Fich, R., & Hiltz, S. R. (1999). Impacts of asynchronous learning networks on
individual and group problem solving: A field experiment. Group Decision and Negotiation,
8(5), 409–426.
Bonk, C. J., Hansen, E. J., Grabner-Hagen, M. M., Lazar, S. A., & Mirabelli, C. (1998). Time
to “connect”: Synchronous and asynchronous dialogue among preservice teachers. In
Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and
discourse (pp. 289–314). Mahwah, NJ: Lawrence Erlbaum Associates.
Burke, K., & Chidambaram, L. (1995). Developmental differences between distributed and
face-to-face groups in electronically supported meeting environments: An exploratory
investigation. Group Decision and Negotiation, 4(3), 213–233.
Cameron, T., Barrows, H. S., & Crooks, S. M. (1999). Distributed problem-based learning at
Southern Illinois University School of Medicine. In C. M. Hoadley & J. Roschelle (Eds.),
Proceedings of the Computer Support for Collaborative Learning (CSCL) 1999 Conference
(pp. 86–93). Mahwah, NJ: Lawrence Erlbaum Associates.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the
crafts of reading, writing, and mathematics. In Knowing, learning, and instruction: Essays
in honor of Robert Glaser (pp. 453–494). Hillsdale, NJ: Lawrence Erlbaum Associates.
Davidson-Shivers, G., Tanner, E., & Muilenburg, L. (2000, April). Online discussion: How do
students participate? Paper presented at the Annual Meeting of the American Educational
Research Association, New Orleans, LA.
Derry, S. J. (1996). Cognitive schema theory in the constructivist debate. Educational
Psychologist, 31(3/4), 164–174.
Derry, S. J., & Hmelo, C. E. (2000). Video cases online: Cognitive studies of preservice
teacher learning. Proposal to the National Science Foundation’s Research on Learning and
Education (ROLE) Program.
Derry, S. J., Seymour, J., Feltovich, P., & Fassnacht, C. (2001, March). Tutoring and
knowledge construction during problem-based learning: An interaction analysis. Paper
presented at the Annual Conference of the National Association for Research in Science
Teaching, St. Louis, MO.
Derry, S. J., & STEP Project Group.* (2000, April). Reconceptualizing professional develop-
ment: Collaborative video projects on the world-wide web. Paper presented at the Annual
Meeting of the American Educational Research Association, New Orleans, LA. (*Kim, J.
B., Steinkuehler, C. A., Siegel, M., Street, J. P., Canty, N. G., Fassnacht, C., Hewson, K.,
Hmelo, C. E., & Spiro, R.)
Eastmond, D. V. (1992). Adult distance study through computer conferencing. Distance
Education, 15(1), 128–152.
Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Harasim, L. (1990). Online education: An environment for collaboration and intellectual
amplification. In Online education: Perspectives on a new environment (pp. 39–64). New
York: Praeger.
Hewitt, J. (1997, October). Beyond threaded discourse. Paper presented at the Annual WebNet
Conference, Toronto.
Hiltz, S. R. (1986). The “virtual classroom”: Using computer-mediated communication for
university teaching. Journal of Communication, 36(2), 96–104.
Hiltz, S. R., Johnson, K., & Turoff, M. (1986). Experiments in group decision making:
Communication process and outcome in face-to-face versus computerized conferences.
Human Communication Research, 13(2), 225–252.
Hmelo, C. E., & Guzdial, M. (1996). Of black and glass boxes: Scaffolding for learning and
doing. In Proceedings of the First International Conference of the Learning Sciences
(pp. 128–134). Charlottesville, VA: AACE.
Kirkwood, A. (1998). New media mania: Can information and communication technologies
enhance the quality of open and distance learning? Distance Education,19(2), 228–241.
Landauer, T. K., Foltz, P. W., & Laham, D. (1998). Introduction to latent semantic analysis.
Discourse Processes, 25, 259–284.
Lave, J. (1991). Situating learning in communities of practice. In Perspectives on socially shared cognition (pp. 63–82). Washington, DC:
American Psychological Association.
Spiro, R. J., Feltovich, P. J., Jacobson, M. J., & Coulson, R. L. (1991). Cognitive flexibility,
constructivism, and hypertext: Random access instruction for advanced knowledge acqui-
sition in ill-structured domains. Educational Technology, 31(5), 24–33.
Sproull, L., & Kiesler, S. (1991). Connections: New ways of working in the networked
organization. Cambridge, MA: MIT Press.
STEP Project Group.* (2000). Promoting teachers’ flexible use of the learning sciences
through case-based problem solving on the WWW: A theoretical design approach. In
Proceedings of the Fourth International Conference of the Learning Sciences (pp. 273–279).
Mahwah, NJ: Lawrence Erlbaum Associates. (*Siegel, M., Derry, S. J., Kim, J. B.,
Steinkuehler, C. A., Street, J., Canty, N., Fassnacht, C., Hewson, K., Hmelo, C. E., & Spiro,
R.)
Suthers, D. (2001). Towards a systematic study of representational guidance for collaborative
learning discourse. Journal of Universal Computer Science, 7(3), 254–277.
Walther, J. B., & Burgoon, J. K. (1992). Relational communication in computer-mediated
interaction. Human Communication Research, 19(1), 50–88.
Wason, P. C., & Johnson-Laird, P. N. (1972). Psychology of reasoning: Structure and content.
Cambridge, MA: Harvard University Press.
Wortham, D. (1999). Nodal and matrix analyses of communication patterns in small groups.
In C. M. Hoadley & J. Roschelle (Eds.), Proceedings of the Computer Support for
Collaborative Learning (CSCL) 1999 Conference (pp. 681–683). Mahwah, NJ: Lawrence
Erlbaum Associates.
Correspondence. Constance A. Steinkuehler, Research Assistant, University of Wisconsin-Madison,
1025 West Johnson Street, Madison, WI 53706, USA. E-mail:
steinkuehler@facstaff.wisc.edu
Constance A. Steinkuehler is a Research Assistant at the University of Wisconsin-Madison.
Sharon J. Derry is a Professor at the University of Wisconsin-Madison.
Cindy E. Hmelo-Silver is an Assistant Professor at Rutgers University.
Matt DelMarcelle is a Project Assistant at the University of Wisconsin-Madison.
... TA's with minimal training often serve as tutors as part of their own doctoral training. As noted by Steinkuehler, Derry, Hmelo & DelMarcelle (2002), the high turnover rate endemic to such positions means that each semester, the course manager may have to start over with a new group of tutors. As is typical, the beginning TA in this segment had little teaching experience. ...
... This system provides scaffolding to guide individual students and groups through pbl problems, and both training and scaffolding to support the tutoring process and help TA's function as online facilitators. An earlier pilot version of STEP pbl was described in Steinkuehler et al. (2002). The following description of the substantially revised system that is currently in use at UW-Madison and Rutgers University is taken from ...
Chapter
Full-text available
Our chapter relates to an ongoing and continuously evolving research and development project that has as its goal the design of a socio-technical system (a technical environment and related social structures and activities) that will constitute a good model for distributed teacher professional development programs conceptualized as knowledge-building communities. We focus primarily on a part of our work that is situated within the Secondary Teacher Education Program at the University of Wisconsin- Madison. We begin by describing the original ambitious vision for this pro- gram that we set out to implement, including its theoretical basis. Then we discuss how both our initial failures and the theoretical framework itself led us to more carefully consider how the historical and institutional contexts of such community-building efforts might influence the social processes of learning and teaching within the community. To illuminate this idea, we present a contextual analysis of the program as a prelude to an interaction analysis of a representative discourse from a group learning activity within the program. Throughout this chapter, we consider lessons learned from studies such as these and from our immersion in the experience of designing a sociotechnical environment for supporting community-based teacher education. Drawing on these lessons, we describe our modified goal and the latest results of our efforts to develop an online system for structuring and supporting group learning, including the online mentoring of such learning, within teacher education programs.
... Concretando más, hemos adoptado un enfoque activo de tipo «Problem/Project-Based Learning» -es decir, de tipo híbrido- (Prince & Felder, 2006), que ha mostrado su eficacia en contextos académicamente cercanos, como por ejemlo los programas de formación del profesorado (Steinkuehler, Derry, Hmelo-Silver & Delmarcelle, 2002;Goodnough & Hung, 2008). Así, el desarrollo de la asignatura de Didáctica de las Ciencias de la Naturaleza incorpora una selección de problemas abiertos contextualizados en el trabajo docente -coherentes con las categorías del CDC de Magnusson & otros (1999)-, que permiten abordar tanto los aspectos curriculares escolares (contenidos, competencias, metodología, evaluación…) como aspectos más teóricos de la DiCieNat. ...
Research Proposal
Full-text available
Proyecto docente presentado en un concurso a una plaza de profesor titular convocado por la U. Complutense el 24 de noviembre de 2020.
... Integrating case-based methodology in teacher professional development (PD) can be an effective way of communicating the detailed, interrelated processes necessary to unpack the multidimensional nature of what students and teachers do in classrooms (Briza et al., 2007). Case-based PD emphasizes the transformation of theoretical knowledge into theoretically informed practice (Briza et al., 2007) and involves the practice of using instructional cases (elaborate scenarios) as part of the curriculum or as the central focus of the curriculum in PD; specifically, the cases include teaching and learning situations that stress a variety of viewpoints and potential outcomes (Andrews, 2002;Merseth, 1996;Steinkuehler et al., 2002). ...
Article
This article is a qualitative study that explored bilingual/ESL science teachers’ reflections about subject matter knowledge (SMK) and instructional practice in the elementary grades. Thirty-three teachers were part of a 5-year professional development program designed to enhance instructional quality and effectiveness to teach English learners. Case-based instruction was used in their science coursework to present teachers with authentic learning experiences about science teaching and learning to foster critical reflection about theorizing SMK into instructional practice. Using semi-structured interview methods we collected data to analyze teachers’ reflections about science SMK and instructional practice. Teachers were interviewed using content-specific cases (vignettes) about density, circuits, food webs, and heat. Teachers were asked to identify underlying concepts case teachers/students held in each vignette. They were then asked to critically reflect on the instructional practices of the case teacher to reflect on how they would either teach the content similar to or different than the case teacher. Findings revealed that most teachers disagreed with the less than ideal teacher practices presented in the cases. After identifying underlying concepts, teachers critically reflected to suggest more effective ways of teaching science content to English Learners (ELs). Hands-on learning, systemic teaching, disciplinary language use in inquiry design, and differentiating content were emphasized as more effective instructional practices by cohort teachers. Implications suggest that even the most minimal reflection about content provides rich opportunities for reflection about pedagogy, making it an essential component of professional development. KEYWORDS: Case-based instruction, English learners, teacher reflection, subject matter knowledge, instructional practice
... These types of hints can be incorporated into an online tutor tool kit. Another solution would be using a distributed PBL system (Steinkuehler et al., 2002). ...
Chapter
Full-text available
The tutorial process is at the heart of PBL. In addition to the acquisition of knowledge and conceptual understanding relevant to the given problem, we believe the tutorial group also has positive cognitive and motivational effects on students' learning. There are variations of the PBL tutorial process, based on Barrows' model. This chapter describes the tutorial process we used with our students.
... As follows from the theoretical framework presented, the construction of PCK is approached through the resolution of professional problems, adopting an inquiry-based methodology so that the training model is strongly linked to the school model advocated for the teaching and learning of science at the secondary level (Comisión Europea, 2007). This takes shape in an active "Problem/Project-Based Learning" approach (i.e., a hybrid one) (Prince & Felder, 2006), a method that has proven effective in other teacher education programs (Steinkuehler, Derry, Hmelo-Silver & Delmarcelle, 2002; Goodnough & Hung, 2008). Thus, the development of the didactics courses incorporates a selection of open-ended problems contextualized in teachers' work, consistent with the PCK categories of Magnusson et al. (1999), which make it possible to address school curriculum aspects: contents, competences, methodology, assessment, and so on. These problems in turn form part of a project (a Teaching Unit, UD) that integrates all the aspects addressed in the courses. ...
Article
Full-text available
Research in science education has given rise to a variety of proposals and activities for students at different school levels. However, this is not the case for the educational training of future teachers, especially at the secondary level. With the intention of contributing to this issue, this paper presents a grounded experience in the development of the Physics and Chemistry Education subjects of the Spanish Master's in Secondary Education, including their specific schedules and a series of activities useful for the teacher training process. Starting from future teachers' beliefs, this proposal is based on their construction of Pedagogical Content Knowledge through solving contextualized professional problems. To that end, the design of Teaching Units is used as a reflective tool in the process.
Article
Full-text available
Research in science education has produced numerous proposals and activities for the scientific education of schoolchildren at different levels, but the situation is different for the didactic training of future teachers, especially at the secondary level. With the intention of addressing these gaps, this paper presents a grounded experience in the development of the Physics and Chemistry Education subjects of the Spanish Master's in Secondary Teacher Education (MFPS), and provides their specific scheduling and a description of a series of activities that we consider useful for initial teacher training. This proposal, which starts from beliefs and alternative conceptions, focuses on the construction of future teachers' Pedagogical Content Knowledge (PCK) through the resolution of contextualized professional problems, using the design of Teaching Units as a reflective tool.
Article
In this information age, technology such as the internet has a profound effect on peer relationships and interpersonal understanding. The study incorporates the views of various authors on the subject. There are advantages and disadvantages to using electronic communication for interpersonal understanding. It promotes better understanding, cooperation, and closer peer relationships among students and teens. However, it also has a darker side: it can lead to cyberbullying. Information and communication technology has transformed the classroom through the use of videos and other media in the teaching and learning process, and students have become more sophisticated in applying electronic devices to their academic work. The findings show that the majority of students prefer to use e-mail in their interactions. It is recommended that teachers and parents monitor students to ensure there is no abuse or misuse of technology.
Article
Self-directed learning is an important skill highlighted in 21st-century learning, and developing this skill through problem-based learning (PBL) is seen as potentially effective. However, PBL is still not widely implemented in Malaysian classrooms. The integration of face-to-face and online learning, known as blended problem-based learning (BPBL), can potentially improve PBL by enhancing the teacher's and students' roles in self-directed learning. This research therefore investigates the roles of the teacher and students in one part of the self-directed learning process, formulating learning issues, in BPBL, using a basic qualitative approach. Data were collected from observations, interviews, and documents (FILA charts). Using purposive sampling, twenty-five (25) students and a teacher from a school in Johor district were selected as the sample. The students were divided into five groups, and the teacher's and students' roles in two randomly selected groups were compared. The results indicated that the teacher faced difficulties in monitoring students' progress and that some students were passive in BPBL.
Article
Full-text available
A 2 × 2 factorial design was used to explore the process and outcome of small-group problem-solving discussions for two modes of communication (face-to-face and computerized conferencing) and two types of tasks (a qualitative human relations task and a scientific ranking test with a criterion solution). Interaction process was coded using Bales' Interaction Process Analysis. During the same elapsed time, there were two to three times as many communication units in the face-to-face groups of five members each as in the computerized conferencing mode. Group decisions were equally good in the two modes, but the groups were less likely to reach agreement in the computerized conferencing mode. There were proportionately more of the types of task-oriented communication associated with decision quality in the computerized conferences.
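For readers unfamiliar with this kind of design, the sketch below illustrates how a 2 × 2 factorial comparison of communication-unit counts (communication mode × task type) might be analyzed with a two-way ANOVA. It is a minimal, purely illustrative example: the column names, group counts, and data values are invented and are not taken from the study.
```python
# Hypothetical 2 x 2 factorial analysis (communication mode x task type) of
# per-group communication-unit counts. All numbers are invented for illustration.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Each row is one small group: its communication mode, its task type, and the
# total number of coded communication units for that group's discussion.
data = pd.DataFrame({
    "mode": ["face_to_face"] * 4 + ["conferencing"] * 4,
    "task": ["human_relations", "human_relations", "ranking", "ranking"] * 2,
    "units": [310, 295, 340, 325,   # illustrative: face-to-face groups produce more units
              120, 105, 150, 140],  # illustrative: conferencing groups produce fewer units
})

# Two-way ANOVA with interaction: do unit counts depend on mode, task, or their combination?
model = ols("units ~ C(mode) * C(task)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```
With data of this shape, the ANOVA table reports main effects for mode and task and their interaction; in the actual study the comparison of interest was the large main effect of communication mode on the volume of communication.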
Chapter
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 453-494). Hillsdale, NJ: Lawrence Erlbaum Associates.
Article
The use of computer conferencing by higher education institutions offering distance education courses has expanded rapidly since 1987. Despite a parallel growth in academic scholarship, few studies have examined students' experience of learning in an on-line course. Using unstructured interviews and observations at adult students' homes or worksites, the study investigated adult students' perspectives on distance study by computer conferencing. It found these adults actively engaged in social relationships outside their distance studies that sustained their educational pursuits. The students provided insights into aspects of the on-line environment: asynchronicity, interactivity, textual communication, and collaboration. Their learning orientation suggested the value they placed on conference activities. They incidentally transferred or invented learning strategies to deal with the different dynamics of this instructional environment. The computer conference brought together widely dissimilar students and encouraged them to 'talk' with one another while unaware of each other's physical attributes. On-line relationships served meaningful purposes but rarely continued beyond the course. Based on these findings, the study presents an Adult Distance Study Through Computer Conferencing Model to guide understanding of the student experience with this medium.
Article
"'Is there any other point to which you would wish to draw my attention?' 'To the curious incident of the dog in the night-time.' 'The dog did nothing in the night-time.' 'That was the curious incident,' remarked Sherlock Holmes." The quotation from A. Conan Doyle with which this book begins is a delightfully appropriate summation of the authors' point of view, garnered from their fifteen years of experiments on the psychology of reasoning. Dr. Wason and Dr. Johnson-Laird are intrigued by the extent to which most individuals can be considered naturally rational thinkers. They present here the surprising results of their comprehensive investigations of how humans draw explicit conclusions from evidence. "Given a set of assertions," the authors write, "to what extent can the individual appreciate all that follows from them by virtue of logic alone, and remain unseduced by plausible, but fallacious conclusions? We are not concerned with whether these assertions are true or false, nor with whether the individual holds them among his beliefs, nor with whether they are sane or silly." At the core of the Psychology of Reasoning is a vigorous discussion that incorporates various illustrations--some of them humorous, all of them fascinating--of the use of reason under a wide variety of different conditions. Particular emphasis is placed on the difficulties involved in dealing with negatively marked information that must be combined and used with other information for reaching conclusions. Thorough treatment is given as well to the search for plausible contexts that will render anomalous or ambiguous statements "sensible." The authors have strived to isolate the components of inference, the basic steps of any kind of deductive activity, in order to determine the psychological processes involved in them. What has been the outcome of this research? Dr. Wason and Dr. Johnson-Laird conclude, "our research has suggested that the individual's logical competence may be either enhanced or limited by performance variables. And, of these, content has turned out to be vitally important for revealing, or obscuring structure. At best, we can all think like logicians; at worst, logicians all think like us."