
What Happens When Teachers Design Educational Technology? The Development of Technological Pedagogical Content Knowledge


Michigan State University
We introduce Technological Pedagogical Content Knowledge (TPCK) as a
way of representing what teachers need to know about technology, and argue
for the role of authentic design-based activities in the development of this
knowledge. We report data from a faculty development design seminar in
which faculty members worked together with masters students to develop
online courses. We developed and administered a survey that assessed the
evolution of student- and faculty-participants’ learning and perceptions about
the learning environment, theoretical and practical knowledge of technology,
course content (the design of online courses), group dynamics, and the growth
of TPCK. Analyses focused on observed changes between the beginning and
end of the semester. Results indicate that participants perceived working
in design teams to solve authentic problems of practice to be useful,
challenging, and fun. More importantly, the participants, both as individuals
and as a group, appeared to have developed significantly in their knowledge
of technology application, as well as in their TPCK. In brief, learning by
design appears to be an effective instructional technique to develop deeper
understandings of the complex web of relationships between content,
pedagogy and technology and the contexts in which they function.
What do teachers need to know about technology and how can they acquire
this knowledge? These questions have been at the center of intense debate in the
recent past (e.g., Handler & Strudler, 1997; Wise, 2000; Zhao, 2003; Zhao
& Conway, 2001). There is, however, little clarity about what form this
technological knowledge should take, and how it should be acquired. We offer
one perspective that considers the development of Technological Pedagogical
Content Knowledge (TPCK) within a Learning Technology by Design seminar.

© 2005, Baywood Publishing Co., Inc.
Our approach toward technology integration values rich knowledge about
how technology, pedagogy, and content interact with one another, as well as
an understanding of the unique affordances of the Learning by Design approach
to foster the development of these integrated knowledge structures. These ideas
have been covered in greater depth elsewhere (Koehler & Mishra, 2005;
Koehler, Mishra, Hershey, & Peruski, 2004; Koehler, Mishra, & Yahya, 2004;
Koehler, Mishra, Yahya, & Yadav, 2004; Mishra & Koehler, 2003, in press a,
in press b, in press c). However, because our rationale for conducting this
study requires an understanding of these multiple (and interrelated) ideas, we
use the following sections to broadly introduce these foundational strands before
presenting a more in-depth and detailed explanation of the design experiment
and our findings.
It is becoming increasingly clear that merely introducing technology to the
educational process is not enough to ensure technology integration since tech-
nology alone does not lead to change. Rather, it is the way in which teachers use
technology that has the potential to change education (Carr, Jonassen, Litzinger,
& Marra, 1998). For teachers to become fluent with educational technology
means going beyond mere competence with the latest tools (Zhao, 2003), to
developing an understanding of the complex web of relationships between
users, technologies, practices, and tools. Thus we view technology as a knowledge
system (Hickman, 1990) that comes with its own biases and affordances
(Bromley, 1998; Bruce, 1993) that make some technologies more applicable in
some situations than in others. In summary, we view teacher knowledge about
technology as important, but not as separate from or unrelated to the contexts of
teaching; that is, it is not only about what technology can do, but also, and
perhaps more importantly, what technology can do for them as teachers.
Consistent with this situated view of technology, we have proposed a framework
describing teachers’ understanding of the complex interplay between
technology, content, and pedagogy (Koehler, Mishra, Hershey, & Peruski,
2004; Mishra & Koehler, in press a, in press b, in press c). In our framework,
we have built upon Shulman’s (1986, 1987) work describing Pedagogical
Content Knowledge, to highlight the importance of Technological Pedagogical
Content Knowledge (TPCK) for understanding effective teaching with
technology (see Mishra & Koehler, in press c, for a more complete discussion
of these issues). Our perspective is consistent with other approaches that have
attempted to extend Shulman’s idea of Pedagogical Content Knowledge (PCK)
to the domain of technology (for instance see Hughes, 2005; Keating & Evans,
2001; Lundeberg, Bergland, Klyczek, & Hoffman, 2003; Margerum-Leys &
Marx, 2002).
At the core of our framework (see Figure 1) are three areas of knowledge:
Content, Pedagogy, and Technology.
Content (C) is the subject matter that is to be learned/taught. High school
mathematics, undergraduate poetry, 1st grade literacy, and 5th grade history are
all examples of content that are different from one another.
Technology (T) encompasses modern technologies such as computers, the
Internet, digital video, and more commonplace technologies including overhead
projectors, blackboards, and books.
Pedagogy (P) describes the collected practices, processes, strategies, procedures,
and methods of teaching and learning. It also includes knowledge about
the aims of instruction, assessment, and student learning.
However, our approach goes beyond seeing C, P, and T as being useful
constructs in and of themselves. Our approach emphasizes the connections and
interactions between these three elements. For instance, considering P and C
together we get Pedagogical Content Knowledge. This is similar to Shulman’s
(1987) idea of knowledge of pedagogy that is applicable to the teaching of
specific content.

Figure 1. The components of Technological Pedagogical Content Knowledge.

This would include representation and formulation of concepts,
pedagogical techniques, knowledge of what makes concepts difficult or
easy to learn, knowledge of students’ prior knowledge, and theories of epistemology.
Similarly, T and C taken together yield the construct Technological
Content Knowledge, useful for describing teachers’ knowledge of how a subject
matter is transformed by the application of technology (e.g., the use of simulations
in physics). T and P together describe Technological Pedagogical Knowledge,
or knowledge of how technology can support pedagogical goals.
Finally, if we jointly consider all three elements (T, P, and C), we get Technological
Pedagogical Content Knowledge (TPCK). True technology integration,
we argue, is understanding and negotiating the relationships between these three
components of knowledge (Bruce & Levin, 1997; Dewey & Bentley, 1949;
Rosenblatt, 1978). Good teaching is not simply adding technology to the existing
teaching and content domain. Rather, the introduction of technology causes
the representation of new concepts and requires developing a sensitivity to the
dynamic, transactional relationship between all three components suggested by
the TPCK framework.
Our conceptualization of teacher knowledge as being a complex web of
relationships between content, pedagogy, and technology has significant implications
for teacher learning and teacher professional development. Clearly,
instruction that focuses on only one of these items at a time would be relatively
ineffectual in helping teachers develop an understanding of how these knowledge
bases relate to each other. For instance, technology workshops that focus on
the development of software and hardware skills do not help teachers understand
how technologies interact with particular pedagogies or specific subject
matters. We have argued that developing TPCK requires the design of a
coherent curricular system (Brown & Campione, 1996), not a collection of isolated
modules that focus on just one of the three knowledge bases at a given
moment. Developing TPCK requires a curricular system that would honor
the complex, multi-dimensional relationships by treating all three components
in an epistemologically and conceptually integrated manner. In response to these
needs, we have been experimenting with an approach we call learning technology
by design.
The learning technology by design approach is a constructivist approach
that sees knowing as being situated in action and co-determined by individual-
environment interactions (Brown, Collins, & Duguid, 1989; Gibson, 1986;
Roschelle & Clancey, 1992; Young, 1993).
Our approach builds on these ideas
by emphasizing the value of authentic and engaging ill-structured problems that
reflect the complexity of the real world (Marx, Blumenfeld, Krajcik, & Soloway,
1997; Pea, 1993). These problems serve as the context for learning about educational
technology. For instance, recent design-based seminars we have conducted
have focused on the design of online courses. The participants in the design
teams have to actively engage in inquiry, research and design, in collaborative
groups (that include higher education faculty members and graduate students)
to design tangible, meaningful artifacts (such as the website, syllabus, and assignments
for an online course) as end products of the learning process. The
open-ended nature of design problems prevents us (the instructors) from too
narrowly specifying what technologies will be needed. This means that
the participating teachers have to learn specific hardware and software skills
as and when needed by their evolving project. Design is the anchor around
which the class (and learning) happens. The evolving artifact is also the test
of the viability of individual and collective understandings as participants test
theirs, and others’, conceptions and ideas of the project. And finally, the main
role of the instructor in such an environment is that of a facilitator and problem
solving expert rather than an expert in the content. Learning in this context
involves becoming a practitioner, not just learning about practice (Brown &
Duguid, 1991).
Most significantly, by participating in design, teachers build something that
is sensitive to the subject matter (instead of learning the technology in general)
and the specific instructional goals (instead of general ones). Authentic tasks
do not respect disciplinary boundaries. Therefore, every act of design is always
a process of weaving together components of technology, content, and pedagogy.
Moreover, the ill-structured nature of most authentic pedagogical problems
ensures that there are multiple ways of interpreting and solving them. Thereby,
teachers are more likely to encounter the complex and multiple ways in which
technology, content, and pedagogy influence one another instead of thinking
about rigid rules that imply simple cause-effect relationships between these
components (Mishra, Spiro, & Feltovich, 1996).
In this, the learning by design approach is philosophically and pragmatically aligned to other
project-based approaches such as learning-by-doing, problem-based learning, collaborative learning
frameworks, and design-based-learning (Blumenfeld, Marx, Soloway, & Krajcik, 1996; Blumenfeld,
Soloway, Marx, Krajcik, Guzdial, & Palincsar, 1991; Dewey, 1934; Papert, 1991; Roth, 1995; Roup,
Gal, Drayton, & Pfister, 1993). Similarities can also be found between our approach and those adopted
by other advocates of design-based-learning (Perkins, 1986; Blumenfeld, Soloway, Marx, Krajcik,
Guzdial, & Palincsar, 1991; Brown, 1992; Harel & Papert, 1990, 1991; Kafai, 1996). Learning by
design has been shown to lead to rich, meaningful learning in a variety of contexts, including the
development of presentations, instructional software, simulations, publications, journals, and games
(Carver, Lehrer, Connell, & Erickson, 1992; Kafai, 1996; Kafai & Resnick, 1996; Lehrer, 1993).
So far, we have offered an argument for design-based approaches as a means of
helping teachers develop situated and nuanced understandings of the relationship
between pedagogy, content and technology. However, this is not a statement that
has to be accepted at face value. Whether students develop TPCK is an
empirical question, and one that we have addressed in our research.
The development of the TPCK framework has been part of a multi-year design
experiment (Brown, 1992; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003;
Design-Based Research Collective, 2003), aimed at helping us understand
teachers’ development toward rich uses of technology (i.e., develop theory)
while simultaneously helping teachers (both K-12 teachers and university faculty)
develop their teaching with technology (i.e., inform practice).
In a previous publication (Koehler, Mishra, Hershey, & Peruski, 2004), we
presented a case study of a college faculty member (Dr. Shaker) as she worked
with her design team to create an online course. Our analysis revealed important
changes in Dr. Shaker’s technological literacy and her thinking about her personal
relationship with technology. In accounting for these changes, we hypothesized
that the learning by design approach afforded rich opportunities for Dr. Shaker
(and her other team members) to deeply consider the relationships between
content, pedagogy, and technology.
In later work (Koehler, Mishra, Yahya, & Yadav, 2004), we looked more
closely at the manner in which TPCK develops through participation in a design-
based activity. Qualitative and quantitative analyses of 15 weeks of field notes
for two of the design teams showed that participants moved from considering
technology, pedagogy and content as being independent constructs toward a
richer conception that emphasized connections among the three knowledge bases.
Our analyses suggested that developing TPCK is a multigenerational process,
involving the development of deeper understandings of the complex web of
relationships between content, pedagogy and technology and the contexts in
which they function.
Though these efforts have offered rich and detailed information about the
phenomenon (teacher knowledge around technology), such qualitative approaches
are time consuming and difficult to replicate. For this reason, we have attempted
to develop a survey instrument that would allow us to capture the essential
elements of the learning by design process. This is the focus of this article and
is described in greater detail below.
The Design of the Survey
We designed a survey that attempted to measure participants’ learning in one
of our learning by design courses. As part of the course, the design teams focused
on building an online course to be taught as part of the masters program in the
college of education (more about the design course later).
The survey instrument we designed attempts to address three broad questions:

1. Students’ perceptions of the learning environment (i.e., the learning technology
by design approach). In particular, this section focuses on (a) participants’
perceptions of the time and effort they spent in this course;
(b) participants’ subjective judgments about the learning experience and the
amount and value of theoretical and practical knowledge; and (c) the manner
in which design teams function, with particular emphasis on the contributions
of various team members and their roles.

2. The evolution in participant thinking regarding different aspects of online
education. Because participants were charged with designing an online
course, one measure of success of the learning technology by design approach
is a change in participants’ thinking about the subject of their design challenge—in
this case, online teaching and learning. We would expect learners
to start with somewhat simplistic ideas about moving from face-to-face to
online teaching. However, we would also expect this to change with their
participation in the learning by design course.

3. The evolution over the course of a semester of the knowledge components
(and their relations) suggested by the TPCK framework. Our previous
research shows that participants’ thinking about technology integration gets
increasingly complex with time. However, these previous studies examined
group learning (as opposed to individual learning). Our survey distinguishes
between learning about TPCK at the individual and the group level.
The Design Course Context
We conducted this research within the context of a faculty development course
taught by the first author during the spring semester of 2003. In this class,
faculty members and graduate students worked collaboratively to develop online
courses to be taught the following year. The task of developing an online class
was an authentic one—the College of Education at our university began offering
an online Masters degree program, and courses had to be developed as part of
the online offerings. As we have described elsewhere, the format of this course
was created to integrate faculty development into our learning by design program
already in place in the educational technology masters program (Koehler, Mishra,
Hershey, & Peruski, 2004).
This particular instantiation of the “learning by design course” included four
faculty members and fourteen students. The faculty and students met once a
week for three hours in a computer lab. Students were assigned to groups led
by individual faculty members. A typical class period included a whole-group
component used to discuss readings and issues that applied to all groups, and a
small-group component in which the design teams worked on their semester-long
projects. In many ways, this design course was a typical graduate class experience
for the participants—they read articles, discussed ideas, and were responsible for
meeting course deadlines. However, there were some important differences. All
the participants (faculty members and teachers alike) worked collaboratively on
designing an online course. They were exposed to several technologies, assessed
their usefulness, and included some of them in the design of their online class.
The technologies used by the groups varied, depending on the content they
were covering and the pedagogical decisions they made. One group, for instance,
focused a great deal on researching potential ways for a faculty member to provide
audio feedback to online students. Another group investigated the use of Web-
based PowerPoint presentations to offer overviews of online lessons to be covered.
Groups also explored a range of pedagogical issues relevant to the course they
were designing, including techniques for developing online learning communities
and strategies for incorporating problem-based learning in online settings. All
of the groups learned about the principles of effective Web design as well as
issues related to copyright and privacy. This knowledge was shared with the
larger class through whole-group discussions as well as through online critiques
of work done by other groups. There were a few intermediary deadlines imposed
by the instructor, but for the most part, the groups worked at their own pace to
complete the design of the course by the end of the semester.
Clearly, the most important part of the class was the small group design
work aimed at developing a prototype of an online course. The design task went
beyond creating a Website for the course and required the faculty members
and students to work together to develop the syllabus, the course structure, the
readings, student assignments, and assessment rubrics. They had to determine the
nature of student interaction, how the course content would be offered and
delivered, how technology would be used to accomplish course goals, and how
the course Website would be designed to make it both user-friendly and fit
with course content and pedagogy.
Data for the present study comes from surveys completed by four faculty
members (2 male and 2 female) and 13 students (9 male and 4 female). One
student in the class chose not to participate in the research. Participants agreed
to allow artifacts created during the course to be used as research data following
the completion of the course. They were not reimbursed for their time.
As part of the course, participants completed an online survey four times during
the course of the semester (week 1, week 4, week 8, and week 13 of the course).
Part of students’ grades were dependent on completing the surveys (but not on the
content or quality of the answers). Surveys were submitted with participant
names to the teaching assistant for the purpose of grading (to record whether or
not students completed the assignment). After the course was completed, the teaching
assistant permanently removed the names from the survey data (by deleting the
appropriate column in the database) and forwarded the data on to the instructor.
Thus the surveys were anonymous to the instructor and were not shared
until after the course was over. Students (and faculty) were aware of this procedure
and were encouraged to submit honest answers to better inform course designers
as to the processes underlying the course.
Each survey consisted of 35 questions, and took less than 15 minutes to
complete. Two questions were short answer (e.g., “Please write a short paragraph
summarizing what is your role in the group” and “Please write a short paragraph
summarizing how your group has been functioning”), and 33 questions used a
7-point Likert scale to rate the extent to which participants agreed or disagreed
with statements about the course (e.g., “Our group has had to find different ways
of teaching this content online”). The content of each survey question is detailed
in the results and discussion sections that follow.
Data Analysis
In this article, we report on the analysis of the data collected by the second
and fourth administration of the survey. The results of the third survey were
lost due to a server crash as a result of a virus/worm that spread throughout
the campus. We chose to include weeks 4 and 13 (and not week 1) because
these two weeks were more representative of the design process. It
took a few weeks for the groups to be formed and the participants to engage in
the design tasks. Because many of the questions in the survey were related to the
design process, we decided not to include the data from week 1, since at that
time the participants would not have had any experience with their design teams
and design acts.
Results were analyzed as matched-pair means (t-tests) for each of the 33
survey questions. For each pre-post difference, we also report p-values and
Cohen’s M measure of practical significance for descriptive purposes. However,
in order to control for the overall experimental Type I error (because we conducted
33 comparisons), we also indicate which findings are significant if we set the
experimental error rate at alpha = .05, using a sequential Holm procedure, so
that the largest effect was tested at .05/33 (directional, 1-tailed test), the next effect
was tested at .05/32, and so on. Once one test fails to reach significance, testing
stops and the remaining contrasts are declared non-significant.
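The sequential Holm procedure described above can be sketched in a few lines of Python. This is an illustrative reconstruction of the step-down logic only, not the authors’ actual analysis code, and the function name is our own:

```python
def holm_significant(p_values, alpha=0.05):
    """Sequential (step-down) Holm procedure over m comparisons.

    The smallest p-value is tested at alpha/m, the next smallest at
    alpha/(m - 1), and so on.  As soon as one test fails to reach
    significance, testing stops and all remaining (larger) p-values
    are declared non-significant.
    Returns a list of booleans in the original order of p_values.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # indices, ascending p
    significant = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            significant[idx] = True
        else:
            break  # testing stops; the rest stay non-significant
    return significant
```

With 33 comparisons, the largest effect is tested at .05/33, the next at .05/32, and so on, matching the procedure reported above.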
We report our findings for the survey by considering the questions organized
by the following themes: participants’ perceptions of the learning environment;
participants’ perceptions about online education; and the evolution of the knowledge
components suggested by the TPCK framework, both from an individual-
and group-level perspective. Where appropriate, we annotate our findings with
quotations from the two short-answer questions to help interpret the findings
supported by the quantitative data.
Students’ Perceptions of the Learning Environment
It has been our experience that the initial stages of the learning by design
approach are confusing, chaotic, and somewhat frustrating to participants. As
groups work to collaboratively define goals, set priorities, and achieve a vision
for their project, many students feel like very little is actually getting done.
We suspect that this has much to do with how students have become accustomed
to completing coursework—they expect to work on their own, to meet well-defined
goals that are clearly laid out in the syllabus, and to turn coursework in
to be graded regularly.
The first couple of weeks of the course result in very few concrete accomplishments
and, instead, are characterized more by group conversations. Several
survey questions asked participants to characterize the kinds (and amount) of
work they were doing (Table 1, Questions T1-T5). In general, at this early stage
of the course, participants do not feel like they are working hard, probably as
a result of their frustration with the chaotic design process (note, 1.0 indicates
strong agreement with the statement and 7.0 indicates strong disagreement with
the statement). This carries over to their effort in other phases of the course,
including their work with the course readings and involvement in the online
discussion boards about the readings that were required each week. Participants
rate their individual and their group-related activity similarly. One participant
summed up this phase of the course in her response to a question about her role
in the group: “Undefined—I feel like we are not doing much.”
A similar effect can be found in participants’ ratings for the four questions
focused on participants’ perception of the amount, type, and enjoyment of their
learning in the course (questions L1-L4). Initially, the course is not enjoyable:
participants feel as though they are learning some theoretical knowledge but not
practical skills, and do not feel as though they are learning as much as
they expected. Again, this is not totally unexpected given the relatively ill-defined
nature of the early goal-setting period of the collaborative design process. Our
experiences with teaching this course have led us to believe that how a group
forms, develops, and learns to work together is very important in not only
developing a final design, but also in the learning that results from the process.
We are not surprised, based upon our prior experiences, that the results from
Table 1. Descriptive Statistics for Questions about Students’ Perceptions
of the Learning Environment

(Columns: Week 4 Ave & Std. Dev.; Week 13 Ave & Std. Dev.; Pair t, df = 14;
p-Value. The numeric entries are not reproduced here; the surviving p-Value
column reads, top to bottom: < .001*, < .001*, < .001*, = .067, < .01*, < .001*,
= .087, < .001*, < .001*, < .001*, < .001*, < .001*, > .05, < .01*.)

Time and Effort Questions
T1 – Overall, I have been working very hard in this course
T2 – I have spent a lot of my time and effort doing the readings in this course
T3 – I have spent a lot of time and effort in the online discussions for the course
T4 – I have spent most of my time and effort working alone and independently
T5 – I have spent most of my time and effort working in groups

Learning and Enjoyment Questions
L1 – I am enjoying my experience in this course
L2 – I am learning a lot of theoretical knowledge in this course
L3 – I am learning a lot of practical knowledge in this course
L4 – I am learning more than I expected

Group Functioning Questions
G1 – As a whole, my group values my input, thoughts, efforts, and work
G2 – Overall, our group is functioning very well
G3 – Everyone in our group is making a significant contribution
G4 – Our group is getting a lot of work done
G5 – Our group is not accomplishing as much as we hoped
G6 – Our group is having a lot of fun

*The change between Week 4 and Week 13 is statistically significant, using an overall
experimental error-rate of alpha = .05, using a sequential Holm procedure.
Week 4 for the six questions about group functioning (G1-G6) show that participants
generally characterize their groups as: Not valuing members’ efforts, not
functioning well, not getting a lot of work done, not accomplishing according to
expectations, and not enjoying themselves. One participant wrote about this
portion of the course: “We have really had problems getting our work started. We
have worked with our instructor, but there isn’t much content developed for the
course yet, so our work seems very slow.” Another noted, “Vision . . . Vision . . .
Vision...lacking in this area a little.” Much of the work of the course instructor
at this point is structuring activities that keep groups on task, help build
team skills, and lead to better things down the road.
Near the end of the course, it seems much has changed (Table 1 and Figure 2).
By week 13, participants rate themselves as working harder on projects, readings,
and discussions, and doing more collaboration; this change is statistically significant
as well as practically significant. Here Cohen’s M is used as a measure of
practical significance, where M = .2 represents small effects, M = .5 represents
medium effects, and M = .8 indicates a large effect. On this measure of practical
significance, we see “very large” effects (M ranging from 0.93 to 4.04) for the
statistically significant changes (Cohen, 1977).
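The article does not spell out how Cohen’s M was computed. For matched pairs, one standard effect-size convention divides the mean pre-post difference by the standard deviation of those differences; the sketch below illustrates that convention and should not be read as the authors’ exact formula (the function name is ours):

```python
import math

def paired_effect_size(week4, week13):
    """Standardized mean difference for matched pairs: the mean of the
    pairwise (week13 - week4) differences divided by their sample
    standard deviation."""
    diffs = [after - before for before, after in zip(week4, week13)]
    n = len(diffs)
    mean = sum(diffs) / n
    variance = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(variance)
```

Under the thresholds cited above, a result of .2 would count as a small effect, .5 as medium, and .8 as large.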
These effects confirm our own experiences teaching in the learning by design
approach—the initial discomfort with the approach gradually is replaced with a
feeling of deep accomplishment, and a recognition that working collaboratively
on ill-defined problems is a legitimate (and rewarding) way to engage in learning
about (and with) technology. It is interesting to note that the growing familiarity
and experience with the design approach leads to a more favorable engagement
with course readings and discussions. It is also worth noting that participants’
perception of the work they do independently does not significantly change with
experience in the approach. The same student who earlier noted the “undefined”
work the group was doing wrote by week 13: “I started out doing everything
for the group and it got very taxing on me....What I learned was that I didn’t
need to do that, and that I could actually focus on the pedagogical issues that I
wanted to and get a really fulfilling experience.”
It has been our experience that by week 8 or 9, the groups begin to “click” and
assignments come from within more than from without. This is reflected in the
change by week 13 of the course (Table 1 and Figure 2). On every measure, the
participants report better group functioning, more enjoyment, better participation,
and better fit within the group. The same student who reported the lack
of “Vision” at this point noted “Our group is functioning well....Everyone
has found a niche or place to fit in...[and] is making contributions toward
the progress and completion of our web course.”
Although we knew group functioning to be an important part of the learning
by design approach, we were surprised by the magnitude of its importance for
every aspect of student learning. Most of the comments written in the two
short-answer questions were about group functioning (or not functioning).

Figure 2. Average rating by week for survey questions about students'
perceptions of the learning environment.

Furthermore, a post-hoc analysis (performed as correlations among survey items)
showed that participants' rating of "Overall, our group is functioning well"
statistically predicted every other rating in the survey except for the item
about "working individually." This indicates that group functioning can be seen
as the gateway through which learning happened. Groups that got along and had
more fun accomplished more, learned more, and got more out of the class. With
this in mind, it is important to note that not all groups functioned equally
well. It is our experience that some groups "gel" early, others not until the
middle of the course, and others never seem to fully connect, even by the time
the course is over (Koehler, Mishra, Hershey, & Peruski, 2004).
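The kind of post-hoc correlational check described above can be sketched as follows; the ratings here are fabricated for illustration and do not come from the study's survey data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant ratings of the group-functioning item
group_functioning = [6, 5, 7, 4, 6, 3, 5, 7]
# Hypothetical ratings of two other survey items
items = {
    "enjoyment":            [6, 5, 7, 3, 6, 3, 4, 7],
    "working_individually": [5, 4, 5, 5, 4, 5, 4, 5],
}
for name, ratings in items.items():
    print(f"{name}: r = {pearson_r(group_functioning, ratings):.2f}")
```

In this fabricated example the group-functioning item correlates strongly with enjoyment (r near 0.97) but essentially not at all with working individually (r near 0.02), mirroring the pattern of prediction reported above.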
Thinking about Technology: The Difference
between Online and Face-to-Face Courses
One way to assess whether the learning by design approach leads to deep
learning about technology is to examine the extent to which participants'
thinking changed about how technology (T) relates to the teaching and learning
(pedagogy, P) of a subject matter area. As argued earlier, designing an online
course introduces new technology into instruction and has the potential to
introduce "disequilibrium" in our TPCK model (Peruski & Mishra, 2004). That is,
it was our hope that designing an online course would allow students (and
faculty) to explore the relationships between technology, pedagogy, and content.
We hoped that participants would understand that the relationships are
not one way—technology is not merely applied to the pedagogy of the past;
rather, the introduction of technology has implications for how we teach
and what we teach.
Questions O1-O4 of the survey (Table 2) assessed this type of understanding
by asking participants about the differences between face-to-face and online
courses. Initially, participants see little or no difference between an online course
and a face-to-face course—both take about the same amount of time, there is
little need to change content or pedagogy, and the process of designing the two
types of courses is similar. In other words, the early survey results confirm our
suspicion that before designing an online course, participants have relatively
simple beliefs about the role of technology in education—technology is just a
new medium to be learned, and designing with technology is simply translating
previous content and pedagogy into that new medium.
Nine weeks later, participants have come to the opposite conclusion: They
agree that online courses require more time, that teaching online requires
changes in content and pedagogy, and that designing an online course is
different from designing a face-to-face course (see Table 2 and Figure 3).
As designers of this learning experience, we couldn’t be more pleased. Without
ever explicitly talking about developing a more nuanced understanding of the
role of technology in the way courses are taught, participants developed these
deeper connections on their own, as evidenced by their changed beliefs about
face-to-face and online teaching.
The Evolution of the Knowledge Components
Suggested by the TPCK Framework
To investigate the extent to which participants were learning about the categories
of knowledge suggested by our TPCK framework, we designed several survey
Table 2. Descriptive Statistics for Questions about Participant Thinking
about Aspects of Online Education (columns: Week 4 Ave & Std. Dev.;
Week 13 Ave & Std. Dev.; Pair t, df = 14; p-Value)

O1 – Designing an online course is a lot like designing a face-to-face
course (p < .001*)
O2 – Designing an online course is translating existing course content to
an online format (p < .001*)
O3 – Designing an online course requires changes in how we teach and what
we teach (p < .001*)
O4 – Teaching online requires more time than face-to-face (p < .001*)

*The change between Week 4 and Week 13 is statistically significant, using an overall
experimental error-rate of alpha = .05, using a sequential Holm procedure.
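For reference, the sequential Holm procedure named in the table note can be sketched as follows. This is a generic illustration with made-up p-values, not the study's analysis code:

```python
def holm_significant(p_values, alpha=0.05):
    """Holm step-down procedure controlling the familywise error rate.

    Sort the m p-values ascending; the i-th smallest (0-based) is tested
    against alpha / (m - i). Testing stops at the first failure, and all
    remaining hypotheses are retained.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    significant = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            significant[idx] = True
        else:
            break  # step-down: once one test fails, stop rejecting
    return significant

# Four hypothetical p-values tested at an overall alpha of .05
print(holm_significant([0.001, 0.019, 0.03, 0.04]))
# → [True, False, False, False]
```

Because the thresholds tighten with the number of tests, a raw p of .019 can survive Holm in one family of p-values and fail in another, which is why the tables mark significance with an asterisk rather than by p-value alone.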
Figure 3. Average rating by week for survey questions about
participant thinking about aspects of online education.
questions that asked participants to directly rate their engagement around these
ideas. We distinguish, however, between how individuals were functioning (and
reasoning), and how the design groups were interacting.
At the beginning, the class participants did not agree that they were thinking
differently about technology, nor did they feel as if they were gaining any
technology skills (Table 3, questions I-T1 and I-T2). They had difficulty
characterizing themselves as working on the technology, content, or pedagogy of
the course they were designing (questions I-C, I-P, and I-T). Here the group-level
trend probably doesn't characterize individual members very well: when
asked to report their roles in the course, some clearly state their role as "tech
guru" or "developing content." Likewise, the standard deviations of the ratings
are among the largest observed in this study. We take this to mean that there
is great variety in what individuals are thinking about in any given group.
However, overall (in terms of average rating), there does not seem to be an
initial high degree of engagement in any of the three main categories
(technology, content, or pedagogy).
We also examined how participants perceived the issues and ideas their design
groups were wrestling with, as measured by the TPCK framework (Figure 4). For
each of the knowledge components suggested by the framework, we designed at
least one question (Table 3: G-C, G-P, G-T, G-TC1, G-TC2, G-PC1, G-PC2,
G-TP, and G-TPC). Initially, participants did not see their group as grappling with
issues in any of these categories. We are somewhat surprised by the low ratings on
technology—it has been our impression that early discussions are dominated by
technology, since it is the new ingredient being considered in the design of
their course. Perhaps they don't see these discussions as deep, or worth reporting.
Regardless, it would seem that at the early stages, the move to online teaching is
not forcing the design groups to think about how technology and pedagogy, for
example, are related.
By the end of the semester, however, participants are much more likely to
indicate changes in their own thinking about content (C), pedagogy (P), and
technology (T). They are also more able to identify the development of concrete
technology skills within themselves. These changes are also reflected in the
group-level measures—there are statistically significant and practically "very
large" changes in every category of knowledge in the TPCK framework. Because
our framework goes beyond seeing C, P, and T as useful constructs in and of
themselves, stressing instead the importance of the connections and interactions
among these three elements of knowledge, the changes observed within the
complex relational forms are particularly relevant (i.e., G-TC1, G-TC2, G-PC1,
G-PC2, G-TP, and G-TPC).
Taken as a whole, our results indicate that the design approach in general,
or the task of developing an online course in particular, is well suited to
developing knowledge across the spectrum of reasoning suggested by the
TPCK framework.
Table 3. Summary Statistics for Questions about Individual and
Group Thinking about TPCK
Week 4 (Ave & Std. Dev.)    Week 13 (Ave & Std. Dev.)    Pair t (df = 14)    p-Value
Individuals TPCK Questions
I-T1 – I am learning a lot of practical
technology skills that I can use
I-T2 – I am thinking more critically
about technology than before
I-C – I have been thinking and
working a lot of the course content
I-P – I have been thinking and working
a lot on the pedagogy of the course
we are designing
I-T – I have been thinking and working
a lot on the technology of the course
we are designing
Group TPCK Questions
G-C – Our group has been thinking
and talking about the course content
G-P – Our group has been thinking
and talking about course pedagogy
G-T – Our group has been thinking
and talking about technology
G-PC1 – Our group has been considering
how course content and pedagogy
influence one another
G-TP – Our group has been considering
how course pedagogy and technology
influence one another
G-TC1 – Our group has been considering
how technology and course content
influence one another
G-TC2 – Our group has had to
modify course content in order to
adapt it to our online course
G-PC2 – Our group has had to find
different ways of teaching content online
G-TPC – Our group has chosen tech-
nologies to fit our course content and
the faculty member’s teaching
< .01*
< .01*
= .019
< .01*
< .01
< .001*
< .001*
< .001*
< .001*
< .001*
< .001*
< .001*
< .001*
*The change between Week 4 and Week 13 is statistically significant, using an overall
experimental error-rate of alpha = .05, using a sequential Holm procedure.
The idea of TPCK has significant implications for teacher education and
teachers' professional development. In order to go beyond the simple "skills
instruction" view offered by the traditional workshop approach, we have argued
that it is necessary to teach technology in contexts that honor the rich connections
between technology, the subject matter (content), and the means of teaching it (the
pedagogy). We have offered one possibility, the learning by design approach,
which explicitly foregrounds these connections. By participating in design, teachers
are confronted with building a technological artifact while being sensitive to the
particular requirements of the subject matter to be taught, the instructional goals
to be achieved, and what is possible with the technology. The idea of learning
by design is not a new one. However, we believe that the TPCK framework
provides yet another argument for the pedagogical value of such activities,
especially when considering the integration of educational technology in pedagogy.
In particular, the findings of our study indicate that participants find learning
by design approaches to be challenging and fun.
Figure 4. Average rating by week for survey questions about
individual and group thinking about TPCK.
More importantly, our data clearly show that participants in our design teams
moved from considering technology, pedagogy, and content as independent
constructs toward a more transactional and co-dependent construction that
indicated a sensitivity to the nuances of technology integration. In other words,
they showed a significant shift toward developing Technological Pedagogical
Content Knowledge, involving deeper understandings of the complex web of
relationships between content, pedagogy, and technology and the contexts
within which they function.
There are certain fundamental challenges in representing teacher knowledge
around technology (Fenstermacher, 1994), particularly as it develops in
"learning-by-design" seminars. The first challenge is that any representation
of teacher knowledge needs to reflect its collaborative, co-constructed nature.
Furthermore, TPCK develops by doing, through the dialogues and interactions
between the participants in design teams as they grapple with issues surrounding
content, pedagogy, and technology. Consequently, knowledge in such settings is
not static or fixed. In our previous work (Koehler, Mishra, Hershey, & Peruski,
2004; Mishra & Koehler, in press a, b) we offered some representations of
teacher knowledge around technology. However, these were often based on
detailed and time-intensive qualitative research. In this article we extend our work
by developing a survey questionnaire that allows us to observe both the process
and product of learning by design seminars. We see this survey instrument as
a useful tool for future research on the development of TPCK, as well as a
means of developing a better understanding of how design teams function.
Blumenfeld, P. C., Marx, R. W., Soloway, E., & Krajcik, J. (1996). Learning with peers:
From small group cooperation to collaborative communities. Educational Researcher,
25(8), 37-40.
Blumenfeld, P. C., Soloway, E., Marx, R., Krajcik, J., Guzdial, M., & Palincsar, A. (1991).
Motivating project-based learning: Sustaining the doing, supporting the learning.
Educational Psychologist, 26(3&4), 369-398.
Bromley, H. (1998). Introduction: Data-driven democracy? Social assessment of
educational computing. In H. Bromley & M. Apple (Eds.), Education, technology,
power (pp. 1-28). Albany, NY: SUNY Press.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in
creating complex interventions in classroom settings. The Journal of the Learning
Sciences, 2(2), 141-178.
Brown, A. L., & Campione, J. C. (1996). Guided discovery in a community of learners.
In K. McGilly (Ed.), Classroom lessons: Integrating cognitive theory and classroom
practice (pp. 229-270). Cambridge: MIT Press/Bradford Books.
Brown, J. S., & Duguid, P. (1991). Organizational learning and communities of practice:
Toward a unified view of working, learning, and innovation. Organization Science,
2(1), 40-57.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of
learning. Educational Researcher, 18(1), 32-42.
Bruce, B., & Levin, J. (1997). Educational technology: Media for inquiry, communication,
construction, and expression. Journal of Educational Computing Research, 17(1),
Bruce, B. C. (1993). Innovation and social change. In B. C. Bruce, J. K. Peyton, & T. Batson
(Eds.), Network-based classrooms (pp. 9-32). Cambridge, UK: Cambridge University Press.
Carr, A. A., Jonassen, D. H., Litzinger, M. E., & Marra, R. M. (1998). Good ideas to
foment educational revolution: The role of systematic change in advancing
situated learning, constructivism, and feminist pedagogy. Educational Technology,
38(1), 5-14.
Carver, S. M., Lehrer, R., Connell, T., & Erickson, J. (1992). Learning by hypermedia
design: Issues of assessment and implementation. Educational Psychologist, 27(3),
Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments
in education research. Educational Researcher, 32(1), 9-13.
Cohen, J. (1977). Statistical power analysis for the behavioral sciences (rev. ed.). New
York: Academic Press.
Design-Based Research Collective (2003). Design-based research: An emerging paradigm
for educational inquiry. Educational Researcher, 32(1), 5-8.
Dewey, J. (1934). Art as experience. New York: Perigree.
Dewey, J., & Bentley, A. F. (1949). Knowing and the known. Boston: Beacon.
Fenstermacher, G. D. (1994). The knower and the known: The nature of knowledge
in research on teaching. In L. Darling-Hammond (Ed.), Review of research in
education, Vol. 20 (pp. 3-56). Washington, D.C.: American Educational Research Association.
Gibson, J. J. (1986). The ecological approach to visual perception. Hillsdale, NJ: Erlbaum.
Handler, M. G., & Strudler, N. (1997). The ISTE foundation standards: Issues of
implementation. Journal of Computing in Teacher Education, 13(2), 16-23.
Harel, I., & Papert, S. (1990). Software design as a learning environment. Interactive
Learning Environments, 1(1), 1-32.
Harel, I., & Papert, S. (1991). Constructionism. Norwood, NJ: Ablex Publishing.
Hickman, L. (1990). John Dewey’s pragmatic technology. Bloomington, IN: Indiana
University Press.
Hughes, J. E. (2005). The role of teacher knowledge and learning experiences in forming
technology-integrated pedagogy. Journal of Technology and Teacher Education,
13(2), 377-402.
Kafai, Y. (1996). Learning design by making games: Children’s development of design
strategies in the creation of a complex computational artifact. In Y. Kafai & M. Resnick
(Eds.), Constructionism in practice: Designing, thinking and learning in a digital
world (pp. 71-96). Mahwah, NJ: Lawrence Erlbaum Associates.
Kafai, Y. B., & Resnick, M. (1996). Constructionism in practice: Designing, thinking, and
learning in a digital world. Hillsdale, NJ: Lawrence Erlbaum Associates.
Keating, T., & Evans, E. (2001, April). Three computers in the back of the classroom:
Pre-service teachers’ conceptions of technology integration. Paper presented at the
annual meeting of the American Educational Research Association, Seattle.
Koehler, M. J., & Mishra, P. (2005). Teachers learning technology by design. Journal
of Computing in Teacher Education, 21(3), 94-102.
Koehler, M. J., Mishra, P., Hershey, K., & Peruski, L. (2004). With a little help from your
students: A new model for faculty development and online course design. Journal
of Technology and Teacher Education, 12(1), 25-55.
Koehler, M. J., Mishra, P., & Yahya, K. (2004). Content, pedagogy, and technology:
Testing a model of technology integration. Paper presented at the annual meeting of
the American Educational Research Association, April 2004, San Diego.
Koehler, M. J., Mishra, P., Yahya, K., & Yadav, A. (2004). Successful teaching with
technology: The complex interplay of content, pedagogy, and technology. Proceedings
from the Annual Meeting of the Society for Information Technology & Teacher
Education, Atlanta, GA. Charlottesville, VA: Association for the Advancement of
Computing in Education.
Lehrer, R. (1993). Authors of knowledge: Patterns of hypermedia design. In S. Lajoie &
S. J. Derry (Eds.), Computers as cognitive tools (pp. 197-227). Hillsdale, NJ: Erlbaum.
Lundeberg, M. A., Bergland, M., Klyczek, K., & Hoffman, D. (2003). Using action
research to develop preservice teachers’ beliefs, knowledge and confidence about
technology. Journal of Interactive Online Learning, 1(4). Retrieved July 15, 2004,
Margerum-Leys, J., & Marx, R. (2002). Teacher knowledge of educational technology:
A study of student teacher/mentor teacher pairs. Journal of Educational Computing
Research, 26(4), 427-462.
Marx, R. W., Blumenfeld, P. C., Krajcik, J. S., & Soloway, E. (1997). Enacting project-
based science: Challenges for practice and policy. Elementary School Journal, 97(4),
Mishra, P., & Koehler, M. J. (2003). Not “what” but “how”: Becoming design-wise
about educational technology. In Y. Zhao (Ed.), What teacher should know about
technology: Perspectives and practices (pp. 99-122). Greenwich, CT: Information
Age Publishing.
Mishra, P., & Koehler, M. J. (In press a). Introduction. To appear in P. Mishra, M. J.
Koehler, & Y. Zhao (Eds.), Faculty development by design: Integrating technology
in higher education. Greenwich, CT: Information Age Publishing.
Mishra, P., & Koehler, M. J. (In press b). Designing learning from day one: A first
day activity to foster design thinking about educational technology. Teachers College Record.
Mishra, P., & Koehler, M. J. (In press c). Technological pedagogical content knowledge: A
framework for integrating technology in teacher knowledge. Teachers College Record.
Mishra, P., Spiro, R., & Feltovich, P. (1996). Technology, representation, and cognition:
The prefiguring of knowledge in cognitive flexibility hypertexts. In H. van
Oostendorp & A. de Mul (Eds.), Cognitive aspects of electronic text processing
(pp. 287-305). Norwood, NJ: Ablex.
Papert, S. (1991). Situating constructionism. In I. Harel & S. Papert (Eds.), Constructionism
(pp. 1-11). Norwood, NJ: Ablex.
Pea, R. D. (1993). Practices of distributed intelligence and designs for education.
In G. Salomon (Ed.). Distributed cognitions (pp. 47-87). New York: Cambridge
University Press.
Perkins, D. N. (1986). Knowledge as design. Hillsdale, NJ: Lawrence Erlbaum Associates.
Peruski, L., & Mishra, P. (2004). Webs of activity in online course design and teaching.
ALT-J: Research in Learning Technology, 12(1), 37-49.
Roschelle, J., & Clancey, W. J. (1992). Learning as social and neural. Educational
Psychologist, 27(4), 435-453.
Rosenblatt, L. M. (1978). The reader, the text, the poem: The transactional theory of the
literary work. Carbondale, IL: Southern Illinois University Press.
Roth, W.-M. (1995). Authentic school science. The Netherlands: Kluwer Academic Publishers.
Roup, R., Gal, S., Drayton, B., & Pfister, M. (Eds.). (1993). LabNet: Toward a community
of practice. New Jersey: Lawrence Erlbaum Associates, Inc.
Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4-14.
Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard
Educational Review, 57(1), 1-22.
Wise, A. (2000). The future of the teaching profession. In The American Association
of Colleges for Teacher Education (Eds.), Log on or lose out: Technology in 21st
century teacher education (pp. 219-224). Washington, D.C.: AACTE Publications.
Young, M. (1993). Instructional design for situated learning. Educational Technology
Research and Development, 41(1), 43-58.
Zhao, Y., & Conway, P. (2001). What’s in and what’s out?: An analysis of state technology
plans. Teachers College Record. Retrieved July 15, 2004, from
Zhao, Y. (Ed.). (2003). What teachers should know about technology: Perspectives and
practices. Greenwich, CT: Information Age Publishing.
Direct reprint requests to:
Matthew J. Koehler
Learning, Technology and Culture
Michigan State University
509 Erikson Hall
East Lansing, MI 48824
... TPCK was first introduced as "Good teaching is not simply adding technology to the existing teaching and content domain. Rather, the introduction of technology causes the representation of new concepts and requires developing a sensitivity to the dynamic, transactional relationship between all three components suggested by the TPCK framework" (Koehler, M.J. & Mishra, P. 2005). As stated by Roblyer (2006), what is expected from the teacher is not how often she/he uses the technology, but that she/he is able use it by choosing the technology suitable for the educational content and pedagogical approach. ...
Full-text available
In research on the use of technology in education, it is emphasized that it is an indispensable requirement of our age, therefore, educators should be developed in terms of techno-pedagogy. In this study, total 1735 school administrators and teachers' individual innovation qualifications and techno-pedagogical education competences were investigated, who are working at primary, middle school and preschool levels in Turkey's province Samsun. Within the scope of the research, personal information inventory, Technopedagogical Education Competence (TPACK-deep) Scale and Individual Innovativeness Scale were used. In the analysis of the data, the SPSS package program was used. According to the results of the research, it was seen that the techno-pedagogical education proficiency score of the participants was 4.01 which is in the advanced level. The average score that teachers got from the Individual Innovativeness scale was found to be 70.60 (category in the pioneer). According to the results of the correlation analysis, it was determined that both individual innovativeness and techno-pedagogical education competences levels have a significant correlation relationship with each other at the level of 0.01.
... Koehler and Mishra [12] introduced the integration of technology in teaching and learning by adding technological knowledge as its essential component. After adding technological knowledge to the two previous components (mastery of course content knowledge and pedagogic knowledge), Mishra and Koehler [14] developed and introduced three essential components regarding teacher professionalism, namely CK, PK, and technological knowledge (TK). ...
Full-text available
This study investigated technological pedagogical and content knowledge (TPACK) practices during the current COVID-19 pandemic and examined factors influencing students' perceptions of effective online learning. This study gathered the primary data from 90 university students from four study programs via the online questionnaire. Using an analysis of variance (ANOVA) statistical tool for data analysis, this study found that students' perceptions of their teachers' teaching performance in the fully online programs are not significantly different across four study programs. This finding implies that regardless of their study programs, the students had expected that teachers should pay attention to some vital aspects in online learning: using the same learning management system (LMS) for all study programs, and preparing designing practical online modules, taking care of the organization of teaching inputs that promote students' critical thinking, delivering various teaching inputs and methods, intensifying teachers' presence in monitoring students' learning progress, motivating students to learn, and promoting teacher-student mutual respect through effective communication. This study also revealed that teachers play a pivotal role in achieving effective online learning during the pandemic.
... In terms of intervention measures, method has the best effect, followed by tool, while the important technical intervention has the most effect. Pure technical intervention is not as effective as a systematic method, which also shows that the learning and application of TPACK should emphasize the technology and information environment. of "teaching and learning theories" and methods [98]. Technology interventions may be used with other teaching strategies to facilitate teachers' TPACK development [99]. ...
Full-text available
Teacher education is an important strategy for developing teachers’ technological pedagogical content knowledge (TPACK). Many schools in the world have incorporated the training into teacher education plans. However, there has been controversy in academic circles concerning the effects of teacher education intervention in promoting the development of teacher TPACK. Therefore, this study used a meta-analysis approach to review the published literature on teacher education programs to determine the impact on TPACK. The results showed that teacher education intervention positively affected TPACK (d = 0.839, p < 0.0001). Besides cultural background, experimental participants, types, sample types, intervention durations, differences in measurement methods, intervention types, and learning environments are the reasons for the differences in the effects of the interventions. The research design using random experiments had a significant positive effect on the size, which was significantly higher than that of the quasi-experiment. The longer the duration of teaching intervention, the stronger the improvement effect of teachers’ TPACK. There are significant differences in improving TPACK between teaching interventions, and the effect is more obvious. Teacher education intervention has a greater and slightly smaller impact on theoretical and practical knowledge. However, cultural background, experimental participant, sample type, and learning environment have no significant effect on teacher education intervention.
... It's worth noting that 90%of the research sample lost contact with their students throughout the quarantine period, that is about 4 months in the academic school year 2020/2021 and more than one month in the academic school year 2021/2022 because they are not used working in contents design teams to solve problems of practice like remote teaching during crisis Koehler and Mishra (2005).Too teachers communications with the pedagogical team (the inspector , the director...) were limited to the phone for subjective reasons such as the inability to the use of e-mail, for example, and other technological reasons related to poor Internet flow. ...
Full-text available
Digital education is one of the most important tools that should be employed in the normal situation in all societies because of its role in facilitating the learning environment and the education process. Currently, with the spread of the Corona virus epidemic, all educational institutions, including schools, universities and scientific institutes, have been closed in an emergency and sudden manner which has made it necessary for educational institutions to search for new and innovative ways and methods for the continuity of the educational process. The best way to achieve this is to resort distance education which many universities and scientific institutes were able to use and became required by this emergency situation. But were these institutions are have been prepared for this sudden use of digital education tools? Did they perform their role? Are the teachers and students ready for this major and qualitative change in education?This article offers an analysis of the use of information and communication technology in primary school in Tunisia between theory and practice. The research showed that the teachers in our sample are not attracted by digital culture in their teaching practices because they have not acquired techno-pédagogical skills that allow them to script and evaluate learning via digital technology and they don’t receive a real support from pédagogical team.
... This is the reason that since the early 2000s, researchers have noticed a deficiency of relevant theoretical framework in the age of technology. Therefore, various researchers (mainly Koehler & Mishra, 2005;Mishara & Koehler, 2006) designed an experiment to understand teachers' professional development via integrating ICT in their teaching. ...
Background: A study to reveal existing pedagogical or technological pedagogical content knowledge frameworks is crucial to inform and their effectiveness in teaching mathematics. This review study intended to explore the trends of the pedagogical content knowledge (PCK) framework, how it has changed over time until the most recent version of technological and pedagogical content knowledge (TPACK) was developed, and its effectiveness in teaching mathematics. Methods: We initially downloaded 273 articles from the first 30 Google Scholar pages and analyzed 229 journal articles. We got 24 frameworks from 64 journal articles since Shulman’s first model in 1986. About 52 out of 229 were mathematics studies. Among these studies, we found that 18 studies have extensively investigated the use of identified frameworks. Results: The frameworks were presented and descriptively discussed in chronological order. The empirical studies that compared the role of pedagogical and technological pedagogical content knowledge models among classrooms with teachers who possess and do not possess such skills were demonstrated. Conclusions: The gap in empirical studies was identified, and further studies about the intervention of PCK and TPACK models were suggested to gain more insight into the mathematics classroom.
... Solutions often require the use of expert knowledge to author solutions which will achieve a good outcome. Therefore the mere use of technology in the classroom most often results in no change in outcomes [12]. Instructor technology integration involves much more than software and hardware fluency -it requires a grasp of interconnections between students, technologies, and pedagogical practices. ...
... Today's teachers are expected to be a "learner" and "designer" with some skills such as having a good command of instructional design for digital learning environments, exploring and applying pedagogical approaches via technology, being able to plan individualized learning environments, participating in learning networks, and ensuring efficiency of teaching and learning by following up-to-date research (ISTE, 2020). In fact, Koehler and Mishra brought about the TPACK framework in emergence with the professional development programs carried out with the learning-bydoing approach (Koehler & Mishra, 2005;Mishra & Koehler, 2006). Also, studies show that teachers' participation in technology integration activities such as courses, and workshops increase their TPACK and affects their technology-supported lesson designs positively (Abbitt, 2011;Kafyulilo et al., 2015;Koh, 2019;Tai, 2013). ...
A three-phase exploratory sequential mixed-method study was conducted to propose and test a model showing the interrelationships among the contextual factors influencing science teachers’ Technological Pedagogical Content Knowledge (TPACK). Although developing teachers’ TPACK is critical for technology integration in education, the contextual factors influencing TPACK have mostly been neglected, and this study aimed to fill this gap in the literature. The first phase of the study aimed to determine the contextual factors by interviewing science teachers and educational technology experts. The findings revealed nine common factors: student influence, teachers’ beliefs and attitudes, technological infrastructure, administrative support, technical support, colleague interaction, lack of time, professional development, and educational technology experience. In the second phase, a path model was hypothesized based on the qualitative results and related literature. In the third phase, a valid and reliable instrument, the Contextual Factors Scale, was developed to measure these contextual factors. A questionnaire comprising this scale and TPACK was then administered to 348 science teachers. Showing complex interrelationships among contextual factors, the model explained 45% of the variance in science teachers’ TPACK, with particular importance placed on professional development, teachers’ beliefs and attitudes, administrative support, and student influence. Taking a holistic perspective, this study provides a valuable model for guiding decision-makers, researchers, and practitioners in improving teachers’ TPACK and technology integration in schools. Given the complex interrelationships in the model, simultaneous strategies addressing each factor should be applied to improve teachers’ TPACK and technology-related practices in schools.
The purpose of this chapter is to assess the extent of technological pedagogical content knowledge of preservice teachers. Using the TPACK survey developed by Schmidt et al., participating preservice teachers self-assessed their TPACK knowledge. Participating preservice teachers were part of the first cohort required to take an instructional technology course in the undergraduate initial certification program due to revised program requirements. Results indicated that more than 80% of the survey participants agreed that they “can teach lessons that appropriately combine content area, technologies, and teaching approaches.” To further understand these self-reported data, the TPACK constructs discussed in this exploration focus on three of the seven domains: TCK, TPK, and TPACK. Actionable considerations for preservice teachers are proposed, including a professional development learning module to build teachers' TPACK knowledge.
School districts, administrators, and teachers faced an unprecedented challenge as schools closed due to COVID-19. School-based agricultural education (SBAE) teachers were no exception, facing the unique challenge of teaching technical content through virtual platforms. The purpose of this study was to explore SBAE teacher emotional exhaustion amid the shared trauma of the COVID-19 pandemic. The study examined emotional exhaustion, teacher self-efficacy, technological pedagogical content knowledge (TPACK), and perceived supportive actions from administration among Ohio SBAE teachers. Paired-samples t-tests indicated increases in emotional exhaustion, with a medium effect size, in participants from spring to autumn of 2020. A serial multiple mediator model indicated that supportive administrator actions significantly predicted TPACK and emotional exhaustion, while TPACK significantly predicted teacher self-efficacy at the time of measurement. Within the mediation model, no significant indirect effects were found. Further research should examine the factors behind administrative support for teachers to help mitigate emotional exhaustion in the teacher population.
In this study, we followed three faculty members' experiences designing and teaching online courses for the first time. To complete the activity, the faculty members had to work collaboratively with others across the university. Activity theory provided a framework within which to study faculty members' collaborative activities with members of different activity systems that had different goals, tools, divisions of labor, and accountabilities. In concordance with activity theory, such differences led to contradictions, disturbances, and transformations in thinking and work activities. The results of the study have implications for individuals and systems undertaking technology integration in teaching.
Project-based learning is a comprehensive approach to classroom teaching and learning that is designed to engage students in investigation of authentic problems. In this article, we present an argument for why projects have the potential to help people learn; indicate factors in project design that affect motivation and thought; examine difficulties that students and teachers may encounter with projects; and describe how technology can support students and teachers as they work on projects, so that motivation and thought are sustained.
This study had two purposes. The first was to explore the construct of teacher knowledge of educational technology through the lens of three components of Shulman's model of teachers' knowledge: content knowledge, pedagogical knowledge, and pedagogical content knowledge. A second purpose was to investigate the ways in which teacher knowledge is acquired, shared, and used by student teachers and their mentors in the context of the student teaching placement. The literature in educational technology takes, for the most part, a limited view of educational technology knowledge, reporting on teachers' awareness of technological applications and affordances. By using Shulman's model, this study constructed and considered a more comprehensive depiction of teacher knowledge. Teacher knowledge of educational technology as thus depicted was explored as it developed within a particular setting. Data were drawn from a three-month observation and interview period in the spring of 1999. Six participants (three student teachers and three mentor teachers) were observed and interviewed at a middle school in a working-class suburb of a large Midwestern city. From observations of teacher practice, inferences were made about the underlying body of knowledge evidenced by the participants. The perspective of student and mentor teacher participants was gained through a quasi-ethnographic interview process. Observation and interview data were analyzed using a shared coding system, allowing a rich description to be created. Results of the study indicated that employing Shulman's model revealed a body of knowledge derived from and applicable to practice with educational technology. This knowledge could be considered a pedagogical content knowledge of technology, corresponding to Shulman's identification of a particular understanding by teachers of content in service of teaching and learning.
Within the context of the mentor/student teacher pairs, knowledge both acquired in and brought to the setting was shared in a multi-year cycle from student teacher to mentor to subsequent student teacher. The impact on the field includes a broader sense of the nature of knowledge of educational technology, as well as increased attention to the importance of the student teaching placement and the roles of student and mentor teachers within that environment.
Recommendations for reform in science education place a premium on students' understanding of scientific concepts and their ability to identify problems, conduct inquiry, and use information flexibly. They call for an appreciation for how ideas evolve and are validated. In this article we discuss changes in ideas about learning that underpin the reforms. We then describe our experiences with project-based science, a pedagogy that addresses the reform recommendations. Project-based science focuses on student-designed inquiry that is organized by investigations to answer driving questions, includes collaboration among learners and others, the use of new technology, and the creation of authentic artifacts that represent student understanding. Finally, we illustrate the challenges this type of innovation poses for teachers' classroom practice, for professional development, and for policy.