Int. J. Learning Technology, Vol. 6, No. 2, 2011 201
Copyright © 2011 Inderscience Enterprises Ltd.
simSchool: an online dynamic simulator for
enhancing teacher preparation
Rhonda Christensen*, Gerald Knezek and Tandra Tyler-Wood
University of North Texas,
3940 Elm St. #G-150, Denton TX 76207, USA

David Gibson
The Equity Alliance at ASU,
Arizona State University,
Interdisciplinary B353, P.O. Box 876103,
Tempe, AZ 85287-6103, USA
Abstract: A rationale for using a simulated teaching environment to train
pre-service teacher candidates is presented, followed by the key components of
the simSchool dynamic simulator created to accomplish this task. Results of
analyses of two sets of data, for the areas of pedagogical practices and teaching
skills, are used to illustrate that changes in pre-service educators can be
assessed as a direct outcome of activities completed within the simulated
environment. Major outcomes to date indicate that teacher candidates gain a
sense of instructional self-efficacy (confidence in their competence) more
rapidly using the simulator, compared to traditional teacher preparation classes
and related activities. This outcome is true for pre-service candidates working
with simulated students spanning the normal range of personality attributes and
sensory abilities, as well as pre-service teacher candidates working with
simulated students with disabilities.
Keywords: games and simulations; pre-service teachers; learning theory;
special populations; teaching; learning; teacher preparation; professional
development; modelling; self-efficacy.
Reference to this paper should be made as follows: Christensen, R.,
Knezek, G., Tyler-Wood, T. and Gibson, D. (2011) ‘simSchool: an online
dynamic simulator for enhancing teacher preparation’, Int. J. Learning
Technology, Vol. 6, No. 2, pp.201–220.
Biographical notes: Rhonda Christensen is Research Scientist in the College
of Information at the University of North Texas and Associate Director for the
Integration of Technology into Teaching and Learning. She was Principal
Investigator and Project Director of the US Fund for the Improvement of
Post-Secondary Education simMentoring project, which is the focus of this
paper. She has taught pre-service and in-service teachers for ten years. She
received her BS in Curriculum and Instruction from Texas A&M University,
MEd in Computer Education from University of North Texas (UNT) and PhD
in Information Science from UNT.
Gerald Knezek is Regents Professor of Learning Technologies at the University
of North Texas and Director of the Institute for the Integration of Technology
into Teaching & Learning (IITTL) at UNT. He was the Co-principal
Investigator of the US Fund for the Improvement of Post-Secondary Education
simMentoring project. He led the process of developing instruments for
measuring learning within the simulated teaching environment. He received his
BA in Mathematics and the Social Sciences from Dartmouth College, and his
MEd and PhD in Educational Psychology from the University of Hawaii.
Tandra Tyler-Wood is an Associate Professor in the Department of Learning
Technologies at the University of North Texas. She serves on the executive
board for the Council for Learning Disabilities and as the Chair of the Special
Education Division of the Society for Information Technology and Teacher
Education. Her areas of interest include using technology to facilitate learning
for special populations and assessing learning and attitude changes through the
use of technology. She received her Bachelor in Special Education and
Psychology from Converse College. She obtained her MEd and PhD in Special
Education from the University of North Carolina at Chapel Hill.
David Gibson is an Associate Research Professor in the School of Social
Transformation at Arizona State University and leads evaluation research for
the Equity Alliance. He is the Executive Director of The Global Challenge
Award, a team and project-based learning programme engaging teens in
solving global problems using science, technology, engineering and
mathematics. His research focuses on complex systems analysis and modelling
of education, web applications and the future of learning, and the use of
technology to personalise education via cognitive modelling, design and
implementation. He is the creator of simSchool, a classroom flight simulator
for training teachers, and of eFolio, a performance-based assessment system.
1 Introduction

Good teachers constantly negotiate a balance between technology, pedagogy, and content
in ways that are appropriate to the specific parameters of an ever-changing educational
context (Bull et al., 2007). A major challenge facing beginning teachers is how to juggle
teaching and learning parameters in an often-overwhelming context of a new classroom,
given a particular mix of the students and the available tools at hand. A four-year project
at a large South Western university in the USA was initiated in November 2006 to
address beginning teacher challenges. An initiative to include special populations was
added in October 2007. The main goal was to improve the capacity for resilience among
pre-service teachers, thereby enhancing teacher retention once candidates enter the
classroom. The purposes of the current paper are: to present the rationale for using a
simulated teaching environment; to describe the key components of the simulator that
have evolved; to report major project findings to date and convey conclusions regarding
what pre-service teachers learn; and to review what can be assessed regarding pre-service
teacher learning – when teacher candidates are working within a simulated teaching
environment.
2 Conceptual rationale
The use of digital games and simulations to help prepare teachers is inspired by the
dramatic rise and growing appreciation of the potential for games and simulation-based
learning to help prepare future teachers (Aldrich, 2004; Foreman et al., 2004; Prensky,
2001). Research and development of teacher education games and simulations is just
beginning. The new field has the twin goals of producing better teachers and building
operational models of physical, emotional, cognitive, social and organisational theories
involved in teaching and learning (Gibson, 2007, 2008, 2009). These considerations are
situated in the broader arena of the role of technology in field experiences for pre-service
teachers, since the goal of simulation as construed here is to provide learning and training
opportunities that can transfer to the real classroom and if possible, improve teacher
preparation. Specific benefits of technology use in field experiences are identified in a
recent review of literature (Hixon and So, 2009) including:
a exposure to various teaching/learning environments
b creation of shared experiences
c promoting reflectivity
d preparing students cognitively.
Our research illustrates some of these benefits.
In addition, the Hixon and So (2009) review identified three types of experiences
categorised according to the degree to which the experiences are situated in reality. In
Type 1 experiences, technology tools are used to facilitate supervision, reflection, and
communication. Type 2 experiences provide vicarious experiences by remotely observing
teachers and students in real classrooms. Type 3, which includes simSchool, utilises
simulated environments. Zibit and Gibson (2005) call this type of field experience a
‘virtual practicum’ based on simulated apprenticeship models.
There are many challenges and issues that need to be addressed as we integrate games
and simulations into teacher training. Here, we will concentrate only on the broadly
configured conceptual frameworks for cognition, assessment of learning, and teaching
actions that become encoded in computer languages for the purpose of controlling a game
or simulation (Gibson et al., 2007).
Encoding operational definitions of teaching and learning into computer languages
provides a new way for educators to engage in a conversation about the science of
teaching and learning. With appropriate and effective models, reproducible contexts can
be presented for problem-solving by future teachers. Classroom contexts with complex
relationships can model many of the key aspects of the evolving dynamics of individual
learners interacting with tasks, the teacher, and other students. Hypothesised internal
dynamics of emotional and motivational variables involved in learning can be assessed,
tested, and adjusted. As these applications indicate, the potential for digital game and
simulation-based teacher education is just beginning to be explored and understood.
2.1 Modelling paradigm
simSchool was conceptualised from initial design stages to operate as an ‘on-demand,
in-flight’ practice arena to stimulate and shape the dialogue between novice and expert
teachers – the latter of whom traditionally serve as the novices’ mentors. The solution
integrates well with existing best practices and can be transferred to many additional
settings in both pre-service and in-service education.
simSchool promotes pedagogical expertise by re-creating the complexities of
classroom decisions through mathematical representations of how people learn and what
teachers do when teaching.
The most fully developed tier of the simSchool cognitive model comprises the five
components of the student’s emotional make-up, built on the OCEAN or Big Five model
of personality (McCrae and Costa, 1996; Srivastava, 2006):
• openness to experience – appreciation for art, emotion, adventure, unusual ideas;
imagination and curiosity
• conscientiousness – a tendency to show self-discipline, act dutifully, and aim for
achievement
• extroversion – energy, urgency, and the tendency to seek stimulation and the
company of others
• agreeableness – a tendency to be compassionate and cooperative rather than
suspicious and antagonistic towards others
• neuroticism – a tendency to easily experience unpleasant emotions such as anger,
anxiety, depression, or vulnerability.
The academic components are currently represented by a single variable representing
overall academic performance. In the planning stages are multiple sub-elements
within any selected domain of knowledge; for example if mathematics, then the
sub-elements engaged might be computation, problem solving, and communication. The
physical variables include auditory, visual and kinaesthetic awareness. All student
variables change during the simulation, so that learning, making no academic or
behavioural progress, or even ‘going downhill’ in academics or behaviour can all occur.
simSchool also contains a verbal interaction model built on the ‘interpersonal
circumplex theory’ (Kiesler, 1983) which proposes that verbal interactions involve both
power and affiliation negotiations. The power component ranges from dominant to
submissive and the affiliation component ranges from friendly to distant or hostile. The
interactions of the variables give rise to 16 pairs of opposites such as ‘sociable to aloof’
that are used to model dispositions in teacher-student interactions.
The model’s dynamic equations combine variables in different ways depending on
the context and intention of the user, made evident through the range of available options
for action. This gives rise to highly differentiated behaviours in the students that are not
strictly reproducible from simulation to simulation, but which follow heuristics that can
be learned – such as the need to individualise instruction for some students in order to
have all students succeed.
The modelling paradigm in simSchool works by computing a time series evolution of
the classroom as a system. This modelling allows novel dynamics to evolve moment by
moment as the user, a teacher candidate, makes decisions. simSchool promotes thinking
on one’s feet because class time waits for no one. The experimental logic model
framework (Figure 1) is also relevant in simSchool but instead of each state of the system
waiting upon a user’s action as in a customer satisfaction degree (CSD) model, in
simSchool, the classroom evolves whether or not the teacher takes actions. The dynamic
modelling approach uses initial conditions, attractors, and multiple layers of dynamic
interactions to simulate learning by individuals in a classroom.
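The flavour of this attractor-driven time series can be sketched in a few lines. This is our own minimal illustration, not simSchool's actual engine: the function names, the update rate, and the single-dimension state are all assumptions made for clarity.

```python
def step_toward(value, attractor, rate=0.05):
    """Move one state variable a small fraction of the way toward its attractor."""
    return value + rate * (attractor - value)

def simulate(initial, attractor, ticks=60):
    """Evolve a single student dimension over a class period, tick by tick,
    whether or not the teacher intervenes (the classroom 'keeps moving')."""
    history = [initial]
    for _ in range(ticks):
        history.append(step_toward(history[-1], attractor))
    return history

# A struggling student (-0.5) drawn toward a task's goal state (0.8):
trace = simulate(initial=-0.5, attractor=0.8)
```

In a full model, each of a student's nine dimensions would be updated this way on every tick, with the attractors set by the current task environment.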
As a result of the dynamic modelling approach, there is a continuous production
of moment-by-moment evidence of what the teacher candidate is attempting to do as he
or she ‘teaches’ the class. A conceptual assessment framework (Mislevy et al.,
2003) guides the analysis of that evidence so that inferences about the growth and
development of teaching skill can be made based on the evidence of ‘game play’ in
simSchool (Figure 1). The task model, like the student model, is presented as a
‘landscape’ of factors that change over time. The evidence model comprises the
actions that each pre-service teacher takes while playing simSchool, together with the
analysis we bring to understanding the teacher’s intentions and their actions’ impacts on
the simulated students.
Figure 1 Conceptual assessment framework for analysis of user actions
The cognitive model of the student in simSchool is built around a three-tiered model of
the physical, emotional and academic performance variables of learning as explained
above. No other hidden variables exist, so all of the effects of the user’s decisions are
directly attributable to interaction effects, as opposed to randomly generated settings as in
the CSD model. The downside of this approach is the extra cost of forming an analysis,
because for any particular resulting end condition at any point in time, all of the previous
actions have had some causative impact. Fortunately, a simple visual interface can directly
present the results of the pre-service teacher’s actions on the simulated students for
reflection and summation (Figure 2), where a quantitative analysis is sometimes less
tractable.
Figure 2 Post-game report of simSchool dynamics over the course of one simulation (see online
version for colours)
2.2 How simSchool works
The simSchool programme is driven by an artificial intelligence (AI) engine. The engine
uses a simple form of knowledge representation and table lookup in which learning is
taken for granted (e.g., all learners try to learn), but learning is also impacted by barriers
(e.g., cognitive load issues including both external and internal reactions to changing
situations). Complexity in the system arises as the teacher provides occasional input in
the form of changes to the task environment (e.g., the teacher’s choice of lesson activity
and whether and what to say to the students). Each simulated student responds differently
to every input, based on their current but malleable psychological and cognitive profile
and how that relates to the constant settings of the task environment.
Each simStudent has an initial individual personality profile with settings on one
cognitive dimension (expected academic performance) and five psychological
dimensions, divided into a ‘power’ dimension with three variables:
• openness to learning
• conscientiousness toward tasks
• extroversion or introversion
and an ‘affiliation’ dimension with two variables:
• agreeableness or antagonism
• neuroticism or emotional stability.
On each of these six dimensions, the settings range from very negative (–1) to very positive
(1) with about 20 different intermediate points. Thus the engine can define 20^6, or
about 64 million, distinct students. Each student can also have settings on three additional
physical-perceptual variables: vision, hearing and kinaesthesia. These variables range
from 0 to 1, again with about 20 niches per variable, bringing the total number of students
that can be represented to 20^9, or about 512 billion quantitatively different students.
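The arithmetic behind these profile-space sizes is straightforward and can be checked directly (the constant names below are ours):

```python
# Size of the simStudent profile space described above.
LEVELS = 20      # intermediate points per dimension
PSYCH_COG = 6    # 1 cognitive + 5 psychological dimensions
PHYSICAL = 3     # vision, hearing, kinaesthesia

psych_space = LEVELS ** PSYCH_COG               # 20^6 = 64 million
full_space = LEVELS ** (PSYCH_COG + PHYSICAL)   # 20^9 = 512 billion
```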
Each student’s profile settings are invisible to the player, and determine exactly
how the student learns. Correlated with five clusters of settings around the points
(–1, –.5, 0, .5, 1) is a set of narrative hints for each dimension concerning the student as a
learner. This can be read by clicking on a computer on the desktop in the classroom.
There are five possible points for each of the nine dimensions that translate into
characteristics. For example, on the academic dimension the student might be one of the
following:
• very academically capable (.8 to 1)
• moderately capable (.5 to .7)
• expected to be on grade level (0 to .4)
• has a few difficulties (–.5 to 0)
• has many difficulties (–1 to –.4).
So with 5^9 narrative variations, simSchool can narratively describe over 1.9 million
students. These narratives are assembled ‘on the fly’, as needed, from the database, to
form a readable student record that the teacher can access before and during class.
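The mapping from a continuous setting to one of the five narrative bands can be sketched as follows. The band labels follow the academic example above; the exact boundary handling (closed vs. open intervals) is our assumption, since the published ranges overlap slightly:

```python
def academic_band(setting):
    """Map an academic setting in [-1, 1] to one of five narrative bands
    (ranges follow the text; boundary handling is our assumption)."""
    if setting >= 0.8:
        return "very academically capable"
    if setting >= 0.5:
        return "moderately capable"
    if setting >= 0.0:
        return "expected to be on grade level"
    if setting >= -0.5:
        return "has a few difficulties"
    return "has many difficulties"

# Five bands on each of nine dimensions gives the stated count:
narrative_variations = 5 ** 9  # 1,953,125 distinct descriptions
```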
Each task environment (a lesson activity, being talked to, or both) is characterised by
settings on the same nine dimensions as above, and interacts with the student profile to
produce classroom and academic behaviour. The dimensions set performance goals
(e.g., they act like ‘attractors’) for each student’s current characteristics. For example, the
task has a setting for intellectual openness that acts on the student’s setting for intellectual
openness.
Task environments exert performance requirements independently on each student
and each dimension of each student, causing some to learn and others to be stymied or
get bored. As each task requirement dimension is compared to each student profile
setting, the distance to the goal of the task may or may not be within the learner’s zone of
proximal development. In addition, if the task environment presents too many cognitive
barriers, the student’s progress may be very slow or never get off the ground. If the task
is too low, then the learner gets bored. The difficulty is that, with so many dimensions for
each task, parts of some tasks are good for only parts of some student personalities.
The programme thus has complex underpinnings.
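The gating logic described here, too far means stymied, too close means bored, in between means learning, can be sketched per dimension. The threshold values and rate below are invented for illustration; simSchool's actual equations are not published in this form:

```python
def progress(student, task, zpd=0.4, boredom=0.05, rate=0.1):
    """Illustrative per-dimension gating: the task setting acts as a goal.
    Thresholds are hypothetical, chosen only to show the three regimes."""
    gap = task - student
    if abs(gap) > zpd:
        return 0.0        # outside the zone of proximal development: stymied
    if abs(gap) < boredom:
        return 0.0        # task demands too little: learner gets bored
    return rate * gap     # otherwise, movement toward the task's goal

# A task slightly above the student's level produces learning;
# a much harder task, or a trivially easy one, produces none.
delta = progress(student=0.2, task=0.5)
```

Because each of a task's nine dimensions is gated independently against each student's profile, parts of one task can work for some students while stalling or boring others.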
Teacher talk is organised into two areas – behaviours and academics – and
then further subdivided into questions, observations and assertions. For example, the
teacher might want to speak to a student about how they are doing on a task, and
would select ‘academic – question’. Then the teacher selects from a range of 16 attitude
options with which to approach the student. The attitudes come from a theory of personal
interaction, which says that we negotiate between ‘power’ and ‘affiliation’ in our
interactions. Translating power and affiliation attitudes into the classroom, when
the teacher uses power statements, e.g., a dominant statement such as “evaluate the
question first!” it is comparable to a teacher-centred classroom. A more submissive
question, e.g., “can you describe your thinking?” is comparable to a student-centred
classroom. Choosing to talk to a student about behaviour slightly favours the affiliation
dimension, and choosing to talk about academic matters slightly favours the power
dimension.
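A teacher utterance in this scheme can be thought of as a topic and type plus a position on the power and affiliation axes. The encoding below is our own sketch for illustration; the field names are not simSchool's actual API:

```python
# Hypothetical encoding of a teacher utterance (names are ours, not simSchool's).
TOPICS = ("academic", "behaviour")
TYPES = ("question", "observation", "assertion")

def utterance(topic, kind, power, affiliation):
    """Bundle a teacher statement with its circumplex attitude.
    power and affiliation lie in [-1, 1]: dominant vs. submissive,
    friendly vs. distant/hostile."""
    assert topic in TOPICS and kind in TYPES
    assert -1 <= power <= 1 and -1 <= affiliation <= 1
    return {"topic": topic, "type": kind,
            "power": power, "affiliation": affiliation}

# A submissive, friendly academic question ("can you describe your thinking?"):
u = utterance("academic", "question", power=-0.6, affiliation=0.7)
```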
2.3 Enhancements to simSchool
To address new audiences, the simulator’s range of modelled components was expanded
to include ‘create a student’ and ‘create a task’. Feedback is provided to users
(pre-service teachers) regarding student progress during the simulation. The simSchool
screen allows a pre-service teacher to use preset (system generated) simStudents or
custom generated simStudents when preparing a classroom to teach (see Table 1).
Table 1 ‘Create a student’ menu describing each variable that can be manipulated by the user
to create a unique student with a cognitive or physical disability

‘Create a custom student’ variable descriptions:
• Academic: below grade level to above grade level
• Extroversion: pays attention to private thoughts and feelings, or pays attention to
things going on around oneself
• Agreeableness: works alone or avoids others, or works with and depends on others
• Persistence: works creatively, or works with persistence
• Emotion: shows unrestrained emotional response or is highly sensitive, or tempers
emotion
• Intellect: solves well-defined problems or likes to do repetitive tasks, or solves
ill-defined problems or likes to change approaches
• Visual: complete absence of vision to full use of vision
• Auditory: complete absence of hearing to full use of hearing
• Kinaesthetic: complete absence of movement to full use of movement
Both ‘create a student’ and ‘create a task’ options were launched to allow users to have
more control over the characteristics of students as well as the types of teaching
activities. These functions allow the users to create a student based on selected attributes
that change how the simulated student reacts to given tasks and comments from the
‘teacher’. Also users may create tasks of their own to assign to their simStudents. Both
the created students and created tasks are saved in a simulated environment to be used in
current or future simulations. Pre-service teachers who are currently working with
students in a real classroom have created simStudents who mirrored attributes of actual
students in those classrooms.
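A custom simStudent of the kind described here can be pictured as a simple record whose fields mirror the Table 1 variables. This is an illustrative sketch only; the real simSchool data model and field names may differ:

```python
# Illustrative 'create a student' profile as a plain record.
# Field names mirror Table 1; the actual simSchool schema is an assumption here.
def create_student(name, academic=0.0, extroversion=0.0, agreeableness=0.0,
                   persistence=0.0, emotion=0.0, intellect=0.0,
                   visual=1.0, auditory=1.0, kinaesthetic=1.0):
    """Cognitive/personality settings lie in [-1, 1]; physical-perceptual
    settings lie in [0, 1], with 1.0 meaning full use of the sense."""
    return {"name": name, "academic": academic, "extroversion": extroversion,
            "agreeableness": agreeableness, "persistence": persistence,
            "emotion": emotion, "intellect": intellect,
            "visual": visual, "auditory": auditory, "kinaesthetic": kinaesthetic}

# A student modelled with a visual impairment and below-grade-level academics,
# mirroring a pupil a candidate might have observed in a real classroom:
sam = create_student("Sam", academic=-0.6, visual=0.2)
```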
2.4 Instrumentation for assessment of outcomes
One difficulty in measuring the effectiveness of using a simulator for pre-service teacher
preparation is the long lag time between pre-service teacher preparation, induction year
activities and assessment of retention. It takes too many years to produce an authentic
assessment of whether or not the simulator worked. Because of this difficulty the authors
began exploring alternative ways to assess learning within the simulated environment.
Self-report instruments were developed for the purpose of assessing learning that
occurs within the simSchool environment. Three scales were developed to measure
pre-service students’ development when using simSchool: instructional self-efficacy
α = .77 (five items); learning locus of control α = .68 (five items); and teaching skill
α = .95 (15 items). The development of these instruments, including reliability and
validation measures, is described elsewhere (Riedel, 2000; Knezek and Christensen,
2009). The instruments can be downloaded from http://www.iittl.unt.edu/.
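The reliabilities quoted above are Cronbach's alpha values. For readers unfamiliar with the statistic, the standard formula can be computed as below; this is the textbook definition, not necessarily the exact software the authors used:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale, given one list of scores per item
    (all lists the same length, one entry per respondent).
    alpha = (k/(k-1)) * (1 - sum(item variances) / variance of total score)"""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly consistent items yield alpha = 1; the .77, .68 and .95 reported above indicate acceptable to excellent internal consistency for the three scales.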
3 Outcomes to date
3.1 Study 1: findings from matched treatment and comparison groups
During the spring of 2007, simSchool was introduced to 32 pre-service teacher
candidates in one section of a reading/language arts methods course for professional
development school students. These students were in early childhood – Grade 4 or
Grades 4–8 teacher preparation programmes. Students at this intern stage, which precedes
student teaching, spent two days per week taking courses and two days per week in a
classroom, observing teacher and student activities and assisting the classroom teacher.
Pre-post instruments assessing teaching beliefs, perceived level of teacher preparation,
level of technology proficiency, level of technology integration, and attitudes toward
computers were administered at the beginning and end of the class.
Pre-post data were also gathered from a parallel section of the reading/language arts
methods course (30 students), taught by the same instructor, but not incorporating
simSchool. This group was targeted as the comparison group for the treatment class.
Students in the treatment classroom took part in seven, 90-minute simSchool sessions in
the computer lab (nine contact hours total) with their instructor and a staff trainer from
the project. This activity spanned approximately one half of the 15-week semester.
Each session focused on a specific goal such as getting started in simSchool (session 1)
with ‘Everly’s bad day’, matching instructional tasks to simulated student personalities
and learning styles to improve student learning, initiating teacher dialogue with the
simulated students to assess reactions, and moving from a one student classroom to a five
student classroom as proficiency with working in the simulator improved. Although
sufficient computers were available for each student to run a simulation alone, sessions
quickly evolved to have students working in pairs. Once the university instructor
described and demonstrated the task, pre-service candidates planned in pairs and then
carried out the tasks by having one participant function as the pilot, and the other as a
navigator. A reflective discussion led by the instructor typically followed. Frequently,
pre-service candidates were asked to record their reactions to a session in the class blog
in journal entry style.
3.2 Findings and conclusion for study 1
3.2.1 Treatment classroom
As shown in Table 2, according to the guidelines provided by Cohen (1988) of small
effect = .2, moderate = .5, and large = .8, there were large pre-post gains on two of the
three pedagogical indices for the treatment classroom. Teaching skill (ES = 1.0) and
instructional self-efficacy (ES = .95) exhibited large gains. Learning locus of control,
which appears to have a small-to-moderate negative effect, actually changed from a
stronger agreement that “a teacher is very limited in what he/she can achieve because a
student’s home environment is a large influence on his/her achievement” (for example),
toward the belief that the teacher can make a difference in the child’s life. The overall
image conveyed by changes in the three pedagogical indicators is very positive.
However, it is important to examine changes in the matched comparison group before
drawing conclusions regarding probable causality. Analysis of the comparison group will
be presented in the following section.
Table 2 Treatment classroom using simSchool, reading/language arts methods course

Measurement indices               N   Mean  Std. dev.  Signif.  Cohen’s d
Instructional self-efficacy  Pre  28  4.81  0.40       <.001    0.95
                             Post 23  5.23  0.40
Learning locus of control    Pre  29  3.49  0.79       0.37     –0.25
                             Post 25  3.30  0.78
Teaching skill               Pre  28  4.73  0.56       <.001    1.00
                             Post 23  5.35  0.52
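The Cohen's d values in these tables are standardised mean differences. A sketch of the usual pooled-SD computation is below; note that applying it to the rounded values in the table gives d close to 1.05 for instructional self-efficacy rather than the published .95, so the authors presumably worked from unrounded data or a slightly different pooling choice:

```python
import math

def cohens_d(mean_pre, sd_pre, n_pre, mean_post, sd_post, n_post):
    """Cohen's d using the pooled standard deviation of two groups.
    (The published values may reflect unrounded data or another formula.)"""
    pooled = math.sqrt(((n_pre - 1) * sd_pre ** 2 + (n_post - 1) * sd_post ** 2)
                       / (n_pre + n_post - 2))
    return (mean_post - mean_pre) / pooled

# Instructional self-efficacy row of Table 2, using the rounded values shown:
d = cohens_d(4.81, 0.40, 28, 5.23, 0.40, 23)
```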
3.2.2 Comparison classroom
As shown in Table 3, there was a large pre-post gain (ES = .96) in teaching skill for the
comparison group. The gain in this area was almost identical to that of the treatment
group. There was a small-to-moderate pre-post gain (ES = .40) in instructional self
efficacy for the matched comparison group. This gain was much smaller than the gain
(ES = .95) displayed by the treatment group, and, in fact the gain was sufficiently small
that it could likely have been due to chance (p = .14). There was almost no pre-post
change (ES = .07) in learning locus of control for the comparison group. The learning
locus of control group mean moved slightly in the direction of less belief that the teacher
(rather than home and outside-of-school constraints) could influence the achievement
potential of the student.
The strongest findings from matched treatment versus comparison analyses for
general preparation pre-service educators using simSchool were found in the area of
instructional self efficacy, a kind of resilience against ‘giving up’ when a strategy or
activity attempted by a teacher does not succeed in the classroom. The pre-post gain in
this area for the treatment classroom (pre-post ES = .95) was sufficiently greater than the
gain for the comparison group (pre-post ES = .40). Thus the effect of simSchool can be
said to be educationally meaningful (Bialo and Sivin-Kachala, 1996). Treatment versus
comparison gains are graphically displayed in Figure 3.
Table 3 Comparison group classroom not using simSchool, reading/language arts methods
course spring 2007 (same instructor as treatment classroom)

Measurement indices               N   Mean  Std. dev.  Signif.  Cohen’s d
Instructional self-efficacy  Pre  29  4.88  0.75       0.14     0.40
                             Post 25  5.17  0.67
Learning locus of control    Pre  28  3.20  0.63       0.80     0.07
                             Post 25  3.26  0.95
Teaching skill               Pre  25  4.82  0.59       <.001    0.96
                             Post 22  5.45  0.57
Figure 3 Treatment vs. comparison group pre-post gains in instructional self-efficacy (see online
version for colours)
Viewing these findings collectively, we conclude:
1 The teacher educator leading both treatment and control classes produced almost
equally large gains in self-reported teaching skills for treatment and comparison
groups.
2 The simSchool-centred activities (seven 90-minute sessions) of the treatment class
produced gains in instructional self-efficacy that were roughly twice as large as gains
in the comparison group, when both groups had comparable class time exposure and
duration between pre-and post questionnaires.
3 Using the simulator as a class activity was possibly responsible for learning locus of
control movement by the treatment group in the direction of stronger belief that the
teacher can influence a student’s achievement potential. Replication studies are
needed in this area.
3.3 Study 2: findings from simSchool and research in disabilities education
3.3.1 Special populations and simSchool
The initiative to help current and future teachers learn more about special populations
was added to the initial project goals. The primary purpose of this initiative was to explore the
effectiveness of simSchool for improving pre-service teachers’ scores in teacher
preparation and attitudes toward inclusion of special needs students.
simSchool’s ‘create a student’ feature was used by participants to input academic,
personality, and physical attributes into the system to ‘create’ a student with a disability
modelled after a student found in their textbook readings or a real-life pupil in a
classroom. The participants created their simStudent based on the nine dimensions
available in simSchool, by moving sliders back and forth on a horizontal number line.
Participants ran multiple simulation sessions with the virtual student, making changes in
academic requirements based on prompt system feedback presented in graph form
(Hettler et al., 2008). Teacher candidates with prior experience working with actual K-12
students with disabilities found this activity especially rewarding. The behaviours that the
constructed simStudents exhibited often mirrored those they had seen in the K-12
students the simStudents were designed to emulate. In addition, teacher candidates felt free
to try strategies they believed to be poor choices with simulated students, so they could
analyse the resulting behavioural outcomes.
3.3.2 Analyses for simSchool and special populations
During the 2008–2009 academic year, simSchool participants (n = 157) exploring how to
accommodate the unique learning needs of a simulated student with disabilities in an
inclusion classroom setting, made significant gains (p < .001) in instructional
self-efficacy, with an effect size of .44. Additionally, findings confirmed significant
gains from pre- to post-assessment in the teaching skills subscale (p < .001), with an
effect size of .44. The comparison groups made no significant gains on either subscale
of the teacher preparation survey (see the Appendix for subscales and items). These
findings are generally consistent with the spring 2007 findings reported in the previous
section. Findings for the disaggregated 2008–2009 groups of undergraduate pre-service
teachers, versus graduate students, are compared and contrasted in the following:
• Pre-service teachers. A paired t-test revealed significant gains (p < .001) for
undergraduate pre-service teachers (n = 104) in instructional self-efficacy, with an
effect size of .68, and in teaching skills (p < .03), with an effect size of .47.
• Graduate students (n = 47) showed a significant gain in teaching skills (p < .01) but
not in instructional self-efficacy (NS). In sum, the undergraduates posted significant
gains on two subscales, pre to post, whereas graduates exhibited significant gains on
only one. Note that the graduate students were practising classroom teachers taking courses for additional certification(s), and were acknowledged as having high instructional self-efficacy at the time of pre-test assessment.
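The paired-comparison statistics reported in this section (t-tests on pre/post subscale scores, with Cohen's d as the effect size) can be sketched as follows. This is an illustrative sketch only: the function name, the eight-respondent sample, and the scores are hypothetical, not the study's data or code.

```python
import math

def paired_t_and_d(pre, post):
    """Return (t statistic, Cohen's d) for paired pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the paired differences.
    sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in diffs) / (n - 1))
    t = mean_d / (sd_d / math.sqrt(n))   # paired t statistic, df = n - 1
    d = mean_d / sd_d                    # Cohen's d for paired samples
    return t, d

# Hypothetical pre/post subscale means for eight respondents (not study data).
pre  = [3.1, 3.4, 2.8, 3.0, 3.6, 2.9, 3.2, 3.3]
post = [3.6, 3.9, 3.1, 3.4, 3.8, 3.5, 3.6, 3.7]
t, d = paired_t_and_d(pre, post)
print(f"t = {t:.2f} (df = 7), d = {d:.2f}")
```

The resulting t would be compared against the critical value at df = n - 1; by Cohen's (1988) conventions, d of roughly .2, .5, and .8 mark small, medium, and large effects.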
Overall, analysis of data from this study, in which general preparation educators used simSchool to learn to accommodate learning disabilities, has shown that simSchool activities result in gains in teaching skills and instructional self-efficacy. We conclude there is potential for simSchool to help teachers train for inclusion classrooms, due to its capacity to depict a wide range of student characteristics within one classroom.
3.4 Study 3: validation of simulated student behaviours through comparisons
with actual students
With funding from the US National Science Foundation, learning activities from an
environmental science unit featuring ‘life on a pond’ were incorporated into simSchool
(Tyler-Wood et al., 2010). The goal of the study was to validate the simSchool model by comparing student behaviours in the simulated classroom to those in an actual inclusion classroom in which the same content, ‘life on a pond’, was taught. Real student attributes were rated by their teacher and, using those ratings, simulated students were constructed in simSchool to emulate the real students taught in the inclusion classroom. Data were collected from five students and
their simulated counterparts including measures of academic ability, off-task behaviour,
and five personality variables including agreeableness, emotion, extroversion, intellect,
and persistence. Academic ability was estimated using the cumulative achievement test
measure developed for the unit. Personality variables were selected based on their
inclusion in the Big Five model (Costa and McCrae, 1992) and coded using the Ten-Item Personality Inventory (TIPI). Data were collected from five individuals, but only data
from one is presented in this paper for ease of interpretation. The student (James) selected
for analysis was a student with a severe hearing impairment. Because such an impairment
impacts almost all classroom behaviour, researchers elected to determine if James could
indeed be accurately represented in the simSchool environment. Descriptive analyses
were conducted to detect differences in the academic ability, behavioural, and personality
variables of a student and his simulated counterpart. The findings demonstrated that, in spite of some variation between the student and his simulated counterpart, the qualities of the simulated counterpart matched the actual student’s qualities moderately well. These results suggest that simSchool may be able to model other students’ qualities comparably well. Additional analyses are underway to determine how closely the other students participating in the ‘life on a pond’ unit match their simulated counterparts. It is
hypothesised that if simSchool can adequately depict the behaviour of a student with a
significant disability, students with lesser disabilities might be easier to emulate. Data
analysis of four students with mild disabilities is currently underway.
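As background for the TIPI coding mentioned above: the inventory's standard scoring uses ten items rated on a seven-point scale, two per Big Five trait, with one item of each pair reverse-scored. The sketch below reflects the commonly published TIPI key rather than anything specified in this paper, and the example ratings are hypothetical.

```python
# Standard TIPI scoring sketch (not the authors' code). Each trait is the
# mean of one directly scored and one reverse-scored item (reverse = 8 - r).
TIPI_KEY = {                       # trait: (direct item, reversed item), 1-based
    "extraversion":        (1, 6),
    "agreeableness":       (7, 2),
    "conscientiousness":   (3, 8),
    "emotional_stability": (9, 4),
    "openness":            (5, 10),
}

def score_tipi(responses):
    """responses: list of ten ratings (1-7), in item order 1..10."""
    if len(responses) != 10 or not all(1 <= r <= 7 for r in responses):
        raise ValueError("expected ten ratings between 1 and 7")
    scores = {}
    for trait, (direct, rev) in TIPI_KEY.items():
        scores[trait] = (responses[direct - 1] + (8 - responses[rev - 1])) / 2
    return scores

# Hypothetical student rating every item 4 scores 4.0 on each trait.
print(score_tipi([4] * 10))
```

Note that simSchool's five variables (agreeableness, emotion, extroversion, intellect, persistence) only loosely correspond to the Big Five trait names used in the TIPI key; the mapping shown here is an assumption for illustration.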
4.1 Implications of the findings to date
The pre-service teacher preparation candidates involved in the project during the spring
of 2007 exhibited moderate to large gains (Cohen, 1988) on many of the teacher
preparation indices produced from the data. The areas in which the treatment group of
pre-service teacher candidates exhibited the largest gain in comparison to their peers who
did not receive simSchool access and training, were on items related to instructional
self-efficacy. Items comprising this indicator reflected pre-service educators’ confidence
in their competence to bring about positive learning outcomes even in adverse learning
conditions. Findings imply that simSchool activities were successful in fostering
instructional self-efficacy in pre-service students. Findings also imply that the
programme is able to simulate real students moderately well.
simSchool was designed to provide pre-service teachers with a safe environment for
experimenting and practising techniques, especially methods of addressing different
learning styles, and wide variations in academic and behavioural performance of
students. After completing simSchool training, the pre-service teachers were asked to
reflect on their experiences with the simulation. Analysis of the pre-service teacher reflections indicates that one of the first revelations for a participant in the simulator is that K-12 students do not always react the way the teacher candidates think they should. For example, the girl (in the simulator) sitting with her legs crossed, chewing gum, seemingly disconnected from the task, might be learning. The student whom teacher candidates thought was a very good student does not seem to learn much from a task. The boy with headphones on: is he learning or distracted? Observations of pre-service teachers interacting with simSchool suggest that such visible signs of student behaviour may not be the best, or only, clues to performance. As pre-service teachers learn how to
read the student descriptions and learning style indicators better, and how to make
appropriate adjustments in task sequence and complexity, they see better results and gain
confidence in their abilities. The differing gains of the treatment versus comparison groups on the instructional self-efficacy (confidence in their competence) scale can be interpreted as evidence not only that the instrument works, but also that the simulator is useful.
Quantifying gains in learning how to teach is a difficult task. Self-report is a practical
means of gathering data and has been shown through this analysis to yield reasonably
reliable data. The instruments examined in this study have been found to have good
construct validity and the ability to separate groups known to differ, as well. An
instrument capable of showing gains from a simulator can help advance the field of
teaching and learning, and especially the field of teaching and learning through simulation.
4.2 Comparisons with other simulators
Studies conducted with two other products similar to simSchool have produced findings
generally consistent with those reported in this paper. In a study of the Virtual
Kindergarten Classroom developed at the University of Wollongong, Australia,
researchers found three features of the simulated environment were perceived as
especially useful to the 24 pre-service teacher candidates in their study (Ferry et al.,
2004). As a result of the study, several observations were made:
1 Safety of the simulated environment. Teacher candidates felt comfortable trying
teaching strategies with simulated students without fear of serious consequences on
the learning of actual children.
2 Support materials. Information sheets, web resources, and textbook resources were
perceived by the teacher candidates as useful in developing their own pedagogical
knowledge and applying theory to classroom practice.
3 Embedded thinking tools. An open comment section allowed students to ‘blog’
directly into the system and provide reflections on what they were learning.
Researchers examining the impact of the product simClass in Korea (Cheong and Kim, 2008, 2009) found that gains in teaching skills resulting from self-guided use of the simulator, alone and in combination with classroom instruction, were greater than gains resulting from traditional classroom instruction not employing simClass [F(1, 87) = 9.94, p = .002]. Furthermore, the difference in pre-post gains between the self-guided teacher candidate group and the instructor-guided teacher candidate group was not significant [F(1, 57) = 1.789, p = .186]. Both groups using simClass exhibited greater gains than the traditional classroom instruction group (Kim and Cheong, 2008).
When comparing the findings of the studies from Australia and Korea with
those currently presented for simSchool use in the USA, one can observe the outcomes
to be similar in most respects. The importance of being able to try out teaching
strategies without fear of ‘breaking a real student’ was strong in both Australia and
the USA. The importance of support materials was also apparent in both the Australia
and USA implementations; however, one nuance in the USA was the added
importance of human instructor guidance during the post-simulation debriefing stage. In
the area of embedded thinking tools, the Australia implementation had a window
explicitly included in the simulator for questions and reflections, while the USA
simSchool applications used blogging after a run as a means of addressing this area.
Both were deemed valuable. Regarding assessment of measurable gains, the Korea
simClass study found extensive improvements in self-reported teaching skills,
compared to traditional instruction, while the US simSchool study of a similar design
found the classroom methods instructor was just as effective in fostering gains
in teaching skills, with or without the simulator. The most noticeable difference in
the USA was the added value of the simulator in the area of instructional
self-efficacy (resilience to giving up as a result of having a bad day). Further research
is needed to determine whether these differences were due to local factors such as
the instructors or the local culture; or due to differences in the simulators and
procedures followed. Overall, the major findings were similar across the different settings.
4.3 Prospects for virtual field experiences
simSchool has recently been approved by the National Council for the Accreditation of
Teacher Education (NCATE) for use by the South Western University noted in this paper
as a pre-observation, virtual field experience tool. Teacher candidates are permitted to
count use of the simulator for up to ten hours of their internship/classroom observation
block, which typically immediately precedes a teacher preparation candidate’s practice
teaching term. This type of utilisation falls in the category that Hixon and So (2009) have
referred to as Type III field experiences for pre-service teachers. Type III field
experiences are abstract experiences with a model of reality. One limitation of a Type III
environment is the lack of interaction with real teachers and students (Hixon and So,
2009). However, benefits include exposure to multiple teaching strategies and learning
styles in a short period of time, and better understanding of how the conceptual and
theoretical knowledge presented in pre-service teachers’ college classes relates to actual
classroom practices and student behaviours. Technology-enhanced virtual field
experiences can support pre-service teachers’ abilities to see the theories they are learning
in practice (Frey, 2008). These latter types of benefits were observed as outcomes of the
pre-service educators working in the simSchool environment, both by the researchers and
the pre-service educators’ college instructors (Christensen, 2008).
Several other researchers in addition to Hixon and So (2009) have explored the
possibilities of linking simulations with field experiences. Among these are Foley and
McAllister (2005), Ferry et al. (2005) and Girod and Girod (2006). The latter two groups
were involved in the early discussion and design stages of simSchool, and hence it is not
surprising that one of the initial formalised uses of simSchool is in this area. Hixon and
So (2009) have pointed out that much more research is needed to determine the optimum
mix of model-based explorations versus face-to-face observation and interaction with real
students, during the preparation of pre-service teachers.
4.4 Prospects for simSchool validation by tracking performance into the classroom
The researchers participating in the project, which began in 2007, originally envisioned
tracking pre-service teachers who used the simulator during their teacher preparation
activities, into their classroom placements, for the purpose of assessing the impact on
actual classroom performance. However, as pre-service activities unfolded during the first year of the project, the teacher education faculty in the project requested (after initial pilots) that the simulation activities be moved from the fourth-year, pre-student-teaching semester where they were originally targeted to the second year of the university, at the beginning of the teacher preparation sequence, and also included in the third year of the university, to foster dialogue during the candidates’ methods classes. This
transition toward introducing the simulations earlier in the teacher preparation sequence
meant that it would not be possible to track meaningful numbers of graduates/teacher
inductees into or beyond their first year of teaching, within the confines of a three-year
project. The simSchool research team has formulated methods for using outcome
measures such as teacher attrition at the end of one to three years in the classroom, or
even K-12 student performance scores of simSchool-trained teachers versus those not
receiving simulator training, as an ultimate test of the effectiveness of simSchool.
Budgetary limitations have prohibited that possibility from becoming reality to date.
Current plans are to continue to explore the prospect of integrating embedded assessment
along with ongoing communication and support, into the next generation of the
simSchool product. Design research activities in this area are proceeding on an ongoing basis.
Using a game to teach teachers? The idea challenges conventional thinking and may
involve some risks. However, if we succeed in reducing teacher attrition and provide an
opportunity to rapidly increase a new teacher’s knowledge and skills in areas such as
differentiation, special education issues, individualisation of learning and grouping
practices, simulation could play an important role in preparing tomorrow’s teachers. The
most prominent feature of our project is that it adds an entirely new learning opportunity
for both pre-service and in-service teachers. Teacher educators can use simulations to
improve teaching and ultimately influence the skill level of new teachers entering the
classroom. Indeed, during the four years since the inception of the research project at the
university, the use of simSchool has been approved by the US NCATE, and used
extensively for pre-intern observation activity.
Acknowledgements
This research was supported in part by the US Department of Education Fund for the
Improvement of Postsecondary Education Grant #P116B060398 and the US National
Science Foundation Research and Disabilities Education (RDE) Grant #0726670.
References
Aldrich, C. (2004) Simulations and the Future of Learning: An Innovative (and Perhaps
Revolutionary) Approach to E-learning, John Wiley and Sons, San Francisco.
Bialo, E.R. and Sivin-Kachala, J. (1996) ‘The effectiveness of technology in schools: a summary of
recent research’, School Library Media Quarterly, Vol. 25, No. 1, pp.51–57.
Bull, G., Park, J., Searson, M., Thompson, A., Mishra, P., Koehler, M.J. and Knezek, G. (2007)
‘Editorial: developing technology policies for effective classroom practice’, Contemporary
Issues in Technology and Teacher Education [online serial], Vol. 7, No. 3, available at
Cheong, D.U. and Kim, B.K. (2008) ‘A simulation for improving teachers’ motivational skills’,
in D. Gibson and Y.K. Baek (Eds.): Digital Simulations for Improving Education: Learning
through Artificial Teaching Environments, pp.227–248, IGI Global, Hershey, PA, USA.
Cheong, D.U. and Kim, S.H. (2009) ‘Developing a simulation for practicing discipline skills of
pre-service teachers’, The Journal of Korean Association of Computer Education, Vol. 12,
No. 3, pp.63–74.
Christensen, R. (2008) ‘SimMentoring project update’, in SimZine, Vol. 1, No. 4, p.1, Institute for
the Integration of Technology into Teaching & Learning, University of North Texas.
Cohen, J. (1988) Statistical Power Analysis for the Behavioral Sciences, 2nd ed.,
Lawrence Erlbaum Associates, Hillsdale, NJ.
Costa, P.T. and McCrae, R.R. (1992) NEO PI-R: Professional Manual, Psychological Assessment
Resources, Inc., Odessa, FL.
Ferry, B., Kervin, L., Turbill, J., Cambourne, B., Hedberg, J., Jonassen, D. and Puglisi, S. (2004)
‘The design of an on-line classroom simulation to enhance the decision making skills of
beginning teacher’, Proceedings of the AARE International Conference, Melbourne, Victoria.
Ferry, B., Kervin, L., Turbill, J., Cambourne, B., Hedburg, J. and Jonassen, D. (2005)
‘Incorporating real experience into the development of a classroom-based simulation’, Journal
of Learning Design, Vol. 1, No. 1, pp.22–32.
Foley, J.A. and McAllister, G. (2005) ‘Making it real: SimSchool a backdrop for contextualizing
teacher preparation’, AACE Journal, Vol. 13, No. 2, pp.159–177.
Foreman, J., Gee, J., Herz, J., Hinrichs, R., Prensky, M. and Sawyer, B. (2004) ‘Game-based
learning: how to delight and instruct in the 21st century’, EDUCAUSE Review, Vol. 39, No. 5,
Frey, T. (2008) ‘Determining the impact of online practicum facilitation for inservice teachers’,
Journal of Technology & Teacher Education, Vol. 16, No. 2, pp.181–210.
Gibson, D. (2007) ‘SimSchool – a complex systems framework for modeling teaching & learning’,
Paper presented to the National Educational Computing Conference, Atlanta, GA, June.
Gibson, D. (2008) ‘Modeling classroom cognition and teaching behaviors with COVE’,
in D. Gibson and Y. Baek (Eds.): Digital Simulations for Improving Education, IGI Global,
Gibson, D. (2009) ‘Designing a computational model of learning’, in R. Ferdig (Ed.): Handbook of
Research on Effective Electronic Gaming in Education, Vol. 2, pp.671–701, Information
Science Reference, Hershey, PA.
Gibson, D., Baek, Y., Knezek, R. and Christensen, R. (2007) ‘Cognitive and conceptual assessment
frameworks for simulating teaching and learning’, Paper presented to the American
Educational Research Association Annual Conference, Chicago, Illinois, 13 April.
Girod, M. and Girod, G. (2006) ‘Exploring the efficacy of the cook school district simulation’,
Journal of Teacher Education, Vol. 57, No. 5, pp.481–497.
Hettler, L., Gibson, D., Christensen, R. and Zibit, M. (2008) Simmentoring: Guiding Development
From Virtual to Real Teaching!, CurveShift, Inc., Stowe, VT.
Hixon, E. and So, H.J. (2009) ‘Technology’s role in field experiences for preservice teacher
training’, Educational Technology & Society, Vol. 12, No. 4, pp.294–304.
Kiesler, D. (1983) ‘The 1982 interpersonal circle: a taxonomy for complementarity in human
transactions’, Psychological Review, Vol. 90, No. 2, pp.185–214.
Knezek, G. and Christensen, R. (2009) ‘Pre-service educator learning in a simulated teaching
environment’, in Gibson, D. et al. (Eds.): Proceedings of Society for Information Technology
& Teacher Education International Conference 2009, AACE, Chesapeake, VA, pp.938–946.
Kim, B.K. and Cheong, D.U. (2008) ‘SimClass: simulate your class before you teach’,
in D. Gibson and Y.K. Baek (Eds.): Digital Simulations for Improving Education: Learning
through Artificial Teaching Environments, pp.289–307, IGI Global, Hershey, PA, USA.
McCrae, R. and Costa, P. (1996) ‘Toward a new generation of personality theories: theoretical
contexts for the five-factor model’, in J.S. Wiggins (Ed.): The Five-factor Model of
Personality: Theoretical Perspectives, pp.51–87, Guilford, New York.
Mislevy, R.J., Steinberg, L.S. and Almond, R.G. (2003) ‘On the structure of educational
assessments’, Measurement: Interdisciplinary Research and Perspectives, Vol. 1, No. 1,
Prensky, M. (2001) Digital Game-based Learning, McGraw-Hill, New York.
Riedel, E. (2000) Teacher Beliefs & Preparation Survey, Center for Applied Research and
Educational Improvement, University of Minnesota.
Srivastava, S. (2006) ‘Measuring the big five personality factors’, available at
http://www.uoregon.edu/~sanjay/bigfive.html (accessed on 11 October 2006).
Tyler-Wood, T.L., Christensen, R., Knezek, G., Periathiruvadi, S., Barrio, B., Ellison, A. and
Lim, O. (2010) ‘Project SETS: validation of a simulated classroom’, Presented to E-Learn 2010, Orlando, FL.
Zibit, M. and Gibson, D. (2005) ‘simSchool: the game of teaching’, Innovate, Vol. 1, No. 6, available at
http://www.innovateonline.info/index.php?view=article&id=173 (accessed on 24 April 2008).
Appendix
a TPS instructional self-efficacy scale (five items)
• TSP 1I. If I really try hard, I can get through to even the most difficult or unmotivated students.
• TSP 1G. If a student in my class becomes disruptive and noisy, I feel assured
that I know some techniques to redirect him/her quickly.
• TSP 1C. When I really try, I can get through to most difficult students.
• TSP 1H. If one or more of my students couldn’t do a class assignment, I would
be able to accurately assess whether the assignment was at the correct level of difficulty.
• TSP 1F. If a student did not remember information I gave in a previous lesson, I
would know how to increase his/her retention in the next lesson.
b TPS learning locus of control scale (five items)
• TSP 1D. A teacher is very limited in what he/she can achieve because a
student’s home environment is a large influence on his/her achievement.
• TSP 1J. When it comes right down to it, a teacher really can’t do much because
most of a student’s motivation and performance depends on his or her home environment.
• TSP 1B. If students aren’t disciplined at home, they aren’t likely to accept any discipline.
• TSP 1E. If parents would do more for their children, I could do more.
• TSP 1A. The amount a student can learn is primarily related to family background.
c TPS teaching skill scale (15 items)
• Below is a list of different skills you may use in teaching. Please choose the
response that indicates how prepared you feel currently to do each one. The
responses are on a scale of 1 = strongly disagree to 6 = strongly agree.
a Describing the teaching context.
b Stating objectives clearly.
c Stating objectives so they are aligned with goals.
d Selecting objectives aligned with student needs.
e Selecting varied and complex objectives.
f Selecting a broad array of teaching strategies.
g Sequencing teaching strategies.
h Allotting time for instruction realistically.
i Developing high-quality adaptations.
j Developing a wide array of adaptations.
k Interpreting on-task behaviour accurately.
l Interpreting assessment results accurately.
m Connecting teaching and learning.
n Analysing my own teaching performance.
o Making decisions based on the assessment results from my students.
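The paper does not specify how the subscales above were scored; a simple item-mean sketch, assuming each subscale score is the mean of its item ratings, might look like the following (item codes follow the listing above; the respondent data are hypothetical):

```python
# Hypothetical TPS subscale scoring sketch (scoring procedure assumed, not
# taken from the paper): each subscale score is the mean of its item ratings.
SUBSCALES = {
    "instructional_self_efficacy": ["1I", "1G", "1C", "1H", "1F"],  # five items
    "learning_locus_of_control":   ["1D", "1J", "1B", "1E", "1A"],  # five items
}

def subscale_means(responses):
    """responses: dict mapping item codes (e.g. '1I') to ratings on 1-6."""
    means = {}
    for name, items in SUBSCALES.items():
        ratings = [responses[item] for item in items]
        means[name] = sum(ratings) / len(ratings)
    return means

# Hypothetical respondent answering 4 ('agree' side of the scale) throughout.
answers = {item: 4 for items in SUBSCALES.values() for item in items}
print(subscale_means(answers))
```

The 15-item teaching skill scale (items a through o) could be averaged the same way; whether any items should be reverse-scored before averaging is not stated in the appendix and would need to be confirmed against the instrument.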