Technology, Pedagogy and Education
ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/rtpe20
Ready, set, go! Profiling teachers’ readiness for
online teaching in secondary education
Sarah K. Howard , Jo Tondeur , Fazilat Siddiq & Ronny Scherer
To cite this article: Sarah K. Howard , Jo Tondeur , Fazilat Siddiq & Ronny Scherer (2020):
Ready, set, go! Profiling teachers’ readiness for online teaching in secondary education,
Technology, Pedagogy and Education
To link to this article: https://doi.org/10.1080/1475939X.2020.1839543
Published online: 26 Nov 2020.
Ready, set, go! Profiling teachers' readiness for online teaching in secondary education
Sarah K. Howard^a, Jo Tondeur^b, Fazilat Siddiq^c and Ronny Scherer^d
^a School of Education, University of Wollongong, Wollongong, Australia; ^b Interdisciplinary Department of Teacher Education, Vrije Universiteit Brussel, Brussels, Belgium; ^c Department of Education and Quality in Learning, University of South-Eastern Norway, Drammen, Norway; ^d Centre for Educational Measurement at the University of Oslo, University of Oslo, Oslo, Norway
ABSTRACT
In a very short time, secondary school education across the globe transitioned to online learning and teaching, in response to the Covid-19 pandemic. This study aims at identifying teacher profiles in secondary education to better understand perceptions of both individual and institutional readiness to transition to online teaching. To do this, the current study grouped teachers on the basis of their TPACK self-efficacy beliefs, online presence and perceived institutional support for online teaching. To date, data have been collected from teachers (N = 222) from 20 countries. The data were submitted to latent profile analysis to identify readiness profiles. The added value of the current study lies in the combined view of individual and institutional readiness and the uniqueness of the dataset. It provides a large-scale international perspective and a wide range of possible experiences. Findings inform how education institutions can personalise and support transitions to online teaching.
ARTICLE HISTORY
Received 2 August 2020
Accepted 13 October 2020
KEYWORDS
K–12 education; online
teaching; teacher readiness;
institutional readiness;
Covid-19
Introduction
Online teaching and learning have been commonly used for almost two decades in higher education
(Martin et al., 2019). The various tools and platforms available to support online interactions, such as
discussions, assessment, sharing and interaction, have been well developed, tested, and have
reached a relatively high level of acceptance. However, adoption in school education has been
slower. With students and teachers all co-located, teaching has been able to happen face-to-face.
Research suggests that where online learning has been implemented (aside from distance educa-
tion), it is actually blended learning, where only a portion of the learning is online and in the
student’s control (Pulham & Graham, 2018).
However, in the first half of 2020, schools across the world were forced to shift to fully online
teaching and learning in response to the Covid-19 pandemic. In many instances, this was a sudden
shift in delivery of learning. School teachers, many of whom had never taught online, were expected
to redesign their programmes to support their students in a 100% online environment. This required
a full shift in pedagogical approach to teaching and learning, and the use of a range of new
technologies (e.g., Gurley, 2018).
In the following discussion, we aim to explore teachers’ perceived readiness to shift their teaching
from face-to-face to fully online in response to the Covid-19 pandemic. Readiness is examined in
CONTACT Sarah K. Howard sahoward@uow.edu.au School of Education, Faculty of Art, Social Science & Humanities,
University of Wollongong, Wollongong, NSW 2522, Australia
Supplemental data for this article can be accessed here.
TECHNOLOGY, PEDAGOGY AND EDUCATION
https://doi.org/10.1080/1475939X.2020.1839543
© 2020 Technology, Pedagogy and Education Association
relation to perceptions of how well they felt they were prepared for this change and how well they
felt their institutions were prepared. Research has shown that both individual and institutional
factors influence teachers' capacity to take up new digital practices (e.g., Ertmer & Ottenbreit-Leftwich, 2010; Howard, 2019). Therefore, we argue that it is necessary to consider teachers'
perceptions of both their own readiness and that of their institution, to gain a full view of their
position. This combined approach can provide a clearer picture of what support may be needed,
either in terms of training or school agenda setting, to support the transition to online teaching.
To do this, we present an international questionnaire designed to capture teachers' general teaching characteristics and experience, self-efficacy for teaching online, fostering of online presence and institutional support for transitioning to online teaching as a result of Covid-19. Latent profile analysis was conducted to identify teacher profiles of readiness to transition to online learning. Profiles provide a way to consider a range of heterogeneous experiences in a given situation (Asendorpf, 2015), from which a more effective person-centred approach to support can be designed. Implications for practice, for school leadership supporting change, and for future online teaching and learning will be explored through the results.
What is online learning?
The term online learning is often used interchangeably with distinct and overlapping terms such as distance learning, blended learning and e-learning. Over the past three decades, online learning has become a significant part of education internationally (Singh & Thurman, 2019). It is necessary to understand expectations of online learning in order to understand the practice of online teaching (i.e. the design and delivery of learning online). However, what comprises online learning, and how to develop teaching to support it, remains a vague area of education and educational research.
One reason for this issue is our understanding of 'What is online learning?' Singh and Thurman (2019) identified 46 different definitions of online learning, most of which come from the higher education context. While there are a few commonly agreed elements of online learning, they are often interpreted differently and have changed over time as the digital technologies supporting online learning have developed. Common elements identified were: Technology, Time, Distance, Interactivity and Educational Context (Singh & Thurman, 2019). Interestingly, definitions of online learning did not include learning as a concept. The lack of a clear definition in the field limits the possibility of clear articulation in teaching, identification of exemplars and guidance on how to address design (Singh & Thurman, 2019). Therefore, schools adopting online learning are left with limited guidance about how teachers can be supported or what may be an appropriate vision for online learning.
Transition to online teaching
On 30 January 2020, the Covid-19 virus was declared a Public Health Emergency by the World Health Organization (WHO). On 11 March 2020, the WHO declared Covid-19 a global pandemic (Cucinotta & Vanelli, 2020). On 13 March, countries began to implement lockdowns, which meant shutting down parts of the economy (Hirsch, 2020). In many places, people were ordered to stay home or at least limit activity outside the home. This included stopping travel, public gatherings and many businesses from trading. It also included moving many forms of work and education into the home. For schools and universities, this initiated a rapid transition to fully online teaching and learning around the globe. Suddenly, teachers had to provide fully online teaching, which created an alien situation for themselves and their students (Ferdig et al., 2020). At the time of writing, face-to-face teaching had resumed in schools in many places, but how long schools would stay open was uncertain.
The transition to online teaching for secondary teachers and students was sudden and chaotic, given that online learning has not been clearly defined (e.g., Singh & Thurman, 2019) and is highly variable (Chigeza & Halbert, 2014). In part, this lack of clarity resulted in limited evidence of good practice being available to guide the transition. Further, in many cases teachers do not possess knowledge about online pedagogies or how to support learning online, as these are not included in many teacher training programmes (McAllister & Graham, 2016). Therefore, many school teachers have limited knowledge of and experience with online learning and teaching.
Areas of teacher knowledge needing development may include knowing which digital online technologies to use and which types of tasks should be included in online learning (e.g., asynchronous discussion tasks, online research tasks, video lectures or live video discussions). In regard to online teaching, questions may include how to provide clear instruction, how to communicate most effectively and how to assess learning most effectively (Gurley, 2018). Therefore, to be able to learn from teachers' experiences in the rapid transition to online teaching during the Covid-19 pandemic, it is necessary to investigate how prepared teachers felt they were for online teaching and to understand how they experienced the rapid transition to online learning, so that schools can draw on these experiences to prepare for future online and blended learning.
Readiness to teach online
Teaching online requires technological skills, but also different pedagogical approaches from teaching face-to-face, to support learning online (Gurley, 2018). However, the question is to what extent teachers in secondary schooling are prepared to teach online. We argue that readiness to teach online should include considerations across both individual teachers and their institutions. Several studies have explored teacher readiness characteristics that can be associated with the implementation of online teaching and learning (for an overview see Phan & Dang, 2017). However, these studies have focused only on individual educator characteristics and on higher education (e.g., Martin et al., 2019).
Individual characteristics affecting online teaching
TPACK self-efficacy
Online teaching self-efficacy can be considered a key component of educators' readiness to teach online (Hung, 2016). Teachers' self-efficacy to teach and support learning has been explored through the lens of the Technological Pedagogical and Content Knowledge (TPACK) framework (Koehler et al., 2014). TPACK comprises several knowledge domains, including domain-general and technology-specific aspects, that are relevant for teachers to implement technology in teaching and learning processes (Voogt et al., 2013). Use of the TPACK framework can provide a focus on learning and pedagogy that is typically missing from conceptions of online learning.
The following dimensions are key to the framework (Scherer et al., 2018): TPCK – 'knowledge about the complex relations among technology, pedagogy, and content that enable teachers to develop appropriate and context-specific teaching strategies' (Koehler et al., 2014, p. 102); TPK – knowledge about the use of information and communications technology (ICT) to implement instructional practices, principles and strategies; TCK – knowledge about how the subject matter can be represented with the help of technology; and TK – knowledge of and about technology. TPCK, TPK and TCK represent the key pedagogical and didactical aspects of TPACK, while TK represents a purely technological domain (Schmidt et al., 2009). This distinction was evidenced empirically by studies showing that the factor structure of TPACK self-efficacy scales was composed of a general TPACK factor and a specific TK factor (Scherer et al., 2018; Tondeur et al., 2017).
Archambault and Crippen (2009) argued that the concept of TPACK is particularly relevant in the context of online learning. According to these authors, the focus becomes more centred on how the online course is designed, with special emphasis on the online teaching materials. At the same time, some studies show a lack of integration between teachers' pedagogical and technological knowledge in the context of online teaching (e.g., Benson & Ward, 2013; Brinkley-Etzkorn, 2018). To illustrate, Benson and Ward (2013) created profiles based on professors' degree of Technological, Pedagogical and Content Knowledge in online higher education. They concluded that the knowledge domains interact in unique patterns and that a balanced integration of TPACK is unlikely when instructors are not able to verbalise the pedagogical reasoning behind their decisions to teach online. Similarly, Brinkley-Etzkorn (2018) found that integrating technology and pedagogy is difficult when training new online educators. Therefore, understanding teachers' TPACK self-efficacy can be a useful approach to identifying readiness to teach online.
Online teaching presence
As stated earlier, teaching in an online space differs from face-to-face teaching because of the separation from students by distance and time (Gurley, 2018). In online teaching, depth of learning in an online space is related to presence. This can be conceptualised as social, cognitive and teaching presence (Law et al., 2019; Wilson & Stacey, 2004). Online teacher presence emphasises educators' responsibilities of design, organisation, facilitation and instruction in the online learning space, so that educational purposes can be fulfilled while learners and teachers are not co-located or working at the same time (Martin et al., 2019). This is in line with the definition of Arbaugh and Hwang (2006), who state that teaching presence refers to 'the design, facilitation, and direction of cognitive social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes' (p. 10). Key components of teaching presence are active communication, providing feedback and learner–learner interaction (Wilson & Stacey, 2004). Gurley (2018) explored the components of teacher presence in relation to teachers' behaviours in blended and online learning environments. Behaviours related to Feedback, Clear instruction and Assessment were found to relate to teachers' perceptions of high teaching presence.
Institutional support
The rapid transition to online learning that occurred in schools all around the world in response to the Covid-19 pandemic forced fast consideration of how teachers were trained to teach online (if that was possible in the timeframe), whether schools had a pedagogical vision for online teaching and learning, and how to support students to learn online. Studies show that the success of online teaching can also be affected by institutional characteristics, such as technical support, pedagogical support or the school's vision for the implementation of online and/or blended learning (see Almpanis, 2015; Bao, 2020). While institutional support is crucial for the successful implementation of online education, some studies suggest that it is often insufficient (e.g., Palloff & Pratt, 2013). The level of institutional support largely affects how online learning can be implemented and sustained (see, e.g., McGee et al., 2017), but it remains unclear whether or not schools actually meet the needs of teachers to support online learning.
Perceptions of institutional support, particularly the values and beliefs of the institution, also serve an important role in change processes (e.g., Ertmer & Ottenbreit-Leftwich, 2010). School leadership, and even leadership in a subject area, can strongly affect how a teacher perceives and values educational and technological change (Hargreaves & Goodson, 2006). Importantly, strong leadership and clear support to integrate new technologies and practices in teaching and learning can motivate teachers to change, while a lack of commitment to change at an organisational level can demotivate teachers and limit change (Howard, 2019). To support online learning, several aspects of institutional support, including the school's vision and professional development for online teaching, need to be addressed. In this respect, Philipsen et al. (2019) found that it was not enough to provide training programmes to prepare educators to teach online; institutions also need a clear vision and goals related to online teaching and learning. Teachers' perceptions of institutional support and the related goals and vision for online education will have an impact on their overall conception of readiness and ability to teach online (e.g., Howard et al., 2018). Therefore, to fully understand teachers' perceptions of readiness to engage in technology-related change initiatives, it is necessary to examine both individual readiness and beliefs about the readiness of their institution.
Purpose of the study
The main aim of the current study is to identify teacher profiles in secondary education based on their perceptions of readiness to transition to online teaching as a result of Covid-19, in regard to their own readiness and that of their institution. Importantly, teachers' individual and institutional perceptions of readiness are considered together, given that both are influential in teacher change and the adoption of new practices related to digital technologies. The research question addressed is:
1. What are teachers' profiles of readiness to teach online?
To address this question, the current study examines whether teachers can be grouped on the basis of their TPACK self-efficacy beliefs, their online presence and institutional support for online learning. A latent profile analysis was conducted, which represents a person-centred approach (Asendorpf, 2015). While the variable-centred approach is concerned with information about the trait, its structure, stability and validity for an average person (Bergman & Wångby, 2014), the person-centred approach is considered more holistic, regarding persons as dynamic systems of interwoven components (Magnusson, 1988). As each person is considered a functioning totality in the person-centred approach (Bergman & Wångby, 2014), it typically analyses characteristic patterns shared by a group or subsample rather than separate variables. Hence, in this study we conduct latent profile analysis to make visible different groups of teachers, better understand their profiles and, consequently, be better prepared to develop and provide adequate training for online teaching and learning.
Materials and methods
In the current study we define online teaching and learning as: delivering learning content and materials and creating experiences using online platforms or tools (e.g., learning management systems) for 80% or more of a subject (Rogers et al., 2009). This definition is technology-focused and centres clearly on delivery in the online space. It does not prioritise particular online learning practices related to time or interaction. To be inclusive of a range of educational contexts, it does not emphasise distance as a requisite.
Sample and procedure
Between March and May 2020, we launched an online survey via Google Forms® that was aimed at assessing educators' readiness for online teaching around the world in response to the Covid-19 pandemic ('So, we're all online?: Readiness for online learning'). The questionnaire was distributed via several social media channels and via institutions for higher and continuing education. Participants were fully informed about the intentions of the research before providing tacit consent (by clicking through to the questionnaire) for their data to be included in the study. All data were de-identified before analysis. As of 31 May 2020, the dataset comprised 1144 educators in 64 countries, across primary, secondary and higher education. For the current analysis, only those identifying as secondary school teachers were included (n = 222; 68.8% women); they represented 20 countries. On average, teachers were 43.5 years old (SD = 10.7, range: 23–74) and had 15.3 years of teaching experience (SD = 10.1, range: 1–42) across a broad range of subjects (i.e. Arts & Humanities, Social Sciences, Science, Business).
The teachers reported an average of 4.5 years of experience with online teaching in total, but only 34.5% indicated that they had taught online prior to the Covid-19 pandemic. The shift to online teaching was mandatory for 68.1% of the teachers, expected but not mandatory for 24.4%, and neither mandatory nor expected for 7.5%. Similarly, most teachers indicated that they had to transfer all their teaching to an online mode (67.6%), while some indicated that only some (27.9%) or none (4.5%) of their teaching went online. On average, teachers were given 4.3 days to prepare for the shift to online teaching and learning. Please find more details about the sample characteristics in the Supplementary Material S1.
Measures
In the present study, we measured teachers' readiness for online teaching through three key components: TPACK self-efficacy (supporting learning), perceived online teaching presence (time and distance) and perceived institutional support (educational context), using scales and standalone items.
Teachers’ background variables
To characterise and explain the profiles, we assessed the following variables: teachers' age (in years), gender (0 = Male, 1 = Female), teaching experience (in years), online teaching before the Covid-19 pandemic (0 = No prior experience, 1 = Yes, teachers had prior experience), days to prepare for online teaching after the decision for the shift (in days), days into online teaching after the shift (in days), degree of the shift to online teaching (0 = No, none of it, 1 = Some of my teaching, 2 = Yes, all of it) and the nature of the institutional decision to shift to online teaching (0 = It was mandatory, 1 = It was expected, 2 = It was neither mandatory nor expected).
TPACK self-efficacy
We focused on the three dimensions TPCK, TPK and TCK to represent the pedagogical and content-related aspects of online teaching readiness (see Appendix A). To assess these dimensions, we administered the validated TPACK self-efficacy scale presented by Archambault and Crippen (2009), which was adapted to the online teaching context. The stimulus 'I am confident in my ability to . . .' introduced two TCK items (e.g., '. . . implement curriculum in an online environment'), four TPK items (e.g., '. . . implement different methods of teaching online') and four TPCK items (e.g., '. . . use technology to predict students' skills/understanding of a particular topic'). Participants responded on a 5-point agreement scale ranging from 0 (strongly disagree) to 4 (strongly agree). The internal consistencies of the overall scale were high, Cronbach's Alpha = 0.93, McDonald's Omega = 0.95.
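As a rough illustration of how such reliability figures are computed (on simulated, hypothetical responses, not the study's data; the function name is ours), Cronbach's alpha compares the sum of the item variances with the variance of the total score:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point (0-4) responses to a 10-item TPACK-style scale:
# a shared latent trait drives all items, so alpha should be high.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
responses = np.clip(np.round(latent + rng.normal(scale=0.7, size=(200, 10)) + 2), 0, 4)
print(round(cronbach_alpha(responses), 2))
```

McDonald's omega, also reported above, instead weights items by their loadings from a one-factor model, which is why the paper reports it alongside alpha; that computation needs a factor analysis step and is omitted here.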
Perceived online teaching presence
We assessed teachers' perceptions of their online presence through three key dimensions (Gurley, 2018; see Appendix B): online teaching presence for clear instruction (POPCLA; e.g., 'Overall, I can clearly communicate important course goals'; 4 items), student feedback and assessment (POPFED; e.g., 'Overall, I provide feedback in a timely fashion'; 4 items) and cognitive activation (POPCOG; e.g., 'Overall, I help to keep course participants on a task in a way that helps students to learn'; 5 items). Teachers indicated their agreement with these statements on a 5-point scale ranging from 0 (strongly disagree) to 4 (strongly agree). The resultant three subscales showed sufficiently high internal consistencies (POPCLA: α = 0.90, ωt = 0.92; POPFED: α = 0.82, ωt = 0.87; POPCOG: α = 0.90, ωt = 0.93).
Perceived institutional support
We assessed teachers' perceptions of the institutional support for online teaching using six items of the scale proposed by Philipsen (2018; see Appendix C). These items addressed several aspects of institutional support, including the school's vision and professional development for online teaching (e.g., 'In my institution, there is a supportive environment as regards professional development for online learning', 'In my institution, there are clear objectives as regards online learning'). Participants indicated their agreement with the six items on a 6-point scale ranging from 0 (strongly disagree) to 5 (strongly agree), and the internal consistency was high, Cronbach's Alpha = 0.95, McDonald's Omega = 0.96. Given that this scale addressed general institutional support independent of the Covid-19 pandemic, we further added two standalone items that assessed teachers' perceptions of the technical and pedagogical support specific to the transition to online teaching during the Covid-19 pandemic (Stimulus: 'If your institution has asked you to transition your teaching from face-to-face to online, have you been provided with the following . . .'; item PISCO1: 'Additional technical support has been provided to transition face-to-face teaching to online because of Covid-19'; item PISCO2: 'Additional pedagogical support has been provided to transition face-to-face teaching to online because of Covid-19').
Statistical analyses
Confirmatory factor analysis
To identify unobserved (latent) profiles of teachers with respect to their readiness for online teaching, we first examined the psychometric quality of the scales measuring their TPACK self-efficacy, perceived online presence and institutional support. Specifically, we evaluated their factor structures by conducting confirmatory factor analysis (Brown, 2015). For all scales, we specified a single-factor model as a baseline model, modified it following our hypotheses on the specific structure of the scale (e.g., assuming multiple factors instead of a single factor) and evaluated the model fit utilising goodness-of-fit indices (for an acceptable fit: CFI ≥ .95, RMSEA ≤ .08, SRMR ≤ .10; e.g., Hu & Bentler, 1999). Once we had identified a well-fitting factor model, we extracted the respective factor scores and used them as observed (manifest) representatives of the scale. To accommodate possible deviations from the assumption of multivariate normality for the items within a scale, we used robust maximum-likelihood (MLR) estimation and derived robust standard errors of all model parameters. Consequently, model comparisons were based on the Satorra–Bentler adjusted chi-square (SB-χ2) difference test (Satorra & Bentler, 2010). We performed all analyses in the R packages 'lavaan' version 0.6-6 (Rosseel, 2012) and 'psych' version 1.9.12.31 (Revelle, 2019), utilising the full-information-maximum-likelihood procedure to handle missing item responses (Enders, 2010). Please find all details of these analyses in the Supplementary Material S1.
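For readers less familiar with these cutoffs, the two most commonly reported indices follow the standard definitions below (not specific to this study), where χ²_M and df_M belong to the hypothesised model, χ²_B and df_B to the baseline (null) model, and N is the sample size:

```latex
\mathrm{CFI} = 1 - \frac{\max(\chi^2_M - df_M,\, 0)}{\max(\chi^2_M - df_M,\; \chi^2_B - df_B,\, 0)}
\qquad
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2_M - df_M,\, 0)}{df_M\,(N-1)}}
```

Both indices approach their ideal values (CFI → 1, RMSEA → 0) as the model's chi-square approaches its degrees of freedom, which is why the thresholds CFI ≥ .95 and RMSEA ≤ .08 signal acceptable fit.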
Latent profile analysis
As a second step, we performed latent profile analysis (LPA) using the factor scores of TPACK self-efficacy (gTPACK), the three dimensions of perceived online presence (POPCLA, POPFED, POPCOG), perceived institutional support (gPIS) and the two standalone items asking teachers about the perceived support for online teaching during the Covid-19 pandemic (PISCO1 and PISCO2). LPA represents a person-centred approach that identifies homogeneous groups in a sample based on a set of profile indicators (Lubke & Muthén, 2005) and offers a more flexible, model-based approach to identifying groups than cluster analysis (Marsh et al., 2009). Furthermore, LPA offers relative fit indices, such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the sample-size-adjusted BIC (aBIC), allowing researchers to compare different assumptions on the number of profiles and their shapes and sizes.
Typically, identifying the number of latent profiles in a sample follows a rigorous procedure in which a series of LPAs is conducted with varying numbers of profiles (Masyn, 2013). The resultant models are then compared via the adjusted Lo–Mendell–Rubin (LMR) and the Vuong–Lo–Mendell–Rubin (VLMR) likelihood-ratio tests (LRT), the information criteria and the entropies, that is, an indicator of classification accuracy with acceptable values above .70 (Jung & Wickrama, 2008). However, the optimal profile solution should reveal conceptually meaningful and interpretable profiles of substantial size (Marsh et al., 2009). We performed the LPA with the software package Mplus version 8.3 (Muthén & Muthén, 1998/2017), using MLR estimation. Please find the details in the Supplementary Material S2.
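The study ran its LPAs in the commercial package Mplus, but the model-comparison logic can be sketched with open tools. Purely as an illustration (simulated indicators, not the study's data or code), an LPA with uncorrelated indicators within profiles is a Gaussian mixture with diagonal covariances, so scikit-learn's GaussianMixture can reproduce the AIC/BIC/entropy workflow:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Hypothetical standardised profile indicators (7 per teacher, as in the study),
# simulated as three groups with low / medium / high means -- NOT the study's data.
X = np.vstack([rng.normal(loc=m, scale=0.5, size=(74, 7)) for m in (-1.0, 0.0, 1.0)])

def relative_entropy(post: np.ndarray) -> float:
    """Scaled classification entropy in [0, 1]; values near 1 mean clean separation."""
    n, k = post.shape
    if k == 1:
        return 1.0
    return 1 - (-np.sum(post * np.log(post + 1e-12)) / (n * np.log(k)))

for k in range(1, 5):
    gmm = GaussianMixture(n_components=k, covariance_type="diag",
                          n_init=10, random_state=0).fit(X)
    ent = relative_entropy(gmm.predict_proba(X))
    print(f"k={k}  AIC={gmm.aic(X):8.1f}  BIC={gmm.bic(X):8.1f}  entropy={ent:.3f}")
```

With well-separated groups, AIC/BIC drop sharply up to k = 3 and then flatten, and entropy stays high, mirroring the elbow-plus-entropy reasoning applied in the Results. Note that scikit-learn does not provide the LMR/VLMR likelihood-ratio tests the authors also consulted; those are specific to Mplus and related software.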
Results
Descriptive statistics, measurement models and correlations
The descriptive statistics for the variables we submitted to LPA (i.e. the profile indicators) did not indicate any ceiling or floor effects, and the respective items only marginally deviated from an optimal normal distribution (see Supplementary Material S1). Table 1 exhibits the respective statistics and the correlation matrix.
Profile indicators were positively and significantly correlated, with high correlations among indicators representing the subscales of the same construct and low to moderate correlations among indicators of different constructs. The full item-level correlation matrix is presented in Supplementary Material S3.
Next, we inspected the measurement models of the scales (see Supplementary Material S1). The model describing TPACK self-efficacy contained a general TPACK factor and four residual covariances among items that addressed the same concepts or similar formulations, SB-χ2(31) = 43.8, p = .064, CFI = 0.988, RMSEA = 0.043, SRMR = 0.031. Neither a single-factor model without residual covariances nor a model distinguishing between TPCK, TPK and TCK as three correlated factors exhibited good fit to the data; the latter showed factor correlations between ρ = 0.95 and ρ = 0.98. Perceived institutional support was represented by a single factor and a residual covariance between two items. The respective model showed excellent fit to the data, SB-χ2(8) = 11.4, p = .179, CFI = 0.996, RMSEA = 0.044, SRMR = 0.015. Finally, the measurement model distinguishing between three factors of perceived online teaching presence (i.e. POPCLA, POPFED, POPCOG) exhibited a good fit to the data (SB-χ2[60] = 84.4, p = .021, CFI = 0.981, RMSEA = 0.043, SRMR = 0.034) and resulted in factor correlations between ρ = 0.68 and ρ = 0.84. Two residual covariances were part of this model (see Supplementary Material S1).
Latent profile analysis
Identifying the number of profiles
Table 2 shows the resultant log-likelihood values, the information criteria, the entropies and the likelihood-ratio tests comparing the model with k profiles to the model with k − 1 profiles.
Increasing the number of proles decreased the log-likelihood values and information criteria and
indicated a better t of the LPA models with more proles. At the same time, the likelihood-ratio tests
suggested that this decrease was no longer statistically signicant when adding one more prole to the
three-prole solution – however, the extant literature discusses these tests controversially, especially
because they only test global model t dierences and may not have sucient power to detect the
correct number of proles in situations with a highly separability of the proles (Tein et al., 2013). In this
sense, these tests should not serve as the only source for determining the proles. Examining the
information criteria, we also found an elbow in their decrease between three and four proles. Finally,
the entropy was the highest for the ve-prole model. We further examined the prole sizes and their
shapes and found that very small and hardly distinguishable proles occurred for more than four proles
(see Table 2). Although the three-prole model exhibited well-interpretable proles with consistently low,
medium or high values of the prole indicators, adding one more prole identied a group of teachers
with a pattern that could be clearly distinguished from these three standard proles. Considering the
high entropy (0.924), the preference over the three-prole model according to the information criteria,
the suciently large prole sizes and the interpretability of the proles, we accepted the LPA model with
four proles as the nal model.
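The enumeration procedure above (comparing information criteria, entropy and profile sizes across an increasing number of profiles) can be sketched in Python with scikit-learn's GaussianMixture as a stand-in for the Mplus LPA. The data, indicator structure, group means and seed below are simulated for illustration and are not the study's data.

```python
# Sketch of LPA-style profile enumeration with a Gaussian mixture model,
# using scikit-learn as a stand-in for the Mplus LPA reported in the text.
# All data below are simulated; groups and indicator values are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Three simulated groups on two standardised readiness indicators
X = np.vstack([
    rng.normal(-2.0, 0.5, size=(70, 2)),   # "low" readiness profile
    rng.normal(0.0, 0.5, size=(100, 2)),   # "medium" readiness profile
    rng.normal(2.0, 0.5, size=(60, 2)),    # "high" readiness profile
])

def entropy_criterion(gm, X):
    """Normalised entropy: 1 = perfectly separated profiles, 0 = none."""
    if gm.n_components == 1:
        return 1.0
    p = gm.predict_proba(X)
    ent = -np.sum(p * np.log(np.clip(p, 1e-12, None)))
    return 1.0 - ent / (X.shape[0] * np.log(gm.n_components))

# Fit candidate models with one to five profiles and compare the criteria
fits = {k: GaussianMixture(n_components=k, n_init=10, random_state=0).fit(X)
        for k in range(1, 6)}
for k, gm in fits.items():
    print(f"k={k}  AIC={gm.aic(X):9.1f}  BIC={gm.bic(X):9.1f}  "
          f"entropy={entropy_criterion(gm, X):.3f}")
```

As in Table 2, the decision would rest on where the information criteria stop improving meaningfully, whether entropy is high, and whether the smallest profile remains interpretable in size.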
8 S. K. HOWARD ET AL.
Table 1. Means, standard deviations and correlations with confidence intervals.
Variable M SD 1 2 3 4 5 6
1. gTPACK 0 0.80
2. gPIS 0 1.18 .36**
[.24,.47]
3. PISCO1 2.60 1.62 .34** .71**
[.22,.45] [.64,.77]
4. PISCO2 2.08 1.54 .37** .72** .68**
[.25,.48] [.65,.78] [.60,.74]
5. POPGEN 0 0.75 .66** .35** .32** .33**
[.58,.73] [.23,.46] [.19,.43] [.21,.45]
6. POPCLA 0 0.65 .62** .36** .34** .28** .74**
[.53,.69] [.24,.47] [.22,.45] [.15,.39] [.68,.80]
7. POPFED 0 0.68 .65** .36** .33** .33** .91** .82**
[.56,.72] [.24,.47] [.21,.44] [.21,.44] [.88,.93] [.78,.86]
TECHNOLOGY, PEDAGOGY AND EDUCATION 9
Table 2. Information criteria, entropies and results of the likelihood-ratio tests for the LPA models with one to six profiles.
Model LL Npar SCF AIC BIC aBIC Entropy p(VLMR-LRT) p(LMR-LRT) Description
One profile –2139.006 14 1.0201 4306.012 4353.649 4309.282 1.000 - - Baseline model
Two profiles –1888.273 29 1.1282 3834.547 3933.224 3841.321 0.857 0.3061 0.309 Two profiles with high vs. low scores
Three profiles –1731.877 44 0.8431 3551.753 3701.471 3562.031 0.921 0.0461 0.0474 Three profiles with high, medium and low scores
Four profiles –1642.848 59 0.8146 3403.695 3604.453 3417.477 0.924 0.2398 0.2413 Suggested solution
Five profiles –1549.075 74 0.9864 3246.150 3497.948 3263.436 0.935 1.0000 1.0000 Smallest group: n = 11, two very similar profiles
Six profiles –1482.811 89 0.8325 3143.622 3446.460 3164.412 0.926 1.0000 1.0000 Smallest group: n = 10, three very similar profiles, convergence issues
Note. LL = Log-likelihood value, Npar = Number of parameters, SCF = Scale correction factor, AIC = Akaike’s Information Criterion, BIC = Bayesian Information Criterion, aBIC = Sample size-adjusted
BIC, p(VLMR-LRT) = p-value of the Vuong-Lo-Mendell-Rubin (VLMR) likelihood-ratio test, p(LMR-LRT) = p-value of the Lo-Mendell-Rubin (LMR) likelihood-ratio test. The suggested number of profiles
is indicated in the Description column.
To further back this decision, we examined whether the four profiles differed significantly in the
profile indicators using a multivariate analysis of variance. The overall, multivariate test of mean
differences between the four profiles was statistically significant (Pillai’s trace V = 0.56, F[7,
212] = 38.7, p < .001) and explained about 57% of the variance in the profile indicators. Further
post-hoc tests revealed significant profile differences in all profile indicators (Fs > 12.99,
ps < .001), explaining between 5.6% (PISCO1) and 50.0% (POPGEN) of variation. These findings support
the distinction between the four profiles.
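The post-hoc step, a univariate test per profile indicator together with the share of variance explained, can be illustrated as follows. The profile labels and indicator scores below are simulated, and eta-squared is computed directly from the sums of squares; this is a sketch of the kind of test reported, not the authors' analysis.

```python
# Sketch of the post-hoc check that profiles differ on a given indicator:
# a one-way ANOVA across the four groups plus eta-squared (variance explained).
# Profile labels and indicator scores are simulated for illustration only.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
labels = rng.integers(0, 4, size=222)            # hypothetical profile membership
indicator = rng.normal(0, 1, size=222) + labels  # indicator shifted per profile

groups = [indicator[labels == g] for g in range(4)]
F, p = f_oneway(*groups)

# Eta-squared: between-group sum of squares over the total sum of squares
grand_mean = indicator.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((indicator - grand_mean) ** 2).sum()
eta_sq = ss_between / ss_total
print(f"F = {F:.2f}, p = {p:.2g}, eta^2 = {eta_sq:.3f}")
```

In the study, this per-indicator eta-squared is what ranges from 5.6% (PISCO1) to 50.0% (POPGEN).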
Considering the heterogeneous nature of the sample, we also tested for possible country effects.
To do this, we extended the LPA to a multilevel LPA with countries as cluster units and tested for
significant between-country variation in the profile membership probabilities. The comparison
between the LPA model with predictors at the individual level and its multilevel extension did not
reveal any significant variation between countries (likelihood-ratio test: χ2[6] = 3.9, p = .69).
Hence, we did not have evidence for country effects.
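The country-effects check is a likelihood-ratio test between two nested models. Given the chi-square statistic and degrees of freedom reported in the text, the p-value follows directly from the chi-square survival function, as this short sketch shows.

```python
# Likelihood-ratio test p-value from the statistic reported in the text:
# chi-square[6] = 3.9 for the single-level vs. multilevel LPA comparison.
from scipy.stats import chi2

lr_stat, df = 3.9, 6                 # values reported in the text
p_value = chi2.sf(lr_stat, df)       # survival function = 1 - CDF
print(f"chi2({df}) = {lr_stat}, p = {p_value:.2f}")
```

The resulting p-value of about .69 is far from conventional significance, consistent with the conclusion of no country effects.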
Characterising the profiles
The characteristics of teachers in the four profiles are presented in Table 3, and the four profiles
themselves in Figure 1. Profile 3 has the largest membership, with 46.6% of the sample (n = 103);
the smallest is Profile 4, with 13.5% (n = 30).
Figure 1 presents the factor means of each profile for teachers’ perceptions of their own readiness
(gTPACK, POPCLA, POPFED & POPCOG) and the readiness of their institution (gPIS, PISCO1 & PISCO2).
Strong correlations (see Table 1) between gTPACK and gPIS are reflected in Profiles 2–4, and strong
correlations among subfactors of both gTPACK and gPIS are observed in all four profiles. On teachers’
perceptions of their individual readiness to teach online, Profiles 3 and 4 both reflect positive
perceptions, meaning that 59.9% of participants reported positive perceptions of their readiness to
teach online. This is particularly interesting, given that the majority of the sample reported not
having previously taught online. It can also be noted that teachers in Profile 4 had more days to
prepare to teach online (see Table 3).
Profiles 1 and 2 both reflect negative perceptions; variance on all factors was lower in Profiles 3
and 4 than in Profiles 1 and 2 (see Supplementary Material S4).
On teachers’ perceptions of their institution’s readiness to teach online, the pattern changes
slightly. Profiles 1, 3 and 4, comprising 83.3% of the sample, reflect positive perceptions of
institutional readiness. Profile 2 is the only group to report negative perceptions of institutional
readiness, and these are strongly negative across all three institutional factors. Teachers in
Profiles 1 and 2 had similar perceptions of their individual readiness, but analysis of background
variables relating to institutional experiences suggests participants were more likely to be members
of Profile 1 (B = –0.40, SE = 0.19, p = .039, OR = 0.67) if the transition to online teaching in
response to Covid-19 was not made mandatory in their institution or if they did not have to
transition all of their teaching (see Table 3).
In summary, four profiles were identified. These can be classified as representing ‘High’ perceptions
of readiness (Profile 4), ‘Medium’ perceptions (Profile 3), ‘Low’ perceptions (Profile 1) and ‘Mixed’
perceptions (Profile 2). The four-profile solution thus provides a theoretically meaningful
representation of teachers’ perceptions. It also highlights the importance of institutional
readiness, particularly for teachers with lower perceptions of their own readiness to teach online.
Implications and scenarios are considered in the next section.
Discussion, implications and future research
The main aim of the current study was to identify teacher profiles in secondary education based on
their perceptions of readiness to transition to online teaching as a result of Covid-19, in regard to
their own readiness and that of their institution. Addressing the research question, the main
contribution of this article is the identification of four profiles based on combined individual and
Table 3. Teacher characteristics within the four profiles.
Teacher characteristics Profile 1 (n = 37) Profile 2 (n = 52) Profile 3 (n = 103) Profile 4 (n = 30)
Background variables
Age M (SD) in years 41.9 (9.3) 44.9 (11.3) 43.5 (11.2) 42.6 (9.3)
Gender
Women 21.6% 36.5% 33.0% 27.6%
Men 78.4% 63.5% 67.0% 72.4%
Teaching experience M (SD) in years 14.9 (10.2) 15.5 (11.0) 15.3 (10.3) 15.2 (8.1)
Online teaching and learning
Prior online teaching experience
Yes 10.8% 17.3% 27.2% 53.3%
No 89.2% 82.7% 72.8% 46.7%
Days of preparation for online teaching M (SD) 3.6 (4.4) 2.4 (3.7) 3.9 (5.1) 9.5 (27.4)
Shift to online teaching due to Covid-19
No, none of the teaching was shifted. 2.7% 9.6% 2.9% 3.3%
Some of the teaching was shifted. 27.0% 19.2% 32.0% 30.0%
Yes, all of the teaching was shifted. 70.3% 71.2% 65.1% 6.7%
Days into online teaching after the shift M (SD) 1.4 (0.7) 1.5 (1.1) 1.8 (1.3) 1.6 (1.5)
Institutional decision of the shift
It was not mandatory. 13.9% 6.2% 6.0% 6.9%
It was expected, but not mandatory. 27.8% 27.1% 24.0% 17.2%
It was mandatory. 58.3% 66.7% 70.0% 75.9%
Note. gTPACK = TPACK self-efficacy, POPCLA = Perceived online teaching presence: Clarity of instruction, POPFED = Perceived online teaching presence: Feedback to and assessment of students,
POPCOG = Perceived online teaching presence: Cognitive activation, gPIS = Perceived institutional support in general, PISCO1 = Perceived institutional support: Technical support during Covid-19,
PISCO2 = Perceived institutional support: Pedagogical support during Covid-19.
institutional factors. In the following discussion, we explore the implications of the individual and
institutional perceptions of each profile in regard to the Covid-19 transition to online teaching and
how these findings can inform future change initiatives.
First, we consider the implications of Profiles 3 and 4, Medium and High perceptions of readiness for
online teaching, respectively. While both of these groups are positive, the nature of their
perceptions is quite different. Primarily, the Medium group is only weakly positive, and
significantly lower than the High group. As a result, the level of support needed to transition to
online teaching would vary considerably between these two sets of teachers. In particular, the Medium
group would likely benefit from direct support on how to facilitate learning (gTPACK), create clear
instruction (POPCLA), provide feedback (POPFED) and keep students engaged (POPCOG). Given that the
Medium group generally expressed positive perceptions, these teachers would likely feel positively
about online teaching and learning and be receptive to targeted support to develop their online
teaching (Almpanis, 2015; Bao, 2020). Given the very positive perceptions and greater experience with
online teaching in the High group, these teachers could be called on within a school to support their
peers in designing and implementing online learning.
In terms of institutional support, given the strong correlation between perceptions of Institutional
Support (gPIS) and support for the Covid-19 transition (PISCO1 & PISCO2), it is unsurprising that
teachers in the Medium group felt their institutions had less overall support for online learning. The
transition to online learning was also less likely to be mandatory for teachers in the Medium group
(see Table 3). However, it is necessary to consider possible effects of the relationship between their
individual perceptions and what they perceive as support at their institution. Teachers in the
Medium profile reported that their institutions were only weakly supporting them technologically and
pedagogically to transition to online learning, and this is likely to affect how they perceive their
own readiness (e.g., Chandra & Mills, 2015; Ertmer & Ottenbreit-Leftwich, 2010). A perceived lack of
support, or a weakly positive perception of support in relation to variables such as professional
development, clear objectives and an institutional strategy, will affect how teachers approach online
learning, what they feel are the objectives and how they design interactions. For teachers with only
a weak positive perception of their own readiness, it would actually be more important for
institutions to have a strong vision and clear goals for online teaching and learning (Hargreaves &
Goodson, 2006; Howard, 2019).
Figure 1. Profiles describing secondary school teachers’ readiness for online teaching.
A lack of consistency across an institution will also affect the experiences of
their students, which in secondary education may then be highly variable across their different
subjects (e.g., McGee et al., 2017). Teachers in the High profile reported very positive perceptions
of their institutional readiness; however, while not statistically significantly different in the
sample, technological support was rated higher than pedagogical support. For teachers with more
experience in online teaching, ongoing pedagogical support would be important to continue
professional learning and experimenting with new online learning approaches (e.g., Philipsen et al.,
2019). This should be a consideration for high-performing teachers in change initiatives. However,
during the rapid shift to online teaching during the Covid-19 pandemic, it would be unlikely that
this group would be engaging in training. They would instead be able to provide necessary
‘just-in-time’ institutional support for other teachers, and support through communities of practice
beyond their schools, to sustain a consistent vision of online learning and teaching (e.g., Howard
et al., 2018).
Teachers in the Low and Mixed profiles, Profiles 1 and 2, respectively, reported negative perceptions
of their individual readiness to teach online. The difficulty here, in the rapid transition to online
learning during the Covid-19 pandemic, is that there was little time to provide teachers with the
necessary professional development. In this case, how institutions approach the rapid transition to
support teachers is even more important than usual. With this in mind, we consider the split between
the Low and Mixed groups on institutional readiness and the implications for institutional responses
during the rapid transition.
The Low group also reported very negative perceptions of their institution’s readiness, while the
Mixed group was positive. First, it is quite possible that the Low group’s strongly negative
perceptions of their institutions affected their perceptions of readiness, their engagement in
teaching online and their designing of online learning (e.g., Hargreaves & Goodson, 2006). Further,
the finding that teachers in the Low group were less likely to have a mandatory online transition in
their institution, combined with their negative perceptions, suggests they may have experienced a
more chaotic approach to the transition to online teaching. In contrast, the Mixed group was actually
more positive about their institutional readiness than the Medium group. Positive perceptions of
their institution would have a positive effect on their individual readiness (Ertmer &
Ottenbreit-Leftwich, 2010). While the Mixed teachers might not have felt confident, they would be
more likely to be motivated to teach online if their institutions created a clear vision and provided
the necessary support (Howard et al., 2018).
Institutional responses at times of rapid change in teaching and learning could possibly compensate
for a lack of time and/or training for teachers to appropriately prepare for changes. For teachers
with a Low or Mixed profile, a strong institutional response to change could significantly improve
their experience and the experiences of their students. This is particularly important for change
initiatives around online learning, where there is not a clear understanding of what online learning
comprises (Singh & Thurman, 2019), or of disciplinary expectations in secondary education. Future
research will explore some of the implications of interactions between individual and institutional
readiness. While participating teachers represented 20 different countries, there was no country
effect in profile membership. However, given the small sample size (N = 222), the results should be
considered exploratory rather than generalisable. Therefore, teachers’ qualitative responses will be
analysed to better understand their experience of transitioning to online teaching, in relation to
the specific factors (e.g., TPACK, online presence and institutional readiness), to gain a better
understanding of the profiles, how educational context affects teachers’ experiences and what type
of support may be appropriate for each group.
A limitation of the study is its distribution through social media and other online channels. This
potentially excludes teachers who are not active online and who may have had a different experience
of online learning. However, a reasonable way to access international teachers who are/were
transitioning to online teaching, but not participating in social media, does not exist. Moreover,
while distribution through social media afforded access to teachers across 20 countries, there were
only 1–2 teachers from many of those contexts. This limits the ability to test for differences
between countries. A second issue is the absence of an outcome variable to understand the
implications of profile membership for the design of online learning and the results of the
transition. A significant proportion of teachers have agreed to a follow-up study, which will aim to
explore the results of the rapid transition to online teaching across a range of countries and
contexts to better understand the online learning and teaching outcomes for each profile.
Conclusions
The rapid transition from face-to-face to online teaching in secondary education, in response to the
Covid-19 pandemic, was unprecedented. The short timeframe left teachers limited time to upskill and
prepare. The institution therefore had an important responsibility to provide support and a common
view of expectations for online learning and teaching, to compensate for individual perceptions of
readiness. The four profiles show clearly that how this happened varied considerably across the 20
countries. While the immediate transition from face-to-face teaching has passed, and many students
and teachers across the world have returned to the classroom, teaching practice in schools will
forever be changed. Change is likely to be ongoing, given the need to prepare for possible similar
events in the future, but also to continue to integrate some of the new and useful practices teachers
have learned and adopted in this time. Findings from this study provide important insights into four
profiles of readiness that can guide institutions and teachers in supporting the ongoing development
of online pedagogies, teaching and learning. Importantly, the individual and the institution need to
be considered together, to better understand teachers’ experiences, their need for support and the
need for a clear understanding of online learning and teaching.
Disclosure statement
There is no financial interest or benefit that has arisen from the direct applications of this research.
Notes on contributors
Sarah K. Howard is an Associate Professor at the University of Wollongong. Her research focuses on the use of new
technologies and data science to explore classroom practice and teacher change, specifically related to digital
technology adoption and integration in learning.
Jo Tondeur is an Assistant Professor at the Vrije Universiteit Brussel (MILO). His research interests are in the field of
instructional design and educational innovation. His work focuses on ICT integration in teaching and learning
processes.
Fazilat Siddiq is an Associate Professor at the University of South-Eastern Norway. Her primary research interest is in
measurement of students’ digital competence, with an emphasis on communication and collaboration. She is also
interested in teachers’ Technological Pedagogical Content Knowledge and use of ICT in their classroom practices.
Ronny Scherer is a Professor at the University of Oslo. His research focuses on substantive-methodological synergisms in
the broad areas of science education, computer-based assessment and problem solving.
References
Almpanis, T. (2015). Staff development and institutional support for technology enhanced learning in UK universities.
The Electronic Journal of E-Learning, 13(5), 366–375.
Arbaugh, J. B., & Hwang, A. (2006). Does ‘teaching presence’ exist in online MBA courses? The Internet and Higher
Education, 9(1), 9–21. https://doi.org/10.1016/j.iheduc.2005.12.001
Archambault, L., & Crippen, K. (2009). Examining TPACK among K-12 online distance educators in the United States.
Contemporary Issues in Technology and Teacher Education, 9(1), 71–88. https://www.learntechlib.org/primary/p/
29332/
Asendorpf, J. B. (2015). Person-centered approaches to personality. In M. Mikulincer, P. R. Shaver, M. L. Cooper, &
R. J. Larsen (Eds.), APA handbooks in psychology®. APA handbook of personality and social psychology, Vol. 4. Personality
processes and individual differences (pp. 403–424). American Psychological Association. https://doi.org/10.1037/
14343-018
Bao, W. (2020). COVID-19 and online teaching in higher education: A case study of Peking University. Human Behavior
and Emerging Technologies, 2(2), 113–115. https://doi.org/10.1002/hbe2.191
Benson, S. N. K., & Ward, C. L. (2013). Teaching with technology: Using TPACK to understand teaching expertise in online
higher education. Journal of Educational Computing Research, 48(2), 153–172. https://doi.org/10.2190/EC.48.2.c
Bergman, L. R., & Wångby, M. (2014). The person-oriented approach: A short theoretical and practical guide. Estonian
Journal of Education, 2(1), 29–49. https://doi.org/10.12697/eha.2014.2.1.02b
Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on
teaching effectiveness through a TPACK lens. The Internet and Higher Education, 38, 28–35. https://doi.org/10.1016/j.iheduc.2018.04.004
Brown, T. A. (2015). Confirmatory factor analysis for applied research (2nd ed.). The Guilford Press.
Chandra, V., & Mills, K. A. (2015). Transforming the core business of teaching and learning in classrooms through ICT.
Technology, Pedagogy and Education, 24(3), 285–301. https://doi.org/10.1080/1475939X.2014.975737
Chigeza, P., & Halbert, K. (2014). Navigating e-learning and blended learning for pre-service teachers: Redesigning for
engagement, access and efficiency. Australian Journal of Teacher Education, 39(11), 133–146. https://doi.org/10.14221/ajte.204v39n11.8
Cucinotta, D., & Vanelli, M. (2020). WHO declares COVID-19 a pandemic. Acta bio-medica: Atenei Parmensis, 91(1),
157–160. https://doi.org/10.23750/abm.v91i1.9397
Enders, C. K. (2010). Applied missing data analysis. Guilford Press.
Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How knowledge, confidence, beliefs, and
culture intersect. Journal of Research on Technology in Education, 42(3), 255–284.
Ferdig, R. E., Baumgartner, E., Hartshorne, R., Kaplan-Rakowski, R., & Mouza, C. (2020). Teaching, technology, and teacher
education during the COVID-19 pandemic: Stories from the field. Association for the Advancement of Computing in
Education. https://www.learntechlib.org/p/216903/
Gurley, L. E. (2018). Educators’ preparation to teach, perceived teaching presence, and perceived teaching presence
behaviors in blended and online learning environments. Online Learning, 22(2), 179–220. https://doi.org/10.24059/
olj.v22i2.1255
Hargreaves, A., & Goodson, I. (2006). Educational change over time? The sustainability and nonsustainability of three
decades of secondary school change and continuity. Educational Administration Quarterly, 42(1), 3–41. https://doi.
org/10.1177/0013161X05277975
Hirsch, C. (2020, March 31). Europe’s coronavirus lockdown measures compared. Politico. https://www.politico.eu/article/
europes-coronavirus-lockdown-measures-compared/
Howard, S. (2019). Policy rationales and integration rationales, implications for subject area teaching. In A. Tatnall (Ed.),
Encyclopedia of education and information technologies (pp. 1–9). Springer International Publishing. https://doi.org/
10.1007/978-3-319-60013-0_91-1
Howard, S. K., Curwood, J. S., & McGraw, K. (2018). Leaders fostering teachers’ learning environments for technology
integration. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), Second handbook of information technology in
primary and secondary education (pp. 1–19). Springer International Publishing. https://doi.org/10.1007/978-3-319-
53803-7_35-1
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus
new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/
10705519909540118
Hung, M. L. (2016). Teacher readiness for online learning: Scale development and teacher perceptions. Computers &
Education, 94, 120–133. https://doi.org/10.1016/j.compedu.2015.11.012
Jung, T., & Wickrama, K. A. S. (2008). An introduction to latent class growth analysis and growth mixture modeling. Social
and Personality Psychology Compass, 2(1), 302–317. https://doi.org/10.1111/j.1751-9004.2007.00054.x
Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The technological pedagogical content
knowledge framework. In M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational
communications and technology (pp. 101–111). Springer.
Law, K. M., Geng, S., & Li, T. (2019). Student enrollment, motivation and learning performance in a blended learning
environment: The mediating eects of social, teaching, and cognitive presence. Computers & Education, 136, 1–12.
https://doi.org/10.1016/j.compedu.2019.02.021
Lubke, G. H., & Muthén, B. (2005). Investigating population heterogeneity with factor mixture models. Psychological
Methods, 10(1), 21–39. https://doi.org/10.1037/1082-989X.10.1.21
Magnusson, D. (1988). Individual development from an interactional perspective: A longitudinal study. Lawrence Erlbaum
Associates.
Marsh, H. W., Lüdtke, O., Trautwein, U., & Morin, A. J. S. (2009). Classical latent profile analysis of academic self-concept
dimensions: Synergy of person- and variable-centered approaches to theoretical models of self-concept. Structural
Equation Modeling: A Multidisciplinary Journal, 16(2), 191–225. https://doi.org/10.1080/10705510902751010
Martin, F., Budhrani, K., & Wang, C. (2019). Examining faculty perception of their readiness to teach online. Online
Learning, 23(3), 97–119. https://doi.org/10.24059/olj.v23i3.1555
Masyn, K. E. (2013). Latent class analysis and finite mixture modeling. In T. D. Little (Ed.), The Oxford handbook of
quantitative methods: Statistical analysis (Vol. 2, pp. 551–611). Oxford University Press.
McAllister, L., & Graham, C. (2016). An analysis of the curriculum requirements for K-12 online teaching endorsements in
the U.S. Journal of Online Learning Research, 2(3), 247–282. Association for the Advancement of Computing in
Education. https://www.learntechlib.org/primary/p/173220/
McGee, P., Windes, D., & Torres, M. (2017). Experienced online instructors: Beliefs and preferred supports regarding
online teaching. Journal of Computing in Higher Education, 29, 331–352. https://doi.org/10.1007/s12528-017-9140-6
Muthén, L. K., & Muthén, B. O. (1998/2017). Mplus user’s guide (8th ed.). Muthén & Muthén.
Palloff, R. M., & Pratt, K. (2013). Lessons from the virtual classroom: The realities of online teaching. John Wiley & Sons.
Phan, T. T. N., & Dang, L. T. T. (2017). Teacher readiness for online teaching: A critical review. International Journal on
Open and Distance e-Learning (IJODeL), 3(1), 1–16.
Philipsen, B. (2018). Teacher professional development for online and blended learning in adult education and training
[Doctoral dissertation]. Vrije Universiteit Brussel, Brussels, Belgium.
Philipsen, B., Tondeur, J., Pareja Roblin, N., Vanslambrouck, S., & Zhu, C. (2019). Improving teacher professional
development for online and blended learning: A systematic meta-aggregative review. Educational Technology
Research and Development, 67(5), 1145–1174. https://doi.org/10.1007/s11423-019-09645-8
Pulham, E., & Graham, C. R. (2018). Comparing K-12 online and blended teaching competencies: A literature review.
Distance Education, 39(3), 411–432. https://doi.org/10.1080/01587919.2018.1476840
Revelle, W. (2019). Psych: Procedures for psychological, psychometric, and personality research. R package version 1.9.12.
Northwestern University. https://CRAN.R-project.org/package=psych
Rogers, P. L., Berg, G. A., Boettcher, J. V., Howard, C., Justice, L., & Schenk, K. D. (Eds.). (2009). Encyclopedia of distance
learning. IGI Global.
Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36.
https://doi.org/10.18637/jss.v048.i02
Satorra, A., & Bentler, P. M. (2010). Ensuring positiveness of the scaled difference chi-square test statistic. Psychometrika,
75(2), 243–248. https://doi.org/10.1007/s11336-009-9135-y
Scherer, R., Siddiq, F., & Tondeur, J. (2018). The technology acceptance model (TAM): A meta-analytic structural equation
modeling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128,
13–35. https://doi.org/10.1016/j.compedu.2018.09.009
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical
content knowledge (TPACK). Journal of Research on Technology in Education, 42(2), 123–149. https://doi.org/10.1080/
15391523.2009.10782544
Singh, V., & Thurman, A. (2019). How many ways can we define online learning? A systematic literature review of
definitions of online learning (1988–2018). American Journal of Distance Education, 33(4), 289–306. https://doi.org/10.1080/08923647.2019.1663082
Tein, J.-Y., Coxe, S., & Cham, H. (2013). Statistical power to detect the correct number of classes in latent profile analysis.
Structural Equation Modeling: A Multidisciplinary Journal, 20(4), 640–657. https://doi.org/10.1080/10705511.2013.
824781
Tondeur, J., Scherer, R., Siddiq, F., & Baran, E. (2017). A comprehensive investigation of TPACK within pre-service
teachers’ ICT profiles: Mind the gap! Australasian Journal of Educational Technology, 33(3), 46–60. https://doi.org/
10.14742/ajet.3504
Voogt, J., Fisser, P., Pareja Roblin, N., Tondeur, J., & van Braak, J. (2013). Technological pedagogical content knowledge –
A review of the literature. Journal of Computer Assisted Learning, 29(2), 109–121. https://doi.org/10.1111/j.1365-2729.
2012.00487.x
Wilson, G., & Stacey, E. (2004). Online interaction impacts on learning: Teaching the teachers to teach online.
Australasian Journal of Educational Technology, 20(1), 33–48. https://doi.org/10.14742/ajet.1366
Appendix A
TPACK self-efficacy scale
Technological Content Knowledge
(o) My ability to use technological representations (i.e. multimedia, visual demonstrations, etc.) to demonstrate
specific concepts in my content area.
(t) My ability to implement district curriculum in an online environment.
(v) My ability to use various courseware programs to deliver instruction (e.g., Blackboard, Centra).
Technological Pedagogical Knowledge
(h) My ability to create an online environment which allows students to build new knowledge and skills.
(l) My ability to implement different methods of teaching online.
(n) My ability to moderate online interactivity among students.
(p) My ability to encourage online interactivity among students.
Technological Pedagogical Content Knowledge
(e) My ability to use online student assessment to modify instruction.
(k) My ability to use technology to predict students’ skill/understanding of a particular topic.
(w) My ability to use technology to create effective representations of content that depart from textbook knowledge.
(x) My ability to meet the overall demands of online teaching.
Adapted from Archambault and Crippen (2009).
Appendix B
Online teaching presence scale
1. Overall, I clearly communicate important course topics.
2. Overall, I clearly communicate course goals.
3. Overall, I provide clear instructions on how to participate in course learning activities.
4. Overall, I clearly communicate important due dates/time frames for learning activities.
5. Overall, I am helpful in identifying areas of agreement and disagreement on course topics that help students to learn.
6. Overall, I am helpful in guiding the class towards understanding course topics in a way that helps students clarify their thinking.
7. Overall, I help to keep course participants engaged and participating in productive dialogue.
8. Overall, I help keep the course participants on task in a way that helps students learn.
9. Overall, I encourage course participants to explore new concepts in courses.
10. Overall, my actions reinforce the development of a sense of community among course participants.
11. Overall, I help to focus discussion on relevant issues in a way that helps students to learn.
12. Overall, I provide feedback that helps students understand their strengths and weaknesses relative to the course’s goals and objectives.
13. Overall, I provide feedback in a timely fashion.
Adapted from Gurley (2018).
Appendix C
Institutional support for online learning
In our institution . . .
1. there is a clear vision towards OL.
2. there is a supportive environment as regards professional development for OL.
3. there are clear objectives as regards OL.
4. the current ICT possibilities and infrastructure as regards OL are taken into account.
5. attention is paid to the teacher change processes inherent in changing to online or blended learning.
6. there is a professional development strategy towards OL.
Adapted from Philipsen (2018).
Response scale: 0 = Completely disagree, 1 = Disagree, 2 = Disagree more than agree, 3 = Agree more than disagree, 4 = Agree, 5 = Completely agree.