Informed design of educational technology for teaching and
learning? Towards an evidence-informed model of good practice
Linda Price* and Adrian Kirkwood
Institute of Educational Technology, The Open University, Walton Hall, Milton Keynes, UK
(Received 30 July 2013; final version received 20 June 2014)
The aim of this paper is to model evidence-informed design based on a selective
critical analysis of research articles. The authors draw upon findings from an
investigation into practitioners’ use of educational technologies to synthesise and
model what informs their designs. They found that practitioners’ designs were
often driven by implicit assumptions about learning. These shaped both the
design of interventions and the methods used to evaluate them and to
interpret the findings. The authors argue that interventions need to be grounded
in better and explicit conceptualisations of what constitutes learning in order to
have well-informed designs that focus on improving the quality of student
learning.
Keywords: learning design; educational technology; university teaching; student
learning; technology-enhanced learning; evidence-informed practice
Introduction
The aim of this article is to model evidence-informed design of educational technol-
ogy (ET) used for teaching and learning in higher education, based on a selective
critical analysis of research findings. It is not uncommon in ET for greater attention
to be given to the technology and its implementation than its impact upon student
learning (Kirkwood & Price, 2013b). However, a fundamental question remains as
to whether the increased time spent on implementing ET benefits student learning,
or whether it wastes students’ time and the time that their teachers have invested. Slavin
(2008) stated that:
Throughout the history of education, the adoption of instructional programs and prac-
tices has been driven more by ideology, faddism, politics, and marketing than by evi-
dence. (p. 5)
Policy makers tend to eschew evidence in their development of policy relating to ET,
while practitioners, enmeshed in a bustling teaching environment, tend to rely on
tacit knowledge (Anderson & Biddle, 1991; Fitz-Gibbon, 1999). Fitz-Gibbon (1997,
pp. 35–36) further argued that evidence-based approaches are necessary in order to:
 challenge the implementation of untested practices;
 address problems and prevent damaging practices;
 generate improvements that lead to more effective learning.
*Corresponding author. Email: linda.price@open.ac.uk
© 2014 Association for Information Technology in Teacher Education
Technology, Pedagogy and Education, 2014
Vol. 23, No. 3, 325–347, http://dx.doi.org/10.1080/1475939X.2014.942749
Cohen, Manion, and Morrison (2011, p. 336) questioned whether
[i]t is bordering on the unethical to implement untried and untested recommendations
in educational practice, just as it is unethical to use untested products and procedures
on hospital patients without their consent.
Research evidence to substantiate the value of technology in enhancing learning has
not been adequately established (Kirkwood & Price, 2013a). Equally, limited atten-
tion has been given to what is actually informing ET designs. Fundamental problems
arise from insufficient understanding by practitioners and researchers of variations in
the nature of learning and teaching (Kember & Kwan, 2000; Samuelowicz & Bain,
1992, 2001; Trigwell & Prosser, 1996). This is often accompanied by insufficient
reference to explicit theoretical models or research evidence to inform designs (de
Laat, Lally, Simons, & Wenger, 2006; Kirkwood & Price, 2013b). For example,
while Mishra and Koehler (2008) advanced a design framework that emphasises the
importance of three components (content, pedagogy and technology) for effective
teaching with ET, their depiction of ‘pedagogy’ focuses on techniques rather than on
recognising the underlying strategic influence of differing conceptions of teaching
and learning.
Some of the approaches used to investigate the impact of interventions reveal
that ‘teaching’ and ‘learning’ are taken for granted and that technology is viewed as
the agent of improvement in outcomes (Kirkwood & Price, 2014). Hence there is a
need for both practitioners designing learning programmes and researchers investi-
gating ET interventions to be informed about the educational implications of using
technologies for student learning. Unless interventions are underpinned by evidence
and interpreted through a theoretical lens, our future ET learning designs will likely
be underpinned by opinions rather than evidence.
The use of evidence is important for constructing a firm basis for informing
designs with technology that are built upon solid and explicit theoretical assump-
tions (Price & Kirkwood, 2014). This informs wise ‘investments’ by teachers who
design curricula, institutions that support the curricula, and students whom we
expect to learn through our curricula. It also avoids unnecessary duplication of effort
and expense. ‘The most important benefits, ultimately, are the learning outcomes,
the improvements in understanding and skills implicit in the learning objectives’
(Laurillard, 2006, p. 30).
In this paper we use a selective critical analysis of research findings to model
evidence-based practices. We uncover assumptions made by researchers and practi-
tioners about learning and teaching. We examine these assumptions to model how
the use of evidence, or lack of it, impacts upon the learning design and the subse-
quent evaluation of the success of educational technology interventions.
What informs pedagogical designs using technology?
Assumptions about learning and teaching influence how we go about designing
resources and activities that use technology for our students’ learning (Kirkwood &
Price, 2013a). Although models and theories may not be expressed explicitly, they
nonetheless underpin all design activities (Kirkwood & Price, 2013a). Research
shows considerable variation in conceptions of learning (Price, 2014; Richardson,
2000). Historically, student learning was most often seen as a quantitative change,
an increase in knowledge, encompassing the absorption of facts and procedures (see
Bransford, Brown, & Cocking, 2000). Later educational research recognised that
when learners were engaged conceptually rather than through rote learning, they
could generalise and apply their learning better to a greater variety of circumstances
(Baxter Magolda, 1992; Belenky, Clinchy, Goldberger, & Tarule, 1986; Hounsell,
1987; Martin & Ramsden, 1987; Marton, Dall’Alba, & Beaty, 1993; Perry, 1970;
Säljö, 1979; Van Rossum & Schenk, 1984; Vermunt, 1996; Vermunt & Rijswijk,
1988).
Teachers also conceive of teaching in a variety of different ways (Kember &
Kwan, 2000; Prosser, Trigwell, & Taylor, 1994; Samuelowicz & Bain, 1992,
2001). Trigwell and Prosser (1996) found that teachers’ approaches to teaching –
what they do in practice – correspond to their conceptions of teaching, which in
turn relate to their conceptions of how their students learn. Consequently, teachers
with a conception that focuses on ‘the transmission of knowledge’ usually adopt a
teaching-centred approach and conceive of student learning as an increase in
knowledge (Säljö, 1979). In contrast, those teachers who regard teaching as ‘pro-
moting conceptual development in learners’ are likely to adopt a learning-centred
approach and conceive of learning as a qualitative improvement in understanding.
Thus, how teachers conceive of teaching informs how they approach their teaching
(Price, 2014). It also shapes how they design teaching and learning resources and
activities that use technologies, and how they subsequently evaluate student learn-
ing in interventions (Kirkwood & Price, 2014). Individual teachers have consider-
able influence upon the design both of interventions and their evaluation,
particularly as practitioners often conduct research investigations into their own
innovations (Hammersley, 2007).
Often, what influences the design of ET innovations is not theoretical under-
standings and evidence about learning improvement drawn from the literature. More
prevalent is technological determinism and an experimentalist approach that reflects
opinion-based practice (Boyle, 2004) as opposed to evidence-informed practice
(Kirkwood & Price, 2013a, b). This raises several questions about the role of evi-
dence in informing designs, as well as about the assumptions concerning learning
and teaching that may underpin not only ET designs but also how any subsequent
evaluation might be interpreted. Principally:
 For ET interventions, what assumptions are made about the nature of teaching and learning?
 Are those assumptions derived from explicit or implicit theoretical models or understandings?
 How do implicit or explicit theoretical models inform the design of ET interventions?
 How do implicit or explicit theoretical models inform the evaluation approach, the research methods employed and the interpretation of findings?
 Do research/evaluation results inform pedagogical practice or ET designs?
We now critically analyse our selection of the literature to uncover implicit
assumptions about ET and evidence, and how these influence ET designs and inter-
pretations about their success.
Methods
Theoretical approach
Assumptions about the design of learning and teaching with technology are fre-
quently not made explicit (Price & Kirkwood, 2014; Thorpe, 2008). Scrutinising
underpinning assumptions therefore does not have a strong tradition upon which to
draw. Examining the ‘effects’ of interventions is complicated by the differing
ideological positions of researchers in terms of what they consider as evidence, and
how that is characterised (Price & Kirkwood, 2014). For example, a well-established method
of examining the collective impact of a body of interventions is through a meta-
analysis approach (see for example Means, Toyama, Murphy, Bakia, & Jones, 2010;
Tamim, Bernard, Borokhovski, Abrami, & Schmid, 2011). However, this approach
only deems a certain experimental method (with strict inclusion/exclusion criteria)
as acceptably rigorous and valid.
Most interventions that take place within higher education institutions are rela-
tively small-scale and it is unlikely that any evaluation or research concerning their
effectiveness could be conducted with the rigour that would produce conclusive evi-
dence. Cumulatively, however, evidence gathered from a number of similar interven-
tions can provide a useful indication of benefits that might be achieved. As Slavin
(2003, p. 15) has pointed out:
Rather than evaluate one large, definitive study, researchers must usually look at many
small studies … if these studies tend to find consistent effects, the entire set of studies
may produce a meaningful conclusion.
Investigating learning and teaching is a complex activity as there are multiple fac-
tors that can influence student learning outcomes (Price, 2014). We draw on the 4P
model (Price, 2014) and specifically focus on teachers’ conceptions of teaching
(Kember & Kwan, 2000; Prosser et al., 1994; Samuelowicz & Bain, 1992, 2001)
and teachers’ approaches to teaching (Prosser et al., 1994; Trigwell & Prosser,
1996) as indicative of interpretations about student learning (Trigwell & Prosser,
1996).
We also draw on Kirkpatrick’s (1994) model of evaluation, which proposes that
the effectiveness of education/training should be evaluated at four progressively
challenging levels – reaction, learning, behaviour and results. Sophisticated
evaluations need to attend to multiple levels, i.e. all four levels, while more naïve
evaluations focus on elementary levels such as reaction and learning. While we
acknowledge that Kirkpatrick’s model might not be appropriate for all forms of edu-
cational evaluation, it does offer one useful means of uncovering implicit
assumptions about learning as evidenced through evaluation strategies.
A further foundation of our approach is the pedagogical goal of an ET interven-
tion. Kirkwood and Price (2014) argued for a more holistic approach to examining
the impact of learning and teaching interventions that allows for scrutiny across a
range of factors within the parameters of their own design. This identifies any inter-
vention as relating to one of three goals: replicating, supplementing and transform-
ing. This has the advantage of allowing consideration of contextual factors,
important in research with human participants (Clegg, 2005; Hammersley, 2007;
Kirkwood & Price, 2013a). In our synthesis we use a multifaceted and cumulative
approach to model how theory (implicit or explicit) and practice inform ET designs
and the interpretation of their success.
Sources of data
The starting point was a review of literature undertaken for the UK Higher Educa-
tion Academy to investigate how practitioners used ET to support student learning.
That review considered the kinds of evidence that were produced to substantiate
claims of improvements achieved (Price & Kirkwood, 2011). It revealed that practi-
tioners were not making good use of current research to inform their designs. A fur-
ther analysis of those articles is presented here, explicitly examining the theoretical
models/evidence and assumptions (whether explicit or implicit) that underpin inter-
ventions and their relationship with the specific design, the evaluation/research
undertaken and the interpretation of findings. This examination of real cases offers
the opportunity to model ‘informed pedagogical designs that use technology’ with a
view to facilitating more robust approaches to improving student learning.
The data sources were acquired by searching for articles published during the
period 2005 to 2010 using the ‘Web of Science’ and ‘Academic Search Complete’
online databases. The keywords applied were ‘technology’ and ‘university or higher
education’ and ‘teaching or learning’ and ‘evidence or empirical’. Several hundred
abstracts were scrutinised, and a shortlist of articles was read in full. A total of
46 articles were included. The remainder were excluded because they were not con-
cerned with evidence generated from actual interventions in the higher education
context.
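To make the screening procedure concrete, the sketch below expresses the search window, the keyword groups and the inclusion test as a simple filter. It is a minimal sketch for exposition only: the record fields and function names are our own labels, not part of either database’s interface or the original study’s tooling.

```python
from dataclasses import dataclass

@dataclass
class Abstract:
    """Minimal record for a retrieved abstract (fields are illustrative)."""
    title: str
    year: int
    text: str
    reports_he_intervention: bool  # evidence from an actual HE intervention?

# Keyword groups were combined with AND; alternatives within a group with OR.
KEYWORD_GROUPS = [
    {"technology"},
    {"university", "higher education"},
    {"teaching", "learning"},
    {"evidence", "empirical"},
]

def matches_search(record: Abstract) -> bool:
    """Mirrors the database query: published 2005-2010, with every
    keyword group matched somewhere in the title or abstract text."""
    body = f"{record.title} {record.text}".lower()
    return 2005 <= record.year <= 2010 and all(
        any(term in body for term in group) for group in KEYWORD_GROUPS
    )

def include(record: Abstract) -> bool:
    """Full-text screening: keep only articles reporting evidence from
    actual interventions in the higher education context."""
    return matches_search(record) and record.reports_he_intervention
```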
Data analysis
The fundamental unit of analysis was each individual research paper. Using content
analysis (similar to Hew & Cheung, 2013), each article was characterised according
to the following five parameters (see Table 1). Each parameter is associated with a
data category and, where appropriate, is related to its theoretical underpinning.
The criteria in Table 1 provide an indication of how each intervention was
designed in terms of:
 whether theoretical underpinnings of teaching and learning have explicitly informed the design and evaluation;
 conceptions of learning in terms of what ‘measures’ were used to evaluate the intervention;
 how the complexity of the evaluation approach reflects the sophistication of the design and its impact on learning.

Table 1. Characteristics of the five parameters used for content analysis.

Parameter | Data category | Theoretical underpinning
Pedagogical basis of ET intervention (explicit or implicit) | (1) Replicating existing teaching practices; (2) Supplementing existing teaching; or (3) Transforming the learning experience | Kirkwood and Price’s (2014) model of enhancements
Assumptions about learning and teaching | Explicit or implicit | –
Assumptions about learning and conceptions of learning | Learning viewed as quantitative or qualitative | Säljö’s (1979) hierarchical scheme
Number of data collection sources involved | Count of methods used | –
Focus of evaluation – what aspects were addressed | Highest level of evaluation identified: 1 = Reaction (participants’ satisfaction with an intervention); 2 = Learning (what knowledge participants gain); 3 = Behaviour (what participants can do differently); 4 = Results (how participants apply the knowledge and skills gained) | Kirkpatrick’s (1994) four-level model of evaluation
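As a concrete rendering of this coding scheme, here is a minimal sketch (ours, not code from the original study) of how an article could be characterised against the five parameters. The class and field names are hypothetical; the enumerations simply restate Kirkwood and Price’s (2014) pedagogical goals, the quantitative/qualitative view of learning (Säljö, 1979) and Kirkpatrick’s (1994) levels, and the example instance is the first row of Table 2.

```python
from dataclasses import dataclass
from enum import Enum

class PedagogicalBasis(Enum):
    """Kirkwood and Price's (2014) three pedagogical goals."""
    REPLICATING = 1      # replicating existing teaching practices
    SUPPLEMENTING = 2    # supplementing existing teaching
    TRANSFORMING = 3     # transforming the learning experience

class LearningView(Enum):
    """Säljö's (1979) broad distinction in conceptions of learning."""
    QUANTITATIVE = "quantitative"
    QUALITATIVE = "qualitative"

class KirkpatrickLevel(Enum):
    """Kirkpatrick's (1994) four progressively challenging levels."""
    REACTION = 1    # participants' satisfaction with an intervention
    LEARNING = 2    # what knowledge participants gain
    BEHAVIOUR = 3   # what participants can do differently
    RESULTS = 4     # how participants apply the knowledge and skills gained

@dataclass
class CodedArticle:
    """One reviewed article characterised against the five parameters."""
    citation: str
    pedagogical_basis: PedagogicalBasis
    assumptions_explicit: bool          # explicit vs implicit assumptions
    learning_view: LearningView
    data_collection_types: int          # count of data collection methods
    highest_kirkpatrick: KirkpatrickLevel

# Example: the first row of Table 2.
connolly_2007 = CodedArticle(
    citation="Connolly et al. (2007)",
    pedagogical_basis=PedagogicalBasis.REPLICATING,
    assumptions_explicit=False,
    learning_view=LearningView.QUANTITATIVE,
    data_collection_types=4,
    highest_kirkpatrick=KirkpatrickLevel.LEARNING,  # Level 2
)
```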
Results and discussion
Table 2 shows an analysis of the articles using the criteria in Table 1. We used the
first criterion (Pedagogical basis of intervention or study) as an overarching lens
through which to construe what informs design. In the first category, ‘Replicating
existing teaching practices’, the articles contained implicit assumptions about teach-
ing and learning, a quantitative expression of student learning, and the evidence col-
lected was concerned with the lower levels of Kirkpatrick’s evaluation model.

Table 2. A categorisation of the reviewed interventions involving technology for teaching and learning.

Article | Assumptions about teaching and learning | Theoretical assumptions about improvements in learning (what they are considered to be) | No. of data collection types | Learning viewed as quantitative or qualitative | Highest level in Kirkpatrick’s model

Category 1 – Replicating existing teaching practices: characterised by replicating an element of conventional teaching for delivery to students using some form of technology (8 studies)
Connolly et al. (2007) | Implicit | An improvement in grades | 4 | Quantitative | 2
de Grez, Valcke, and Roozen (2009) | Explicit | Social cognitive theoretical perspective towards self-regulated learning | 1 | Qualitative | 2
Delialioglu and Yildirim (2008) | Implicit | An improvement in achievement, retention, attitude and satisfaction | 3 | Quantitative | 2
Hui, Hu, and Clark (2007) | Implicit | More efficient knowledge acquisition | 2 | Quantitative | 2
Lorimer and Hilliard (2008) | Implicit | An improvement in grades | 2 | Quantitative | 2
Neumann and Hood (2009) | Implicit | Increase in knowledge and an improvement in grades (constructivist model) | 3 | Quantitative | 2
Stephenson, Brown, and Griffin (2008) | Implicit | Increase in knowledge and an improvement in grades – but did distinguish between different types of questions around Bloom’s taxonomy | 2 | Quantitative | 2
Woo et al. (2008) | Implicit | Lectures can be replaced by web-based technologies | 4 | Quantitative | 1

Category 1 (continued) – Replicating existing teaching practices: characterised by comparing differing technologies for delivering the same material or resources to campus-based or distance learners (1 study)
Griffin, Mitchell, and Thompson (2009) | Explicit | Bloom’s taxonomy: improvement in different types of skills shown through multiple choice questions (but implicit assumption about information transfer) | 2 | Quantitative | 2

Category 2 – Supplementing existing teaching practices: characterised by making available versions of existing course materials/resources/tools that students can access and use whenever they want (9 studies)
Copley (2007) | Implicit | An increase in knowledge | 2 | Quantitative | 1
Cramer et al. (2007) | Implicit | An increase in performance | 3 | Quantitative | 2
Dalgarno et al. (2009) | Implicit | An increase in knowledge measured by test performance | 3 | Quantitative | 2
Evans (2008) | Implicit | An increase in knowledge measured by test performance | 1 | Quantitative | 1
Fernandez, Simo, and Sallan (2009) | Implicit | Information transmission – student perceptions of its value | 5 | Quantitative | 1
Lonn and Teasley (2009) | Implicit | An increase in knowledge and performance | 3 | Quantitative | 1
Swan and O’Donnell (2009) | Implicit | An increase in knowledge and performance | 5 | Quantitative | 2
Taylor and Clark (2010) | Implicit | Better information transmission to student | 3 | Quantitative | 1
Tynan and Colbran (2006) | Implicit | Better experiences through using podcasts | 1 | Quantitative | 1

Category 2 (continued) – Supplementing existing teaching practices: characterised by adopting or developing additional learning resources or tools for students to use (14 studies)
Cubric (2007) | Explicit | Constructivist theory – developing learning through the support of group work | 3 | Qualitative | 2
de Leng et al. (2009) | Explicit | Developing critical thinking | 4 | Qualitative | 3
Demetriadis et al. (2008) | Explicit | Better-quality problem solving | 3 | Qualitative | 2
Elgort, Smith, and Toland (2008) | Explicit | Constructivist approach to learning – assessed through better group work and high-quality development on wiki | 4 | Qualitative | 3
Hramiak, Boulton, and Irwin (2009) | Explicit | Qualitative changes in learning for reflection and professional development | 2 | Qualitative | 3
Kerawalla et al. (2009) | Explicit | Student centred – developments in reflection upon development | 2 | Qualitative | 3
McLoughlin and Mynard (2009) | Explicit | Developing higher order thinking | 1 | Qualitative | 3
Murphy and Ciszewska-Carr (2007) | Implicit | Good communication experiences – information transmission | 1 | Quantitative | 1
Ng’ambi and Brown (2009) | Explicit | Development of student engagement | 2 | Qualitative? | 2
Sorensen et al. (2007) | Implicit | Better Internet use implies better learning and practice | 5 | Quantitative | 2
Wheeler and Wheeler (2009) | Explicit | Constructivist approach – better-quality writing | 2 | Qualitative | 1
Wyatt et al. (2010) | Implicit | Information transmission | 2 | Quantitative | 2
Xie, Ke, and Sharma (2008) | Explicit | Qualitative improvements in reflective thinking skills | 3 | Qualitative | 3
Zorko (2009) | Explicit | Better collaborative learning | 4 | Qualitative | 3

Category 3 – Transforming the learning experience: characterised by redesigning learning activities or substantial parts of modules to provide active learning opportunities for students (8 studies)
Coller and Scott (2009) | Explicit | Better engagement leads to better-quality learning | 3 | Qualitative | 3
Cooner (2010) | Explicit | Qualitative changes in learning acknowledging learning complexity | 2 | Qualitative | 3
Dalsgaard and Godsk (2007) | Explicit | Social constructivist model – learning as qualitative improvements | 3 | Qualitative | 2
Hakkarainen, Saarelainen, and Ruokamo (2007) | Explicit | Constructivist approach to improving meaningful learning | 3 | Qualitative | 1
Hemmi, Bayne, and Land (2009) | Explicit | Developing reflection and identity – considers what is happening as opposed to specific focus on learning | 6 | Qualitative | 3
Herman and Kirkup (2008) | Explicit | Developing learners as reflective professionals | 5 | Qualitative | 2
Lee, McLoughlin, and Chan (2008) | Explicit | Constructivist learning with students as producers and active participants in the process | 1 | Qualitative | 2
Tormey and Henchy (2008) | Explicit | Increased student engagement | 3 | Qualitative | 1

Category 3 (continued) – Transforming the learning experience: characterised by exploring which ET designs were more effective in promoting qualitatively richer student learning (6 studies)
Chen, Chen, and Tsai (2009) | Explicit | Improving various aspects of the learning process | 4 | Qualitative | 2
Downing et al. (2007) | Explicit | Collaborative learning to improve the quality of learning | 3 | Qualitative | 1
Kanuka, Rourke, and Laflamme (2007) | Explicit | Collaborative learning to improve the quality of learning | 2 | Qualitative | 3
Kirkwood (2006) | Explicit | Role of assessment in improving student access to and use of online resources | 1 | Qualitative | 1
Melrose and Bergeron (2007) | Explicit | Using affective factors to support good-quality student interactions | 2 | Qualitative | 1
Thorpe (2008) | Explicit | Better engagement and interaction leads to better learning | 5 | Qualitative | 3
In Category 1, all but one of the articles made implicit assumptions about learn-
ing. In other words, there was little or no discussion in the articles about what
‘learning’ involved. This category of design tended to reflect an assumption that
learning improvement was about increasing knowledge acquisition, which, in turn,
was evidenced by students achieving higher grades. This relates to a transmissive
model of teaching (Trigwell, Prosser, & Taylor, 1994), where learning improvement
is viewed as a quantitative gain (Trigwell & Prosser, 1996). The last article in Cate-
gory 1 also focuses on replication, but in this instance it compared campus-based
and distance learners. While an explicit theoretical model was evident, the actual
design suggests that underlying assumptions about teaching are again transmissive,
with learning being seen as an increase in knowledge.
The first nine articles in Category 2, ‘Supplementing existing teaching practices’,
focus on designs that make current course materials or resources available more flexi-
bly. All of the articles in this category make implicit assumptions about learning. The
evaluation strategy is also quantitative, indicating that teaching and learning are con-
ceived as quantitative activities. The remaining 14 articles in this category are charac-
terised as adopting or developing additional learning resources or tools for students to
use. Most of these expressed explicit theoretically grounded assumptions about learn-
ing. Eleven articles considered learning gains to be about qualitative improvements in
understanding, which they sought to evaluate through predominantly qualitative
methods. This illustrates a more student-centred approach to teaching. In many cases
technology was used to promote developments in learning that were not being achieved
by other means. The remaining three articles in this category had implicit assumptions
about learning. In these articles the intervention studies were not concerned with qual-
itative improvements in learning, but focused more on students’ reactions to and/or
itative improvements in learning, but focused more on students’reactions to and/or
perceptions of the technology-based interventions. Overall, articles in this category
did not focus on the higher levels of Kirkpatrick’s model.
Interventions included in Category 3, ‘Transforming the learning experience’,
sought improvements through making structural changes and pedagogical designs
that exploited the use of technology. They tended to make their underpinning theo-
ries of learning explicit, and qualitative approaches were used to evaluate student learning
outcomes. The articles in this category were also more concerned with the higher
levels in Kirkpatrick’s model. The first eight papers in this category focused on rede-
signing parts of modules to provide active learning experiences for students. All of
the designs were informed by explicit theoretical assumptions about learning. Again,
all of these focused on achieving qualitative differences in learning. The last six arti-
cles in this category explored which ET designs were more effective in promoting
qualitatively richer student learning. In this category the theoretical assumptions
were more explicit where the pedagogical designs that used ET were more complex.
Collectively, this analysis illustrates that the designs of ET interventions are
based upon assumptions about student learning. These assumptions also reflect
teachers’ conceptions of teaching and their approaches to teaching, i.e. their educa-
tional practice. For example, presentational forms of teaching with technology (e.g.
PowerPoint shows, podcasts of lectures and webcasts) tend to buttress the practices
of teaching-centred teachers, through replicating or supplementing their existing
ways of teaching. In contrast, teachers with learning-focused conceptions of teaching
are more likely to exploit technologies that expedite and support the learning and
development of students. Examples include designs where students are required to
interrogate sources of information or data, to undertake group tasks or to reflect
upon and demonstrate developments in their understanding and practices (using
wikis, blogs, discussion forums, portfolios etc.). Often these are associated with
endeavours to transform the learning experience through active engagement in
knowledge building and sharing, and reflection upon learning and development epi-
sodes and processes. These variations in teachers’ conceptions and approaches to
teaching help us to understand the different ways in which technology is used for
university teaching (Kirkwood & Price, 2012).
Evaluation approaches and how they reflect theoretical assumptions
Evaluation methods too are underpinned by theoretical assumptions. The replication
designs prevalent in Category 1 in Table 2 are largely related to implicit assumptions
about learning and many adopted comparative methods to examine student perfor-
mance. These compare and contrast the performance of ‘with-technology’ and ‘non-
technology’ groups of students. Comparative methods remain a common approach
in ET research (Means et al., 2010; Tamim et al., 2011). However, they conceal
assumptions about learning as being a quantitative accumulation of knowledge, with
the expectation of demonstrating that students have ‘learned more’ following the
introduction of technology. The evaluation methods are predicated upon a
technologically deterministic perspective, where the technology in and of itself is
considered to be the agent of change. It also assumes a transmissive approach to
teaching. The evaluations in Category 1 studies tended to use fewer evidence collec-
tion methods. Further, the nature of the evaluation focuses on the lower levels of
Kirkpatrick’s four-level model.
Where studies have relied primarily upon self-report surveys to gauge students’
and teachers’ reactions to and satisfaction with technology-based interventions
(Kirkpatrick Level 1), the findings reveal nothing about any learning improvements
achieved. Studies that focus mainly upon test scores or assignment grades achieved
(Kirkpatrick Level 2) indicate that learners have been able to acquire knowledge,
but may not have developed greater understanding.
Interventions in Category 2 are split between those that aimed to make existing
teaching resources available in a supplementary form, and those that added addi-
tional resources with the intention of improving learning. Interventions in the first
group were mostly related to implicit assumptions about learning predicated on the
accumulation of information (quantitative) and, again, reflect a transmissive
approach to teaching. This group also tended to use relatively few evaluation meth-
ods, while the nature of the evaluation concentrated at the lower levels of Kirkpa-
trick’s model.
In comparison, articles in the second group within Category 2 tended to focus on
how to enhance some aspect of the educational process. Here more explicit and the-
oretically underpinned discussions about learning are evident. The nature of the
evaluations also changed in terms of their increased complexity and more sources of
evidence were sought. This category included constructivist approaches to learning
in which the evidence gathered focuses upon the qualitative developments in student
learning.
In Category 3, explicit consideration of theories of learning becomes prevalent.
Designs in the first eight articles focus on introducing a different pedagogical
approach or way of working, for example promoting students’ explicit reflection
upon the development of professional practices or the completion of group tasks
with shared outputs. Evidence of the effect of these designs sought to establish that
the new or re-conceptualised design enabled better-quality student learning.
Increased complexity in the evaluation methods is particularly evident in the second
group of Category 3 studies. More sources of evidence are sought to demonstrate
the impact of the intervention. In addition, the nature of the evaluation examined
more complex levels of Kirkpatrick’s model.
This review has illustrated considerable under-utilisation of theoretical models of
learning to drive pedagogical designs of ET interventions (Kirkwood & Price,
2013a). Implicit models and assumptions were, nevertheless, informing the design
of interventions. We argue that a scholarly approach to designing teaching and learn-
ing with technology was often missing (Kirkwood & Price, 2013b). However, we
found that explicit and more sophisticated theories of learning tended to underpin
better-informed designs.
Towards an evidence-informed model
Our analysis has shown that practitioners’ use of ET was driven by a variety of
assumptions about the nature of learning. These influenced how teachers approached
their teaching and how they used ET in their designs. What was particularly striking
was the link between implicit assumptions, the design of the ET intervention and the
methods employed to evaluate the effectiveness of their intervention. Practitioners
do not appear to be capitalising on existing evidence and theories about learning and
teaching, particularly with technology (Kirkwood & Price, 2013a,b,2014; Price &
Kirkwood, 2011). More emphasis appears to be placed on experimentation with
technology or opinion-based practice rather than drawing upon existing theories and
evidence to inform new designs and to interpret the findings.
A practice-based model
Figure 1 illustrates a practice-based model and the relationships we have found in
our analysis between:
 theoretical models and assumptions about teaching and learning;
 the pedagogical design of resources and activities for learning using technology;
 the approach adopted to evaluate that design and the interpretation of evaluative findings.
It also shows how these findings feed back to either reinforce or modify the the-
oretical models and assumptions.
For example, if the teacher’s model assumes that learning is about accumulating
more information, then the design will focus on creating situations from which
students acquire more knowledge. The subsequent evaluation will seek to establish,
through a test, how much ‘extra’ information students have acquired when compared
with a similar non-intervention group. If the test scores indicate an improvement,
this will act to ‘demonstrate’ the impact of technology and to reinforce the assump-
tions made about learning and teaching. In contrast, teachers with a more complex
model of learning will adopt a multifaceted approach to ET design and to the
evaluation of outcomes.

Figure 1. A practice-based model of ET design. (Diagram: theoretical models of teaching and learning and underlying assumptions → pedagogical design for use of technology → evaluation of pedagogical design → interpretation of findings, with findings feeding back to the models and assumptions.)
A partially informed model
From our analysis we conclude that many designs for using ET have not been
informed by explicit theoretical understandings and appropriate evidence. However,
some had been (see Table 2). We illustrate a model of ET design that is partially
informed by research evidence and theory (see Figure 2). In this model, explicit
research evidence and theory underpin models of and assumptions about student
learning and the pedagogical design.
Evidence and theory not only influence the pedagogical design, but also (indi-
rectly) the strategy for evaluating the effects of an intervention and the interpretation
of the results. So, by adopting a more evidence-informed approach to ET interven-
tions, designs that are more likely to lead to qualitative improvements in learning
can be developed. However, we refer to this as ‘partially informed’ because integra-
tion of these conceptions with the educational context is lacking.

Figure 2. A partially informed model of ET design. (Diagram: research evidence and theory underpin the theoretical models and assumptions and the pedagogical design for use of technology; the design is evaluated and the findings interpreted, feeding back to the models and assumptions.)
An evidence-informed model – including the influence of context
What has not been discussed explicitly in this article is the role of contextual factors.
Nonetheless, they can shape designs significantly. The teaching context can influ-
ence what a teacher is able to do in terms of pedagogic possibilities and it may influ-
ence what students can do in terms of access to technologies. For example, if a
teacher is presented with a large cohort of students and assigned a large lecture hall
for their teaching activities, then these circumstances reinforce a lecture-model
approach to teaching as well as an information-accumulation model for learning.
Context also influences the methods adopted for evaluating the effectiveness of
the intervention. The comparative study method (in which ET use is compared with
teaching without the use of technology) may be influenced by pragmatic matters.
Given a particular context, convenience and expediency often determine how partici-
pants are selected for an intervention and the evaluation conducted. For example,
participants might be concurrent groups of students within the same cohort, or con-
secutive cohorts of students taking seemingly the same module.
Further, the departmental and institutional contexts within which university
teachers operate and their disciplinary affiliation exert considerable influence upon
teachers’ beliefs and practices about teaching and learning (Lindblom-Ylänne,
Trigwell, Nevgi, & Ashwin, 2006). In the articles we reviewed, the contextual
particularities of any ET intervention were often found to be under-specified or paid
insufficient attention. This made it difficult for us to draw particular conclusions
about the role of context in many cases. The deficiency also limited the potential to
generalise from the findings.
Other influences can also be important. For example, the nature of the academic
development and support that practitioners receive can influence academics’ confi-
dence and skill in using particular technologies. Institutional policy-makers play key
roles in determining the integration of technology, as they influence the culture
within which practitioners operate and hence their actions (Price & Kirkwood,
2008). We argue that informed designs need to take account of various contexts: the
teaching context, the institutional context and the student context. To counter the
deficiencies revealed in the literature review, we add contextual constraints and
influences to the model in Figure 3.

Figure 3. An evidence-informed model of learning design with technology constrained by contextual factors. (Diagram: the model of Figure 2 with context enclosing and constraining the theoretical models and assumptions, the pedagogical design, the evaluation and the interpretation of findings.)
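Read as data structures, the three models differ mainly in their edges. The sketch below is our shorthand rendering of Figures 1–3, not code from the study; the node names abbreviate the boxes in the diagrams.

```python
# Directed edges: each key informs or feeds the components it maps to.
# Node names are our shorthand for the boxes in Figures 1-3.
PRACTICE_BASED = {  # Figure 1: a closed loop of practice
    "models & assumptions": ["pedagogical design"],
    "pedagogical design": ["evaluation"],
    "evaluation": ["interpretation of findings"],
    "interpretation of findings": ["models & assumptions"],  # feedback
}

PARTIALLY_INFORMED = dict(PRACTICE_BASED)  # Figure 2 keeps the loop and
PARTIALLY_INFORMED["research evidence & theory"] = [  # adds an evidence base
    "models & assumptions",
    "pedagogical design",
]

EVIDENCE_INFORMED = dict(PARTIALLY_INFORMED)         # Figure 3: same graph,
EVIDENCE_INFORMED["context"] = list(PRACTICE_BASED)  # with context constraining
# every practice component (teaching, institutional and student contexts).
```

On this reading, moving from Figure 1 to Figure 3 removes nothing from the practitioner’s loop; it adds a source (research evidence and theory) and a constraint (context) that act upon it.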
How can we influence informed design in educational technologies that enhance
learning?
We argue that informed design of ET interventions needs to be grounded in a better
conceptualisation of what constitutes and shapes learning rather than a focus on
technology as the primary agent of change. Further, contextual factors relating to the
environment within which teaching and learning takes place often influence uses of
technology to a greater extent than is usually acknowledged.
Havnes (2004) argued that a social approach to learning is necessary in order to
broaden attention from the actions of individuals (students and/or teachers) to the
social system and the surrounding institutional practices. Changing just one constitu-
ent part of any educational programme is unlikely to bring about a substantial alter-
ation to the whole. If the transformation that many teachers seek is to be achieved,
consideration must be given to the interaction of each part with the others. A holistic
view that draws on good evidence to inform designs is required to make good
choices, whether within a course or programme, or across a whole institution.
At the institutional level, clarity is necessary in terms of the goals and aims to
be served by the ever-greater adoption of technology. Academic policies and strate-
gies need to be coordinated across all relevant parts of the faculty and institution.
Implementation of ET is not just confined to interventions aimed at enhancing stu-
dent learning. It necessitates reviewing the underpinning infrastructure that will be
required to support such interventions, establishing what changes in processes are
required and what changes in the skill sets of staff will be necessary.
Concluding comments
This synthesis of the research literature has revealed an under-specification of both
theoretical models and evidence from relevant research and evaluation studies to
inform the design of ET interventions in higher education and to interpret their
effectiveness. It was found that practitioners’ use of technology was driven by a
varying set of (often implicit) beliefs about the nature of teaching, learning and tech-
nology and a range of assumptions that underpin those beliefs. These implicit or
explicit assumptions influenced the design of ET interventions, the approach to eval-
uating their effectiveness and the interpretation of the findings (Kirkwood & Price,
2013a). Context exerts considerable influence upon academics’ beliefs and practices
concerning teaching and learning. Nonetheless, many interventions provided insuffi-
cient explicit recognition of the constraints imposed by departmental, institutional
and disciplinary contexts within which interventions took place.
Based upon our critical analysis and the gaps identified, we argue that evidence-
informed design of ET interventions needs to be grounded in a better conceptualisa-
tion of what constitutes and shapes learning and how interventions can be integrated
within the context of their implementation, rather than a focus on technology as the
agent of change.
Notes on contributors
Linda Price is a Senior Lecturer at the UK Open University and Professor of Engineering
Education at Lund University, Sweden. Since 1995 she has promoted pedagogically driven
uses of new technology in higher education. She coordinates staff development activities
across the university and has developed evidence-informed professional development pro-
grammes that promote student-centred learning for Open University faculty and external
bodies, such as the UK Higher Education Academy. Her professional development activities
draw upon institutional, national and international research on the student experience to pro-
mote the synergy between research and practice.
Adrian Kirkwood is a Senior Lecturer at the UK Open University. He has been monitoring
and evaluating developments in media-based teaching and learning for more than 25 years,
both within the UK Open University and in other education and training organisations. His
primary interest is in student learning with media technologies – with the emphasis on learn-
ing to a greater extent than on media technologies per se. Adrian has a long record of sup-
porting professional development for staff at the UK Open University. Specifically, this has
taken an evidence-informed approach to making the most effective use of media technologies
for learning and teaching in courses developed for independent adult students.
References
Anderson, D. S., & Biddle, B. J. (1991). Knowledge for policy: Improving education through
research. London: Falmer.
Baxter Magolda, M. B. (1992). Knowing and reasoning in college: Gender-related patterns
in students’ intellectual development. San Francisco, CA: Jossey-Bass.
Belenky, M. F., Clinchy, B. M., Goldberger, N. R., & Tarule, J. M. (1986). Women’s ways of
knowing: The development of self, voice and mind. New York, NY: Basic Books.
Boyle, T. (2004). Technology and the reflective practitioner. In A. Irons & S. Alexander
(Eds.), Effective learning and teaching in computing (pp. 182–188). London: Routledge-
Falmer.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain,
mind, experience, and school. Washington, DC: National Academy Press.
Clegg, S. (2005). Evidence-based practice in educational research: A critical realist critique
of systematic review. British Journal of Sociology of Education, 26, 415–428.
doi:10.1080/01425690500128932
Cohen, L., Manion, L., & Morrison, K. (2011). Research methods in education (7th ed.).
Abingdon: Routledge.
De Laat, M., Lally, V., Simons, R.-J., & Wenger, E. (2006). A selective analysis of empirical
findings in networked learning research in higher education: Questing for coherence.
Educational Research Review, 1, 99–111. doi:10.1016/j.edurev.2006.08.004
Fitz-Gibbon, C. T. (1997). The Value Added National Project: Feasibility studies for a
national system of value-added indicators (Final Report). Durham: Curriculum, Evaluation
and Management Centre, Durham University. Retrieved from http://www.cem.org/attachments/publications/CEMWeb009%20Feasibility%20Study%20Nat%20System%20VA%20Indicators.pdf
Fitz-Gibbon, C. T. (1999). Education: High potential not yet realized. Public Money and
Management, 19, 33–39.
Hammersley, M. (2007). Educational research and evidence-based practice. London: Sage.
Havnes, A. (2004). Examination and learning: An activity-theoretical analysis of the relation-
ship between assessment and educational practice. Assessment & Evaluation in Higher
Education, 29, 159–176.
Hew, K. F., & Cheung, W. S. (2013). Use of Web 2.0 technologies in K–12 and higher edu-
cation: The search for evidence-based practice. Educational Research Review, 9, 47–64.
doi:10.1016/j.edurev.2012.08.001
Hounsell, D. (1987). Essay writing and the quality of feedback. In J. T. E. Richardson, M.
W. Eysenck, & D. Warren Piper (Eds.), Student learning: Research in education and
cognitive psychology (pp. 109–119). Milton Keynes: Society for Research into Higher
Education & Open University Press.
Kember, D., & Kwan, K. P. (2000). Lecturers’ approaches to teaching and their relationship
to conceptions of good teaching. Instructional Science, 28, 469–490.
Kirkpatrick, D. L. (1994). Evaluating training programs. San Francisco, CA: Berrett-Koehler
Publishers.
Kirkwood, A. T., & Price, L. (2012). The influence upon design of differing conceptions of
teaching and learning with technology. In A. D. Olofsson & O. Lindberg (Eds.), Informed
design of educational technologies in higher education: Enhanced learning and teaching
(pp. 1–20). Hershey, PA: IGI Global.
Kirkwood, A. T., & Price, L. (2013a). Examining some assumptions and limitations of
research on the effects of emerging technologies for teaching and learning in higher edu-
cation. British Journal of Educational Technology, 44, 536–543. doi:10.1111/bjet.12049
Kirkwood, A. T., & Price, L. (2013b). Missing: Evidence of a scholarly approach to teaching
and learning with technology in higher education. Teaching in Higher Education, 18,
327–337. doi:10.1080/13562517.2013.773419
Kirkwood, A. T., & Price, L. (2014). Technology-enhanced learning and teaching in higher
education: What is ‘enhanced’and how do we know? A critical literature review. Learn-
ing, Media and Technology, 39, 6–36. doi:10.1080/17439884.2013.770404
Laurillard, D. (2006). Modelling benefits-oriented costs for technology enhanced learning.
Higher Education, 54, 21–39. doi:10.1007/s10734-006-9044-2
Lindblom-Ylänne, S., Trigwell, K., Nevgi, A., & Ashwin, P. (2006). How approaches to
teaching are affected by discipline and teaching context. Studies in Higher Education, 31,
285–298. doi:10.1080/03075070600680539
Martin, E., & Ramsden, P. (1987). Learning skills, or skill in learning? In Student learning:
Research in education and cognitive psychology (pp. 155–167). Milton Keynes: Society
for Research into Higher Education & Open University Press.
Marton, F., Dall’Alba, G., & Beaty, E. (1993). Conceptions of learning. International Journal
of Educational Research, 19, 277–300.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-
based practices in online learning: A meta-analysis and review of online learning studies.
Washington, DC: US Department of Education Office of Planning, Evaluation, and Policy
Development. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
Mishra, P., & Koehler, M. J. (2008, March). Introducing technological pedagogical content
knowledge. Paper presented at the annual meeting of the American Educational Research
Association, New York, NY.
Perry, W. G. (1970). Forms of intellectual and ethical development in the college years: A
scheme. New York, NY: Holt, Rinehart and Winston.
Price, L. (2014). Modelling factors for predicting student learning outcomes in higher
education. In D. Gijbels, V. Donche, J. T. E. Richardson, & J. D. Vermunt (Eds.), Learn-
ing patterns in higher education: Dimensions and research perspectives (pp. 56–77).
London: Routledge.
Price, L., & Kirkwood, A. T. (2008). Technology in the United Kingdom’s higher education
context. In S. Scott & K. Dixon (Eds.), The 21st century, globalised university: Trends
and development in teaching and learning (pp. 83–113). Perth: Black Swan.
Price, L., & Kirkwood, A. T. (2011). Enhancing professional learning and teaching through
technology: A synthesis of evidence-based practice among teachers in higher education.
Open University. Retrieved from http://oro.open.ac.uk/30686/
Price, L., & Kirkwood, A. (2014). Using technology for teaching and learning in higher edu-
cation: A critical review of the role of evidence in informing practice. Higher Education
Research & Development, 33, 549–564. doi:10.1080/07294360.2013.841643
Prosser, M., Trigwell, K., & Taylor, P. (1994). A phenomenographic study of academics’
conceptions of science learning and teaching. Learning and Instruction, 4, 217–232.
Richardson, J. T. E. (2000). Researching student learning: Approaches to studying in
campus-based and distance education. Buckingham: Society for Research into Higher
Education & Open University Press.
Säljö, R. (1979). Learning in the learner’s perspective: I. Some common-sense assumptions
(Report No. 76). Gothenburg: University of Gothenburg, Institute of Education.
Samuelowicz, K., & Bain, J. D. (1992). Conceptions of teaching held by academic teachers.
Higher Education, 24, 93–111. doi:10.1007/BF00138620
Samuelowicz, K., & Bain, J. D. (2001). Revisiting academics’ beliefs about teaching and
learning. Higher Education, 41, 299–325.
Slavin, R. E. (2003). A reader’s guide to scientifically based research. Educational Leader-
ship, 60(5), 12–17.
Slavin, R. E. (2008). Perspectives on evidence-based research in education – What works?
Issues in synthesizing educational program evaluations. Educational Researcher, 37,
5–14. doi:10.3102/0013189X08314117
Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011).
What forty years of research says about the impact of technology on learning: A second-
order meta-analysis and validation study. Review of Educational Research, 81, 4–28.
doi:10.3102/0034654310393361
Thorpe, M. (2008). Effective online interaction: Mapping course design to bridge from
research to practice. Australasian Journal of Educational Technology, 24, 57–72.
Trigwell, K., & Prosser, M. (1996). Changing approaches to teaching: A relational perspec-
tive. Studies in Higher Education, 21, 275–284. doi:10.1080/03075079612331381211
Trigwell, K., Prosser, M., & Taylor, P. (1994). Qualitative differences in approaches to teach-
ing in first year university science. Higher Education, 27, 75–84.
Van Rossum, E. J., & Schenk, S. M. (1984). The relationship between learning conception,
study strategy and learning outcome. British Journal of Educational Psychology, 54,
73–83.
Vermunt, J. D. (1996). Metacognitive, cognitive and affective aspects of learning styles and
strategies: A phenomenographic analysis. Higher Education, 31, 25–50.
Vermunt, J. D., & Rijswijk, F. A. W. M. (1988). Analysis and development of students’ skill
in self regulated learning. Higher Education, 17, 647–682.
Additional references for reviewed articles
Chen, Y., Chen, N., & Tsai, C. (2009). The use of online synchronous discussion for web-
based professional development for teachers. Computers and Education, 53, 1155–1166.
Coller, B., & Scott, M. (2009). Effectiveness of using a video game to teach a course in
mechanical engineering. Computers and Education, 53, 900–912.
Connolly, T. M., MacArthur, E., Stansfield, M., & McLellan, E. (2007). A quasi-experimental
study of three online learning courses in computing. Computers and Education, 49,
345–359.
Cooner, T. S. (2010). Creating opportunities for students in large cohorts to reflect in and on
practice: Lessons learnt from a formative evaluation of students’ experiences of a
technology-enhanced blended learning design. British Journal of Educational
Technology, 41, 271–286.
Copley, J. (2007). Audio and video podcasts of lectures for campus-based students: Produc-
tion and evaluation of student use. Innovations in Education and Teaching International,
44, 387–399.
Cramer, K. M., Collins, K. R., Snider, D., & Fawcett, G. (2007). The virtual lecture hall: Util-
isation, effectiveness and student perceptions. British Journal of Educational Technology,
38, 106–115.
Cubric, M. (2007). Wiki-based process framework for blended learning. In Proceedings of
the 2007 International Symposium on Wikis (pp. 11–22). Retrieved from http://citeseerx.
ist.psu.edu/viewdoc/download?doi=10.1.1.96.3492&rep=rep1&type=pdf
Dalgarno, B., Bishop, A. G., Adlong, W., & Bedgood Jr, D. R. (2009). Effectiveness of a
Virtual Laboratory as a preparatory resource for distance education chemistry students.
Computers and Education, 53, 853–865.
Dalsgaard, C., & Godsk, M. (2007). Transforming traditional lectures into problem-based
blended learning: Challenges and experiences. Open Learning, 22, 29–42.
de Grez, L., Valcke, M., & Roozen, I. (2009). The impact of an innovative instructional
intervention on the acquisition of oral presentation skills in higher education. Computers
and Education, 53, 112–120.
de Leng, B. A., Dolmans, D. H., Jöbsis, R., Muijtjens, A. M., & van der Vleuten, C. P.
(2009). Exploration of an e-learning model to foster critical thinking on basic science
concepts during work placements. Computers and Education, 53, 1–13.
Delialioglu, O., & Yildirim, Z. (2008). Design and development of a technology enhanced
hybrid instruction based on MOLTA model: Its effectiveness in comparison to traditional
instruction. Computers and Education, 51, 474–483.
Demetriadis, S. N., Papadopoulos, P. M., Stamelos, I. G., & Fischer, F. (2008). The effect of
scaffolding students’ context-generating cognitive activity in technology-enhanced case-
based learning. Computers and Education, 51, 939–954.
Downing, K., Lam, T.-F., Kwong, T., Downing, W.-K., & Chan, S.-W. (2007). Creating inter-
action in online learning: A case study. ALT-J, 15, 201–215.
Elgort, I., Smith, A. G., & Toland, J. (2008). Is wiki an effective platform for group course
work? Australasian Journal of Educational Technology, 24, 195–210. Retrieved from
http://www.ascilite.org.au/ajet/ajet24/elgort.html
Evans, C. (2008). The effectiveness of m-learning in the form of podcast revision lectures in
higher education. Computers and Education, 50, 491–498.
Fernandez, V., Simo, P., & Sallan, J. M. (2009). Podcasting: A new technological tool to
facilitate good practice in higher education. Computers and Education, 53, 385–392.
Griffin, D. K., Mitchell, D., & Thompson, S. J. (2009). Podcasting by synchronising Power-
Point and voice: What are the pedagogical benefits? Computers and Education, 53,
532–539.
Hakkarainen, P., Saarelainen, T., & Ruokamo, H. (2007). Towards meaningful learning
through digital video supported, case based teaching. Australasian Journal of
Educational Technology, 23, 87–109. Retrieved from http://www.ascilite.org.au/ajet/ajet23/
hakkarainen.html
Hemmi, A., Bayne, S., & Land, R. (2009). The appropriation and repurposing of social
technologies in higher education. Journal of Computer Assisted Learning, 25, 19–30.
Herman, C., & Kirkup, G. (2008). Learners in transition: The use of ePortfolios for women
returners to science, engineering and technology. Innovations in Education and Teaching
International, 45, 67–76.
Hramiak, A., Boulton, H., & Irwin, B. (2009). Trainee teachers’ use of blogs as private
reflections for professional development. Learning, Media and Technology, 34, 259–269.
Hui, W., Hu, P. J.-H., & Clark, K. Y. (2007). Technology-assisted learning: A longitudinal
field study of knowledge category, learning effectiveness and satisfaction in language
learning. Journal of Computer Assisted Learning, 24, 245–259.
Kanuka, H., Rourke, L., & Laflamme, E. (2007). The influence of instructional methods on
the quality of online discussion. British Journal of Educational Technology, 38, 260–271.
Kerawalla, L., Minocha, S., Kirkup, G., & Conole, G. (2009). An empirically grounded
framework to guide blogging in higher education. Journal of Computer Assisted
Learning, 25, 31–42.
Kirkwood, A. (2006). Going outside the box: Skills development, cultural change and the
use of on-line resources. Computers and Education, 47, 316–331.
Lee, M. J. W., McLoughlin, C., & Chan, A. (2008). Talk the talk: Learner-generated podcasts
as catalysts for knowledge creation. British Journal of Educational Technology, 39,
501–521.
Lonn, S., & Teasley, S. D. (2009). Podcasting in higher education: What are the implications
for teaching and learning? The Internet and Higher Education, 12, 88–92.
Lorimer, J., & Hilliard, A. (2008, November). What is your response? It’s time to get personal.
In Proceedings of the 7th European Conference on E-Learning (Vol. 2, pp. 128–134).
Retrieved from https://uhra.herts.ac.uk/dspace/bitstream/2299/6111/1/ECEL_2008_Paper_
Final_v1_B_W.pdf
McLoughlin, D., & Mynard, J. (2009). An analysis of higher order thinking in online discus-
sions. Innovations in Education and Teaching International, 46, 147–160.
Melrose, S., & Bergeron, K. (2007). Instructor immediacy strategies to facilitate group work
in online graduate study. Australasian Journal of Educational Technology, 23, 132–148.
Retrieved from http://www.ascilite.org.au/ajet/ajet23/melrose.html
Murphy, E., & Ciszewska-Carr, J. (2007). Instructors’ experiences of web based synchronous
communication using two way audio and direct messaging. Australasian Journal of
Educational Technology, 23, 68–86. Retrieved from http://www.ascilite.org.au/ajet/ajet23/mel
rose.html
Neumann, D. L., & Hood, M. (2009). The effects of using a wiki on student engagement and
learning of report writing skills in a university statistics course. Australasian Journal of
Educational Technology, 25, 382–398. Retrieved from http://www.ascilite.org.au/ajet/
ajet25/neumann.html
Ng’ambi, D., & Brown, I. (2009). Intended and unintended consequences of student use of
an online questioning environment. British Journal of Educational Technology, 40,
316–328.
Sorensen, P., Twidle, J., Childs, A., & Godwin, J. (2007). The use of the internet in science
teaching: A longitudinal study of developments in use by student-teachers in England.
International Journal of Science Education, 29, 1605–1627.
Stephenson, J. E., Brown, C., & Griffin, D. K. (2008). Electronic delivery of lectures in the
university environment: An empirical comparison of three delivery styles. Computers
and Education, 50, 640–651.
Swan, A., & O’Donnell, A. (2009). The contribution of a virtual biology laboratory to
college students’ learning. Innovations in Education and Teaching International, 46,
405–419.
Taylor, L., & Clark, S. (2010). Educational design of short, audio-only podcasts: The teacher
and student experience. Australasian Journal of Educational Technology, 26, 386–399.
Retrieved from http://www.ascilite.org.au/ajet/ajet26/taylor.html
Thorpe, M. (2008). Effective online interaction: Mapping course design to bridge from
research to practice. Australasian Journal of Educational Technology, 24, 57–72.
Retrieved from http://www.ascilite.org.au/ajet/ajet24/thorpe.html
Tormey, R., & Henchy, D. (2008). Re-imagining the traditional lecture: An action research
approach to teaching student teachers to ‘do’ philosophy. Teaching in Higher Education,
13, 303–314.
Tynan, B., & Colbran, S. (2006). Podcasting, student learning and expectations. In Proceed-
ings of the 23rd Annual ASCILITE Conference: Who’s learning? Whose technology?
(pp. 825–832). Retrieved from http://www.ascilite.org.au/conferences/sydney06/proceed
ing/pdf_papers/p132.pdf
Wheeler, S., & Wheeler, D. (2009). Using wikis to promote quality learning in teacher
training. Learning, Media and Technology, 34, 1–10.
Woo, K., Gosper, M., McNeill, M., Preston, G., Green, D., & Phillips, R. (2008). Web-based
lecture technologies: Blurring the boundaries between face-to-face and distance learning.
ALT-J, 16, 81–93.
Wyatt, T. H., Krauskopf, P. B., Gaylord, N. M., Ward, A., Huffstutler-Hawkins, S., & Goodwin,
L. (2010). Cooperative m-learning with nurse practitioner students. Nursing Education
Perspectives, 31, 109–113. Retrieved from http://www.nlnjournal.org/doi/abs/10.1043/
1536-5026-31.2.109
Xie, Y., Ke, F., & Sharma, P. (2008). The effect of peer feedback for blogging on college
students’ reflective learning processes. The Internet and Higher Education, 11, 18–25.
Zorko, V. (2009). Factors affecting the way students collaborate in a wiki for English
language learning. Australasian Journal of Educational Technology, 25, 645–665.