
The undergraduate experience of blended e-learning: a review of UK literature and practice

Authors: Rhona Sharpe, Greg Benfield, George Roberts, Richard Francis

Abstract

A team of researchers from the Oxford Centre for Staff and Learning Development (OCSLD) has completed a review of the UK literature and practice relating to the undergraduate experience of blended learning. The study aimed to review existing research and practice on blended e-learning, identify key studies and issues, and make recommendations to guide future policy and practice. The review team combined traditional desk research with institutional visits and interviews with key personnel. The review report addresses the current meanings of 'blended learning' across the sector, the underlying institutional rationales for blended learning, and the monitoring and evaluation strategies being adopted for ensuring and enhancing the quality of blended e-learning. The review has found that the student response to the provision of online information to supplement traditional teaching is overwhelmingly positive. It is clear from the uptake of this area of technology by institutions, the rise of the use of the term 'blended learning' and the number of evaluative studies identified in the review, that institutions and practitioners are attempting to engage with blended learning and are doing so successfully.
The undergraduate experience of blended e-learning: a
review of UK literature and practice
Rhona Sharpe, Greg Benfield, George Roberts, Richard
Francis
Executive summary
The review of blended e-learning was undertaken by the Oxford Centre for Staff and
Learning Development at Oxford Brookes University for the Higher Education
Academy. The aim was to review existing research and practice on blended e-
learning, identify key studies and issues, and make recommendations to guide future
policy, practice and research.
Methodology
A key aim was to ensure that the review findings would be grounded in practice and
relevant to the needs of the Academy’s audience. Consistent with this aim, we
adopted a methodology that combined traditional desk research with institutional
visits and interviews with key personnel.
The review of over 300 studies of blended learning aimed to reveal methodologically
sound evidence of the impact of blended learning on the student experience. We
used a best evidence synthesis to identify the key papers with the aim of creating a
manageable knowledge base for the synthesis. The following inclusion criteria were
used in the selection of key studies: published since 2000; scenarios which blend
technology with face to face teaching; experiences of undergraduates; representative
of UK learning environments; clear rationales and/or objectives; embedded (or becoming
embedded) applications; evaluation of the learner’s experience; and a justified and
rigorous evaluation methodology.
The institutional visits and interviews aimed to give access to unpublished literature
and to reveal practices that we could not know about as ‘outsiders’. Interviews were
conducted with seven institutions with reputations as long standing implementers of
blended e-learning. The group represented a range of institutions including post-92
universities, research-intensive universities, and institutions with a sharp metropolitan
focus or those serving a broader, regional area. We identified five attributes that were
part of the blended learning agenda for some institutions: widening participation,
enhancing learning, flexibility of provision, prominent e-learning early adopters, and
computer aided assessment as a dominant feature.
What is blended learning?
Throughout this review we have been keen to find evidence of how the potential
offered by technology is actually being interpreted and used by institutions, their staff
and their students. We avoided reaching our own definition, noting instead eight
dimensions implicit in the definitions we found: delivery, technology, chronology,
locus, roles, pedagogy, focus and direction.
From the institutional visits and the review of course evaluations, we observed that
there were three ways in which the term ‘blended learning’ was being used. Currently
the most common type of blended learning is the provision of supplementary
resources for courses that are conducted along predominantly traditional lines
through an institutionally supported virtual learning environment. Second, we found
some, but far fewer, impressive examples of transformative course level practices
underpinned by radical course designs. These often make use of technology to
facilitate interaction and communication and replace other modes of teaching and
learning. Third, we are aware of students taking a holistic view of the interaction of
technology and their learning, including the use of their own technologies, although
this is currently under reported and under researched in higher education.
What underlying rationales are being used for promoting
blended e-learning?
Institutional rationales for blended e-learning were highly contextualised and specific
to each institution. They included: flexibility of provision, supporting diversity,
enhancing the campus experience, operating in a global context and efficiency.
A few course level rationales related to institutional strategy, particularly offering
flexibility in time and place of learning. However, most rationales at this level were in
response to practical challenges being faced by staff and/or in response to student
feedback (loss of staff-student contact, large classes, inconsistency in quality and
quantity of feedback between markers) as well as responding to the demands of
professional bodies in vocational courses. The rationale reported most frequently by
local implementations was maintaining quality in response to increasing cohort sizes.
What monitoring and evaluation strategies are being adopted
for ensuring and enhancing the quality of blended e-learning?
All seven institutions we visited described current plans for initiating institutional
monitoring and evaluation strategies to assess their students’ experiences. All were
finding it problematic to establish institutional-level practices. We suggest this was due
to the pressure to implement rather than evaluate, the low status of pedagogic
research, and poorly defined measures of institutional success in embedding blended
e-learning. All institutions welcomed the opportunity to share approaches both
through this review and the Academy’s Benchmarking e-Learning project. In
response to the requests for support around evaluation, we have taken the
opportunity to highlight examples of suitable approaches and techniques and make
recommendations for those wanting to undertake their own evaluations.
Despite the difficulties around establishing institutional level monitoring and
evaluation strategies, we identified and described a number of effective practices at
various levels of operation in universities: regular module evaluations being used to
inform departmental action plans, triangulated evaluations of students’ VLE use, and
institutional support for the collection and dissemination of case studies. We also
suggested the promotion of pedagogic research both within institutions and for
assessing the impact of course redesigns and drew on the pedagogic literature to
make recommendations about conducting such research, concerning triangulation of
data, collection of rich data and planning longitudinal and ethnographic studies.
Finally, we noted the importance of making use of the findings of evaluations in
course planning and redesign and noted examples where this had been achieved.
What impact is blended e-learning having on the student
experience?
We find that the student response to the provision of online course information to
supplement traditional teaching is overwhelmingly positive. Students make regular and
frequent use of electronic resources with few reported problems of access. They
particularly value flexible access both from home and on-campus. The impact of the
provision of course notes is discussed in relation to support for students with
disabilities and the possible impact on attendance. Students are concerned about the
costs associated with downloading and printing and are critical of inconsistent use
between staff and modules.
We find from evaluations of redesigned courses that, while students recognise the
value in the blend of face to face and technology supported activities, there are large
individual differences in how they experience the blend. It seems to be important that
students understand the role of technology in their learning and the implications for
their study strategies and engagement in learning activities.
There is an increasing recognition that students are making use of their own
technologies as well as those provided for them, and that they are doing this in ways
that are not planned for, difficult to predict and may not be immediately visible to their
teachers and researchers. Taking a holistic view requires an understanding of the
individual. We found that where there is a significant individual difference such as
disability or culture, this dominates the experience for the individual, although not in
ways that are stable or predictable. Other individual differences which seem to be
important are prior experience and attitudes towards using computers within learning.
What are the success factors for blended e-learning?
Drawing out themes emerging throughout the previous sections of the review, we
make the following recommendations for successful blended e-learning in practice:
Use the term blended learning. Although difficult to define, the term ‘blended
learning’ is finding acceptance among higher education staff. We suggest that
the advantages of the term include its poor definition - which allows staff to
negotiate their own meaning - the implication of the protection of face to face
teaching, and the implication of designing for active learning.
Work with and within your context. We found that institutions that we had
identified as successful implementers of blended e-learning had highly
contextualised and specific rationales for their adoption of technology.
Similarly, successful local implementations were often in response to real,
relevant issues occurring at the course level.
Use blended learning as a driver for transformative course redesign. The
importance of transformative course level designs was identified as one of
three characterisations of blended e-learning. Throughout the review, staff
repeatedly identified engaging in course redesign as critical to their success.
The valuable features of the course redesign were identified as: undertaking
an analysis of the current course, collecting and making use of student
feedback, undertaking the design as a team, designs which make explicit
their underlying principles, and developing the course iteratively over a
number of years.
Help students develop their conceptions of the learning process. It seems to
be important how students conceive of their engagement with the learning
processes and activities within a blended e-learning context. In order to
support students, it is vital that we are consistent and transparent in
communicating our expectations about, for instance, attendance patterns or
how to engage in purposeful dialogue in asynchronous discussions.
Disseminate and communicate results of evaluations. The need to co-
ordinate, promote and disseminate results from evaluations was identified as
a crucial aspect of monitoring institutional strategies and course redesigns.
Contents
1 INTRODUCTION
1.1 Overview
1.2 Aims
1.3 Methodology
1.3.1 Literature search strategy
1.3.2 Database entry
1.3.3 Identification of key papers
1.4 Institutional visits
1.4.1 The process of selecting institutions
1.4.2 Interviews
1.5 Reflections on the process
2 DEFINITIONS AND USES OF BLENDED LEARNING
2.1 Overview
2.2 Use of the term blended learning
2.3 Dimensions of blended learning: towards a meaning
2.4 Three characterisations of blended e-learning
2.4.1 The provision of online supplementary resources
2.4.2 Transformative course level practices underpinned by radical course designs
2.4.3 A holistic view of technology use to support learning
3 RATIONALES FOR BLENDED E-LEARNING
3.1 Overview
3.2 Institutional rationales for blended learning
3.2.1 Flexibility of provision
3.2.2 Supporting diversity
3.2.3 Enhancing the campus experience
3.2.4 Operating in a global context
3.2.5 Efficiency
3.3 Course rationales for blended e-learning
3.3.1 Blended course designs for large group teaching
3.3.2 Engaging students out of class
3.3.3 Developing professional skills
3.4 Educational rationales for blended e-learning
3.4.1 Blends to promote associative learning
3.4.2 Blends to promote constructivist learning
3.4.3 Blends to promote situative learning
4 EVALUATING AND MONITORING BLENDED E-LEARNING
4.1 Overview
4.2 Approaches to institutional evaluations
4.2.1 Institutional evaluations in the bedding in phase
4.2.2 Institutional evaluations of VLEs in use
4.2.3 Use of course review in institutional evaluation
4.2.4 Use of case studies in institutional evaluations
4.2.5 Recommendations for institutional evaluation
4.3 Approaches to course evaluations
4.3.1 Evaluations of the impact on student performance
4.3.2 Evaluations to understand the learner experience
4.3.3 Evaluations leading to course improvements
4.4 Evaluating the holistic learner experience
4.4.1 Longitudinal evaluations
4.4.2 Ethnographic evaluations
5 THE LEARNER EXPERIENCE OF BLENDED E-LEARNING
5.1 Overview
5.2 The student experience of online supplementary course resources
5.2.1 Flexible access from home and campus
5.2.2 Access to course notes and note taking
5.2.3 Impact on patterns of attendance
5.2.4 Inconsistency in use between staff and modules
5.3 The student experience of blended courses redesigned to promote interaction and communication
5.3.1 Experience of online communication and collaboration
5.3.2 Experience of classroom communication systems
5.3.3 Experience of interaction with content
5.3.4 Experience of online assessment and feedback
5.3.5 The student experience of emerging technologies
5.4 The holistic experience of blended e-learning
5.4.1 International students
5.4.2 Gender
5.4.3 Disability
5.4.4 Learning style
5.4.5 Experience of and attitudes towards technology
6 CONCLUSIONS AND RECOMMENDATIONS
6.1 Critical success factors for blended e-learning
6.1.1 Use the term ‘blended learning’
6.1.2 Work with and within your context
6.1.3 Use blended learning as a driver for transformative course redesigns
6.1.4 Help students develop their conceptions of the learning process
6.1.5 Disseminate and communicate results of evaluations
6.2 Recommendations for policy and practice
6.3 Recommendations for future research
REFERENCES
7 APPENDICES
7.1 Appendix 1 - Endnote keywords v 4.6
7.2 Appendix 2 - Example of application of inclusion criteria
7.3 Appendix 3 - Semi-structured interview schedule
1 Introduction
1.1 Overview
This review of blended e-learning has been undertaken by the Oxford Centre for Staff
and Learning Development (OCSLD) for the Higher Education Academy. The aim
was to review existing research and practice on blended e-learning, identify key
studies and issues, and make recommendations to guide future policy, practice and
research. We were keen to ensure that the review findings would be grounded in
practice and relevant to the needs of the Academy’s audience. Consistent with these
aims we adopted a methodology that combined traditional desk research with
institutional visits. This methodology allows us to describe how the national e-
learning agenda and research findings are being interpreted by institutions and their
staff and to analyse the impact for their students.
1.2 Aims
Recent years have seen a dramatic increase in both the uptake of e-learning within
higher education and research into its impact for institutions, practitioners and
students. We are now at a point where 95% of higher education institutions are
operating at least one virtual learning environment (JISC, 2005a). Browne and
Jenkins (2003) reported that the use of VLEs was predominantly supplementary to
face to face teaching. It is this blend of e-learning and face to face teaching which is
becoming increasingly prevalent (the nature of the blend is further explored in
Chapter 2). In the UK, in response to input from across the post-16 education sector,
the HEFCE Strategy for e-Learning supports and codifies the prevalence of face to
face teaching blended with e-learning (HEFCE, 2005). Blended learning is on the rise in
the UK, and reviews of practice in North America (Bonk, Kim and Zeng, 2006) and
Australia (Eklund, Kay and Lynch, 2003) predict that it will rise further.
In light of such an expansion of e-learning and the predicted future rise when used in
combination with face to face teaching, there are already attempts underway to
review the existing research. There have been reviews of e-learning in disciplines
such as language learning (Chapelle, 2004; Felix, 2005) and technologies including
e-portfolios (Beetham, 2005a), computer aided assessment (Conole and Warburton,
2005) and interactive whiteboards (Smith, Higgins et al., 2005) which make
recommendations for use in these specific contexts and propose areas for
further investigation. There is a need for a review which is more wide-ranging than
these studies of specific technologies.
Such wide ranging reviews of e-learning have taken place in other sectors of
education. Higgins (2003) undertook a systematic review of ICT in schools for the
British Educational Research Association. He reports that while ICT can improve
learning (through increasing time on task, providing feedback, demonstrating
complex processes visually and prompting discussion in class), it can also be used
inappropriately. Higgins concludes that it is how ICT is used that makes the
difference and this is largely dependent on the actions of the teacher in how they
select and organise resources and integrate them into their teaching. This is perhaps
an unsurprising finding but it is interesting in terms of developing aims for the current
review. There appears to be little value in another review which asks ‘do blended
approaches improve learning?’ and which will predictably give an answer ‘it
depends’. Our review questions are more open, asking, for example, what is the
impact of blended learning for students? Second, if impact is so dependent on the
individual teacher, it raises the question of the potential for difference between what
is being reported in the educational literature and what is actually happening within
schools, colleges and universities. The current review aims to present data from the
literature alongside that from unpublished and internal sources which accurately
represent current practice.
Reviewing studies of practice raises questions about how the research has been
conducted. Reviews of the e-learning literature in the USA (Sunal et al., 2003) and
Australia (Eklund et al., 2003) both reported that most research is case study based.
Eklund et al. note the frequent use of action research by researcher-practitioners,
which improves local practice but which, as a methodology, makes it difficult to draw
more general conclusions. This review will identify evaluative studies which have used
methodologies from which we are able to generalise and disseminate their findings.
In addition, in order to promote evidence-informed practice, the review will promote
the use of rigorous evaluative practices.
A final aim arising from previous reviews concerns the rationales for adopting
blended learning. While there has been widespread publication of the potential
benefits of e-learning, and more recently, blended learning, it is not yet clear how
practitioners and their institutions are choosing to make use of these approaches. In
a review of North American practices, Curt Bonk offered the review team 40 possible
problems (e.g. student absenteeism, lack of facilities) and their blended learning
solutions, but we don’t yet know which of these 40 reasons practitioners in the UK
are using (Bonk, 2006). We would hope that alongside such practical issues our
understanding of how students learn informs our choices. However, Cullen et al.
(2002), in a review of pedagogic research for the Economic and Social Research
Council, have been critical of the use of theory to inform pedagogic practices, finding
that examples of good practice "are either grounded in the day to day minutiae of
‘chalkface’ learning delivery (and hence ungrounded in theory) or, conversely, are
tied to a particular ‘grand learning theory’ and are unsubstantiated in practice.” (p.
73). The current review will attempt to identify practical and educational rationales
used by practitioners to underpin their choices around blended learning in practice.
Institutional rationales are also of interest. In a JISC commissioned survey of virtual
learning environment (VLE) use within 212 higher education institutions (HEIs), it was
found that uses of VLEs which are expected to impact on student learning, such as
communication, discussion and assessment, were the reasons most frequently given
for why HEIs are adopting VLEs (JISC, 2005a). However, using institutional case
studies, Bricheno, Higgison and Weedon (2004) found that these uses are actually
much less common in practice. We note this contradiction between aspiration and
practice and will use the visits to explore this further.
1.3 Methodology
We adopted a pragmatic approach for this study, after Tashakkori and Teddlie
(1998), where the research questions and study aims determine the methods used.
The review of the published, peer reviewed research on blended e-learning was
intended to reveal rationales for adopting blended e-learning and a selection of key
studies would provide evidence of the impact of blended e-learning on the student
experience. We also purposively sampled institutions to visit on the basis of their
reputation for blended e-learning and used the visit/interview methodology to both
record the published institutional policies and strategies and go beneath them to
explore how they are being understood, articulated and implemented by their staff.
The following research questions were derived from the aims to guide the review:
a) How is the term ‘blended learning’ being used in higher education?
b) What are the underlying rationales being used for blended e-learning?
c) What monitoring and evaluation strategies are being adopted for ensuring
and enhancing the quality of blended e-learning?
d) What impact is blended e-learning having on the student experience?
e) What are the success factors for blended e-learning?
1.3.1 Literature search strategy
We first established a plan for the search (Hart, 1998) by selecting a set of databases
and journals to search and developing a set of search terms.
The following databases were searched:
British Education Index
Australian Education Index
ERIC
British Nursing Index
Education Online
Google & Google Scholar.
The following terms were used in combination in the database searches:
Core terms:
ICT
C&IT
Educational technology
E-learning / eLearning
Blended learning
Mixed mode learning
Hybrid models of learning
Virtual learning environment
Specifiers:
+ pedagogy
+ student experience
+ learner experience
+ evaluation
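To illustrate how core terms and specifiers were combined, the sketch below is our own illustration rather than part of the original search protocol; the Boolean query syntax is an assumption, since each database has its own search interface.

```python
# Illustrative sketch only: generating combined query strings from the
# core terms and specifiers listed above. The Boolean syntax accepted by
# each database is an assumption.
core_terms = [
    "ICT", "C&IT", "educational technology", "e-learning",
    "blended learning", "mixed mode learning",
    "hybrid models of learning", "virtual learning environment",
]
specifiers = ["pedagogy", "student experience", "learner experience", "evaluation"]

queries = [f'"{term}" AND "{spec}"' for term in core_terms for spec in specifiers]

for query in queries[:4]:  # print a few example combinations
    print(query)
```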
In addition, the following publications were hand searched for relevant articles:
ALT-J, Research in Learning Technology
American Journal of Distance Education
British Journal of Educational Technology
Computers and Education
Educational Media International
Educause Quarterly
E-learning and education
Electronic Journal of e-Learning
Innovations in Education and Teaching International
Internet and Higher Education
Journal of Computer Assisted Learning
Networked Learning Conference Proceedings
Teaching in Higher Education
Studies in Higher Education.
We put out calls on the Heads of e-Learning Forum and Joint Information Systems
Committee (JISC) e-Learning and Pedagogy Experts email distribution lists asking
for suggestions for articles to include. We used our personal contacts to access
unpublished literature. We also received submissions of papers in response to the
project website.
Finally, we used some of the funding for Curtis Bonk to visit us while he was in the
UK and provide an overview of the North American literature and practice on blended
learning.
1.3.2 Database entry
This search strategy produced more than 300 publications. Each one was entered
into an Endnote database with the following fields:
Type (Journal article, Book, Book section, Conference proceedings, Edited
Book, Report or Electronic source)
Author
Date
Title
Source (e.g. for a Journal Article, journal name, volume, issue and pages)
Abstract
Notes (reader’s own summary of the important points about the document)
Label (institution name, public/private, country, language, type of evidence)
Keywords (see below)
Research notes (note here if this is a ‘key paper’)
URL
Link to PDF.
A set of keywords was founded on the five basic research questions (see Appendix
1) with subcategories and descriptors built up as we included more documents in the
database. These keywords enabled us to search the database for the relevant
literature on each research question and within each subcategory.
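As a purely hypothetical illustration of this structure (the field and keyword names follow the lists above, but the sample record and the helper function are invented; the project's actual Endnote database is not reproduced here), keyword-based retrieval can be sketched as follows.

```python
# Hypothetical illustration of the review database and keyword search.
# Field names mirror the list above; the record contents are invented.
records = [
    {
        "type": "Journal article",
        "author": "Example, A.",
        "date": 2004,
        "title": "An illustrative evaluation of a blended course",
        "keywords": ["learner experience", "evaluation"],
        "label": "UK",
        "key_paper": False,
    },
]

def by_keyword(entries, keyword):
    """Return entries tagged with the given research-question keyword."""
    return [entry for entry in entries if keyword in entry["keywords"]]

print(len(by_keyword(records, "learner experience")))  # -> 1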
1.3.3 Identification of key papers
In the bid we proposed that rather than aiming for a very comprehensive review, a
valuable function of this review would be to make visible existing research, policy and
practice which has transformed the student experience, been embedded over a
number of years, and been thoroughly evaluated.
We used a best evidence synthesis to identify the key papers with the aim of creating
a manageable knowledge base for the synthesis (after Slavin, 1986). Slavin
recommends that reviewers apply consistent, well justified, and clearly stated a priori
inclusion criteria. We used the research questions and early searches as a starting
point to identify the inclusion criteria and developed them as the review progressed.
The following inclusion criteria were used in the selection of key studies:
a) Publications since 2000
From the relentless e-mail, to student and staff portals, to virtual learning
environments (VLEs) and computer aided assessment, daily practice in 2006 is
quite different from how it was even five years ago. The uptake of VLEs by HEIs
has happened over a short period, rising from 7% of HEIs in 1997 to 81% in 2001,
86% in 2003 and 95% in 2005 (Armitage, Brown and Jenkins, 2001; Browne and
Jenkins, 2003; JISC 2005a). This period has also seen a shift from technologies
supporting distance learning to technologies supporting face to face teaching.
Technologies change quickly and teachers are constantly experimenting with
their use. As the review aimed to be representative of the UK undergraduate
experience, we favoured those studies which drew on recent data and excluded
those drawing on data prior to 2000.
b) Scenarios which blend technology with face to face teaching
Our remit was to describe the experiences of students in blended learning
situations in on-campus universities. We excluded the large literature on the use of
technologies to support distance learning.
c) Scenarios from undergraduate experiences
A good deal of the current advice on e-learning has stemmed from research
conducted with postgraduate and/or professional development courses (notably
Salmon 2002, 2004; McConnell, 2000, 2005). Such participants are likely to be
reflective, articulate, interested and informed about teaching and learning
processes. As we worked towards an appreciation of the importance of students
understanding their own learning as a possible success factor, it became
important to distinguish between undergraduate and postgraduate learners.
d) Scenarios which are representative of UK learning environments
We have drawn mainly on research from the UK. Where this is lacking, we have
used research from other countries with similar teaching and learning
environments and processes, and we have indicated where they are from.
e) Applications with clear rationales and/or objectives
At the institutional level, this might be strategic developments or institutional level
frameworks for supporting blended e-learning. At the course level, applications
should have a clear and explicit rationale and/or pedagogical framework that aims
to transform the student experience.
f) Applications that are, or are becoming, embedded
Many studies of the use of technology are reports on innovative approaches that
are frequently led by an enthusiastic teacher. Hughes and Daykin (2002) noted
that many studies of students’ perceptions of e-learning had been carried out by
skilled and enthusiastic lecturers with small groups of volunteer students and ‘a
lecturer’s enthusiasm for online delivery may have created a form of Hawthorne
effect, where students’ evaluations may have been affected by this enthusiasm
rather than the delivery.’ (p. 217).
In a historical comparison of the literature on electronic voting systems pre and
post 2002, Simpson and Martin (forthcoming) note that more recent papers, which
examine established course developments where existing practice is being
refined and enhanced, are likely to be more useful in helping us understand the
impact of technology than reports of the common problems associated with first use. We found
that studies of first uses of technology frequently report issues which have
already been well established e.g. encouraging uptake of computer aided
assessments through some summative assessment (Enjelvin, 2005), the need for
preparation and induction for the use of electronic materials (Davies et al., 2005),
the distribution of handsets for electronic voting systems (Reay et al., 2005).
g) Evaluations of the learner’s experience
We have included work which identifies what makes a difference to the student
learning experience or leads to a better understanding of what influences it. The
student view is important to represent because it offers an alternative, and
sometimes contradictory, perspective to that of tutors. For example, Timmis et al.
(2004a) in an evaluation of a blended learning scenario observe that ‘In general,
the tutor’s views of how students were communicating through the VLE was
positive and collaborative. They reported that students were sharing teaching
strategies and that they were able to encourage deeper thinking in the discussion
board’ (p.7). The interviews with students did not reflect this view. Students saw
the discussion board as a forum for maintaining communication when they were
on their placements.
h) Evaluations with a justified and rigorous methodology
While we acknowledge the role of descriptions of practice in changing teachers’
practice, we are interested here in making recommendations for evidence-
informed practice. We have used anecdotes and quotes from learners wherever
we have found them. However, our common themes and recommendations stem
from studies which are thorough in their evaluation methodology. This includes
piloting data collection techniques, ensuring that samples are sufficiently
representative or purposively sampled, that claims for difference are supported
statistically, and that qualitative data are analysed methodically. We favoured
evaluations that were triangulated i.e. that made use of data from a variety of
times, methods and sources.
Each database entry was assessed against the inclusion criteria (see Appendix 2 for
example). The application of the inclusion criteria produced a limited set of key
papers which were given priority in the synthesis of the following chapters.
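A minimal sketch of how a record might be screened against these criteria is given below; the field names are our own shorthand for criteria (a) to (h), not the screening instrument itself (an example of the actual application of the criteria is given in Appendix 2).

```python
# Illustrative sketch: screening a database record against simplified
# versions of inclusion criteria (a)-(h). Field names are invented shorthand.
def meets_inclusion_criteria(record):
    checks = [
        record["year"] >= 2000,                  # (a) published since 2000
        record["blends_face_to_face"],           # (b) blends technology with face to face teaching
        record["undergraduate"],                 # (c) undergraduate experiences
        record["uk_like_context"],               # (d) representative of UK learning environments
        record["clear_rationale"],               # (e) clear rationale and/or objectives
        record["embedded"],                      # (f) embedded, not a first-use innovation
        record["evaluates_learner_experience"],  # (g) evaluation of the learner's experience
        record["rigorous_methodology"],          # (h) justified and rigorous methodology
    ]
    return all(checks)

database = []  # in practice, populated from the review database
key_papers = [r for r in database if meets_inclusion_criteria(r)]
```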
1.4 Institutional visits
The peer reviewed literature could only provide part of the picture. Additional
evidence of practice is available at institutional web sites, often including institutional
strategies and case studies of good practice. Such 'front of stage' documents can
offer templates and examples of practice for other practitioners to use or adapt to
their own needs. They often lack the historical and contextual information necessary
for deciding what might be effective practice or transferable to other contexts. They
also tend to be sanitised for public consumption. The valuable 'back stage' or 'under
stage' histories of successes, failures and conflicts relating to institutional practices
largely reside behind institutional firewalls, inside intranets and in the collective
social memories of the communities of practice that brought them into being (Lave
and Wenger, 1991; Wenger, 1998). To gain access to such material we selected a
small number of institutions for visits and interviews.
The visits aimed to collect internal documents such as e-learning strategies,
institutional learning, teaching and assessment strategies, evidence of internal
evaluations of blended learning, and examples of internal quality assurance
procedures. Each visit involved a discussion with representatives of e-learning within
the institutions. These interviews aimed to give the institutional representatives an
opportunity to explain to us some of the thinking and history behind their institution’s
public policies and practices, and to reveal practices that we could not know about as
‘outsiders’.
1.4.1 The process of selecting institutions
Given that the scope of the project would enable a maximum of eight institutional
visits, our sampling strategy was to gather a variety of illustrative examples of interest
to the sector, rather than a representative sample of it. Since it is generally preferable to learn
from experience of success, our first selection criterion was that the visit institutions
should be perceived within the sector as relatively long-standing and successful
implementers of blended learning. In addition, we aimed to represent a range of
institutions including post-92 universities, research-intensive universities, institutions
with a sharp metropolitan focus or those serving a broader, regional area.
Having conducted an initial scan of the strategies of a list of long-standing
implementers of blended learning we identified five attributes that were part of the
blended learning agenda for some institutions:
1. Widening participation
2. Enhancing learning
3. Flexibility of provision
4. Prominent e-learning early adopters
5. Computer-aided assessment as a dominant feature
Our list of possible candidates for visits was too long so the final criterion was an
exclusion one: several institutions were not selected because there was already
sufficient material about their experience in the public domain.
Based on these selection criteria, we took a shortlist of 22 institutions to the project
steering group and shortly after this agreed on eight institutions we wanted to visit.
Having conducted such a thorough selection process, we were disappointed to only
be able to arrange visits to four of these. The other three interview sites were chosen
from those who approached us to get involved in order to balance our final sample.
Table 1.1 shows the institutional attributes represented in the group interviewed and
the alias adopted for each institution.
Table 1.1 Institutional visit criteria

Institution   1st Attribute              2nd Attribute              3rd Attribute
Blackwater    Regional/local focus       Related to flexibility     Enhancing learning
Deepshire     Old U/Research intensive   Enhancing learning         Regional/local focus
Eastonhall    Regional/local focus       Old U/Research intensive   Widening participation
Kilderhill    Enhancing learning         Regional/local focus       Related to flexibility
Longside      Prominence in sector       Enhancing learning         Regional/local focus
Metroville    Widening participation     CAA                        Regional/local focus
Westington    Widening participation     Regional/local focus       Prominence in sector
1.4.2 Interviews
Because our methodology involved site visits and interviews, in accordance with
good practice we subjected our plan to ethics review and gained Oxford Brookes
University Ethics Committee approval. Guaranteeing confidentiality of the data is the
best way to minimise any risk to participants due to their remarks being identified
publicly (Bogdan and Biklen 1982). To encourage participants to speak more freely to
us than they might be able to do as public representatives of their institutions we
informed all invitees that we would de-identify all the data we collected from our visits
by default, using institutional and individual aliases for public reporting. We
nevertheless wanted to offer everyone involved the fullest measure of informed
participation that we could afford them, so pre-publication draft reports were
distributed to participants who were offered the opportunity to be identified in the
final, published reports if they wished.
Two members of the research team attended each institutional visit, except on one
occasion when this was not possible and only one attended. We invited each
institution to send at least two representatives to meet with us on each visit,
recommending ideally the inclusion of a senior manager responsible for e-learning at
a strategic level and someone with an institutional implementation role, such as an
educational developer or faculty e-learning champion.
It cannot be inferred that the views of the participants in these visits are in some way
representative of their institutions as a whole. Each participant gave their views from
their own perspective. They recognised that others within their institutions might have
a different view of the matters we discussed. Indeed, most of our visits were typified
by high levels of unanimity punctuated by the occasional lively, collegial debate.
Table 1.2 shows the roles of the institutional representatives who met with us on
each visit. All included representation at a senior planning level (a Head of e-
Learning or equivalent or a Head of Teaching Quality Enhancement). One institution
sent just one representative due to an unexpected crisis that needed to be attended
to by the Head of Learning Technology, who had planned to be there. One of the
visits was conducted by videoconference and documents were exchanged by email.
Table 1.2 Institutional representation at visits

Institution   Institutional role 1                   Institutional role 2          Institutional role 3    Institutional role 4
Blackwater    Head of e-learning or equiv            Educational developer         NA                      NA
Deepshire     Head of e-learning or equiv            Educational developer         NA                      NA
Eastonhall    Head of Teaching Quality Enhancement   Educational developer         NA                      NA
Kilderhill    Head of e-learning or equiv            Head of Learning Technology   NA                      NA
Longside      Head of e-learning or equiv            Head of Learning Technology   Educational developer   Faculty champion
Metroville    Head of Teaching Quality Enhancement   NA                            NA                      NA
Westington    Head of e-learning or equiv            Faculty champion              NA                      NA
Piloting of interview schedules contributes to rigour (Cohen et al., 2000). We
conducted a first visit with the University of Westington where we piloted a semi-
structured interview schedule based on our five research questions. After evaluating
the effectiveness of the interview items there, we made some minor changes to the
interview schedule. This was distributed to each institution before the visit so that
they could prepare for the discussion (See Appendix 3). Each interview was audio
recorded and transcribed. The transcripts were subsequently reviewed against the
recorded interview by a member of the research team, who made corrections as
necessary. To ensure accuracy and enable each participant to give informed consent
throughout the project, corrected transcripts were returned to participants so that they
could make corrections, or indeed deletions if they wished any remarks to be ‘off the
record’.
The transcripts were coded using the NVivo qualitative data analysis software. To
ensure that the interview data could supplement the literature search, the coding
system we adopted was to apply the keywords that categorised the entries in our
literature search database. Thus, the first level of coding categorised utterances
relevant to one or the other of the five research questions. Then, second and third
level subcategories were used to describe the aspect of the research question that
was being addressed. As should be expected given the nature of this data, several
new sub-categories were created to code utterances covering elements that had not
emerged in the literature search.
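As an illustration of the resulting structure (the node labels below are paraphrased examples, not the project's actual NVivo nodes), the coding tree can be thought of as research questions at the top level, subcategories beneath them, and coded utterances at the leaves.

```python
# Illustrative sketch of the hierarchical coding tree described above.
# Node labels are paraphrased examples, not the project's actual NVivo nodes.
coding_tree = {
    "RQ (b): rationales for blended e-learning": {
        "institutional rationales": {
            "flexibility of provision": ["<coded utterance 1>", "<coded utterance 2>"],
            "supporting diversity": [],
        },
    },
    "RQ (c): monitoring and evaluation strategies": {
        "institutional evaluations": {
            "use of case studies": [],
        },
    },
}

def count_coded_utterances(node):
    """Count coded utterances anywhere below a node in the tree."""
    if isinstance(node, list):
        return len(node)
    return sum(count_coded_utterances(child) for child in node.values())

print(count_coded_utterances(coding_tree))  # -> 2
```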
1.5 Reflections on the process
The searching of the literature revealed over 300 relevant papers published since
2000, of which only 14 were finally adopted as ‘key papers’. In terms of our a priori
inclusion criteria, we noted that most papers were rejected because they were
reports on innovations in their first year of operation and we make recommendations
later in this report about the need for longitudinal research (see Chapter 4). We were
reassured to find many well elaborated and clear rationales for blended learning
presented by practitioners in the literature and discuss these in Chapter 3.
Arranging and conducting the interviews has been very time consuming. We do think
it was worthwhile in that each visit has yielded internal documents - primarily e-
learning strategies, learning and teaching strategies or internal evaluations of e-
learning - that are not publicly available. In addition, we found that the
transcribed and processed data arising from the interviews provided us with an
expansive and rich dataset concerning institutional processes aimed at embedding of
blended learning. The data has required significant analysis and interpretation which
is ongoing.
2 Definitions and uses of blended learning
2.1 Overview
The first research question this project aimed to investigate was how the term
‘blended learning’ is being used in higher education. The literature review
demonstrated the difficulty that others have had in reaching a consensus around a
definition. We noted from the interviews that some institutions have developed their
own language, definitions or typologies to describe their blended practices. We
suggest that this poor definition may be a strength and part of the reason why the
term is being accepted. The lack of definition allows institutions to adapt and use the
term as they see fit, and to develop ownership of it.
Rather than offer another insufficient definition, we synthesised eight dimensions that
embrace the possibilities of blended learning:
delivery - different modes (face-to-face and distance education)
technology - mixtures of (web based) technologies
chronology - synchronous and asynchronous interventions
locus - practice-based vs. classroom-based learning
roles - multi-disciplinary or professional groupings
pedagogy - different pedagogical approaches
focus - acknowledging different aims
direction - instructor-directed vs. autonomous or learner-directed learning.
Moving from blended learning to our focus for this review of blended e-learning, we
were keen to find evidence of how the potential offered by technology is actually
being interpreted and used by institutions and their staff. Drawing together the data
from the institutional visits and the database of publications, we observed three broad
characterisations for blended e-learning as it is practised today:
The provision of supplementary resources for learning programmes that are
conducted along predominantly traditional lines, through institutionally
supported virtual learning environments.
Transformative course level practices underpinned by radical course designs
which often make significant use of technology to replace other modes of
teaching and learning.
A holistic view of technology and learning, including the use of the learners’
own technologies to support their learning.
2.2 Use of the term ‘blended learning’
To date, the differences and similarities between online, traditional distance and
physical-based teaching have been little understood, leading to confused
notions of the panacea of “blend”. (Salmon, 2005, p. 202)
In our own experience we observe that the term blended learning has been in use for
more than 15 years but that its meaning has been constantly changing during this
period. In the late 1980s the Workers Education Association (WEA), Ruskin College
and the Open University collaborated on what was called a blended learning
programme for adults, without qualifications, returning to education (Moore and
Bryant, 1989). The programme blended evening/weekend courses with residential
learning sessions and distance learning support. The general discussion around
course development at the Open University at the time used the term ‘blended
learning’ to signify a mix of distance and face-to-face learning. Summer schools and
monthly tutorial-group meetings were typical face to face interventions in a
predominantly distance learning mix. Through the 1990s the corporate training world
spoke of blended learning as enhancements to the typical corporate training
intervention: the short course.
Corporate researchers and practitioners noted that technology enhanced
learning alone was not enough, arguing that people needed experiential
learning for the mastery and retention of knowledge and skills achieved
through the blending of technology and face-to-face interaction. (Matheos et al.,
2005)
Short courses were blended with pre-course readings and post course activities such
as action-learning sets and project-based learning teams to embed the learning in
the workplace (Roberts et al., 1996). Short course participants also received
electronic materials (e.g. spreadsheet-based project finance models, trading
simulations, technical process modelling, etc.), on portable media, initially floppy disk
and later CD and eventually through web services. At the turn of the century web-
based distance learning and training was being blended back with supplementary
printed manuals and optional face to face seminars ‘at a location near you’. So, when
commentators today say, for example, ‘One of the next new terms to dazzle us in
technology enhanced education is ‘blended learning’’ (Smith, 2001) we can read it
with a strong sense of deja vu.
Even today, the term seems to retain both novelty and usefulness, with the University
of Hertfordshire awarded a Centre for Excellence in Teaching and Learning for its
Blended Learning Unit.
It has been suggested that the term remains in use because it is ill defined. Driscoll
(2002) concludes that it is this lack of definition, meaning different things to different
people, which gives the term potential. We would agree that from our experiences at
Oxford Brookes University, allowing staff to debate their interpretation of the term,
helps them develop ownership of how it is used within the institution (Sharpe et al.,
2006). Oliver and Trigwell (2005) draw different conclusions from the same observation
of the ubiquity of the term blended learning in higher education. They conclude that:
By any definition there is little merit in keeping the term ‘blended learning’ as it
is currently understood. It is either inconsistent (and so useless as a way of
understanding practice) or redundant, because it simply describes practice
within higher education more generally, and it attributes to learning something
that, in terms of what we know, only applies to teaching or instruction. (Oliver
and Trigwell, 2005, p.21)
Two of the institutions we visited similarly rejected the term as being an unhelpful
buzzword that describes longstanding learning and teaching practice, e.g.:
My own personal and fairly strongly held view, and a view that's shared by at
least some of the rest of the team, and is now being inflicted on the rest of the
university, is that it's a catch-all word that really means nothing. A condition of
teaching and learning in higher education now is that it is mediated
electronically, in all kinds of ways and it's more important to understand the
variety of ways in which that's happening than it is to have some sort of catch-
all term. (Blackwater 1)
Despite some concerns over poor definition and use of jargon, the term blended
learning has become ubiquitous. Rothery observes that most standard practices in
universities across Europe now involve a mixture of approaches (Rothery, 2004).
Providers of face-to-face training enhanced their courses with online elements
while preserving the values of the face-to-face experience; providers of distance
learning courses converged on a blended model from the other direction, offering
optional printed manuals and supplementary face-to-face workshops.
Before we go on to look at what blended learning means to people, it is important to
consider one final aspect of the use of the term. Through our study it became
evident that there is something else at work, something that introduces value-laden
criteria into the discussion. Derntl and Motschnig-Pitrik (2005) capture one of the
reasons the term is finding acceptance. In a climate of rapid change, innovation
fatigue can set in. The past ten years have seen some institutional learning and
teaching practice heavily affected by learning technology. However, as Derntl
and Motschnig-Pitrik emphasise, e-learning is still too concerned with content
delivery and transmissive models of learning. If e-learning is reified as unidirectional,
transmissive, computer-based learning, then any blend is bound to find greater
acceptance by academics, whether this is for moving away from transmissive models
of teaching or for preserving ‘pleasurable opportunities we have for face to face
contact with our students’ (Stubbs and Martin, 2003, p.8). Anything that admits
dialogic or reflexive learning practice feels more acceptable, as seen in Metroville
University’s e-Learning Strategy:
Curriculum design will incorporate e-Learning wherever it will most benefit
learners, delivering flexible and distributed learning, accommodating a broad
range of learning opportunities for students afforded by e-Learning
technologies. While the University recognises the value of delivering passive
content via the VLE, its intention is to focus increasingly on the provision of
activity-based, independent learning using e-Learning technologies. The
University acknowledges that this is likely to lead to a shift away from the
traditional regular lecture-tutorial cycle with implications for estates and
timetabling. Academic staff will take the lead in embedding e-Learning for
delivering the curriculum. (Metroville e-Learning Strategy (2005 – 08) p.3)
Such value-laden uses of terminology were apparent throughout our interviews, where
we observed the tension between front-of-stage public assertions of institutional
policy and practice and the back-stage rumblings of dissent. In contrast to the front-
of-stage documents, the back-stage discourse revealed in conversation is far more
complex and less harmonious, “I don’t want to disagree with you, but…” (Longside
4). There is evidence of conflict between the different actors involved in the practice
of education.
I told our senior officer responsible and vice principal that personally I wasn’t
particularly happy about all this terminology. We’ve tried to remove as far as
possible jargon throughout everything we’re doing because we won’t reach
academics if we use too much jargon. (Kilderhill 1)
In a recent review of the literature, Oliver and Trigwell (2005) identify three meanings
for the term blended learning:
1. the integrated combination of traditional learning with web-based online approaches;
2. the combination of media and tools employed in an e-learning environment; and
3. the combination of a number of pedagogic approaches, irrespective of learning technology use.
Oliver and Trigwell recommend that the term blended learning can be redeemed by
focusing on the variation in the experience of the learner. Implicit in their argument is
an approach that is not wholly instructor-led, but combines learner self-direction with
traditional instruction. McShane (2005) adds a temporal dimension observing that
learning technologies enable blending synchronous and asynchronous teaching and
learning. Schrittesser (2004) adds another important dimension, that of blending an
apprenticeship approach to learning with a reflexive approach to learning. Both
McShane and Schrittesser characterise the blending of pedagogical approaches.
Although we didn’t find many examples of blending pedagogical approaches (see
Chapter 3), it is along these axes that we see the potential of blended learning to
emerge as a transformational practice. It is likely that it is this approach that is likely
to be most productive in making the best use of blended learning. As Salmon says:
E-learning is in a rather extraordinary position. It was bought as a ‘tool’ and now
finds itself in the guise of a somewhat wobbly arrow of change. In practice,
changing the way thousands of teachers teach, learners learn, innovation is
promoted and sustainable change in traditional institutions is achieved across
hundreds of different disciplines is a demanding endeavor that will not be
achieved by learning technologies alone. It involves art, craft and science as
well as technology. (Salmon, 2005, p.201)
2.3 Dimensions of blended learning: towards a meaning
This survey of the research literature has provided us with definitions that embrace
the possibilities of blending along the following dimensions:
delivery - different modes (face-to-face and distance education)
technology - mixtures of (web-based) technologies
chronology - synchronous and asynchronous interventions
locus - 'authentic' work or practice-based vs. classroom-based learning
roles - multi-disciplinary or professional groupings of learners and teachers
pedagogy - different pedagogical approaches
focus - acknowledging different aims
direction - instructor-directed vs. autonomous or learner-directed learning.
The first three items in the list, the blending of delivery mode, technologies, and chronology, are consistent with the historical use of the term. Learning programmes
have been blending distance and face-to-face modes using different technologies to
allow flexible scheduling for years. These meanings are conveyed in the following
institutional descriptions:
In our corporate plan the focus is on flexibility, so it’s really flexible learning. Our
corporate plan says it’s a mix of face to face and distance. We are a
predominately campus based university, with odd pockets of truly distance
learning in various places. And we also have big franchised courses which are
delivered at a distance but supported locally. (Eastonhall 1)
Online learning has now changed its meaning from what we meant in 1999,
which was … supplemented work. Online learning now more means you're
online and you might see somebody face to face. That wasn't anywhere near
the way I'd expect it to be used six years ago. (Longside 2)
When the term is used to embrace these three aspects alone, little is being done to change business as usual. Learning and teaching remains
predominantly traditional. This is not to be critical of blended learning practices that
go no further. There is no suggestion that traditional practices will not produce
effective and satisfying learning experiences for both teachers and students.
Students appear to be overwhelmingly positive about access to online resources to
supplement traditional teaching (see Chapter 5) but the roles are not challenged. The
learner is subordinate to the learning, which is still institution and teacher centred.
This is where we find the bulk of blended e-learning practice. The VLE is used as a
repository for and a means of transmitting content. A certain amount of chronological
flexibility is introduced, allowing learners to revisit items which may have been
presented at times inconvenient to them. Discussions are released from the confines
of the classroom and may be pursued through online forums. Teachers embrace
such practices as a convenience which allows them to retain the pleasurable aspects
of their jobs and to resist what they perceive as commercial performative pressures to do
ever more with fewer resources.
The next three items in our list, locus, roles, and pedagogical approach, take blended learning further. In spite of Oliver's antipathy to corporate training, the commercial world has long held that authentic, work-based learning is highly valuable. This is not only because of the efficiency of keeping people on the job while they learn, but also because, for many learners, authentic, practice-based settings are more effective learning environments than abstracted and more theoretical classroom-based settings.
What we’re trying to do is to bring the real, broader world into the curriculum, so
that students are trying to practise and develop their skills in as close to a real
world context as possible. And we’ve got a number of projects which we’re
about to start which are about trying to create those sorts of environments
across the university. (Metroville 1)
For example, learning technology has been used to support placement-based learners (McGugan and Peacock, 2005; Allison, 2004). Authentic practice-based learning is increasingly used in medical education, such as in the new Peninsula Medical School (PMS), which has adopted problem-based learning as a core instructional strategy (PMS, 2005). Engineering subjects, where work placements, site visits and field work have always been valued, are also beginning to adopt problem-based learning (e.g. Manchester School of Engineering, 2004; University College London, 2002).
One of the things that we tend to say when we go into departments … is that
they're quite accustomed to blended learning anyway. In my case in
Engineering we've been blending lectures with workshops, with tutorials, with
laboratories, with site visits, and we've spent twenty years doing that. (Kilderhill
1)
[The course] showed that one can teach even a rather conservative laboratory
subject matter like programming in a new style. It does not depend on the
subject, but only on attitudes. In the beginning I was convinced that this is only
possible with diffuse subjects, but it turned out that instructors can even provide
enough freedom (not only through self-chosen project topics) despite the exact
nature of the requirements. (Derntl and Motschnig-Pitrik 2005)
Essential to this process is the provision of a study or learning landscape, which has
been achieved through a commercial VLE (JISC, 2005a). As roles come into the
blend we begin to see the possibility of breaking down the traditional discipline and
practice-based divisions that characterise higher education. Cross-functional teams
enable different strengths to be harnessed and the preserve of the traditional teacher
is opened up to allow learning technologists and administrators to play a more active
role in the design, development and facilitation of learning programmes.
Staff will have to think through much more critically in terms of course design
how they link through the increasing range of resources that are out there
with what goes on in the classroom setting. (Metroville 1)
It’s about blending the whole student support and curriculum areas together,
ideally into one seamless whole. (Metroville 1)
Initially blended learning was an extra line of support [in Health Care], it was
more about cost effectiveness in the Business School, and with [one of our]
national teaching fellows the approach was retention. So we had blended
learning in different ways. (Longside 4)
Teams may also allow the introduction of more than one pedagogical approach into
the blend. Some learners may thrive in a didactic situation, may want to be given all
the theory before they turn to practice, and may need or want the extrinsic motivation
of summative assessment to produce their best work. Others might prefer a more
active, problem-based approach to learning where theories are derived from or at
least better assimilated in a practical context, and learn best from peer-to-peer
formative assessment. In a blended course it might be possible for some learners to
follow a collaborative learning strategy with like-minded colleagues while others
pursue their studies as independent learners.
The last two terms in our list, focus and direction take the concept of blended
learning even further. The dimension of focus acknowledges that learners’ aims are
at least as important as the aims that the institution might have in developing learning
programmes and the dimension of direction allows that not only are the learners’
aims equal in status to institutional aims but so too might the learner be (or become)
an equal partner in the shaping of their own learning.
As learners arrive at university with ever increasing levels of sophistication in the use
of technology they may be less inclined to be inscribed into institutional practices. It is
in focus and direction that we see holistic practices beginning to be shaped, where
any learner might learn through any (or no) device at any time in any place.
So students will have access to … what classes they are taking, examination
marks, announcements, hopefully in the future through the VLE … access to
discussion forums and files that are posted online and all the usual stuff, plus
video, on any device, whether they’re sitting on campus on a campus
computer, or sitting at home on Broadband, or accessing it on the bus coming
in the morning. There will be MP3 files to listen to lectures in French before
they actually go to the lectures. (Kilderhill 1)
2.4 Three characterisations of blended e-learning
We have seen that the term blended learning is difficult to define, which may be its strength. Rather than offer another definition, we have suggested eight dimensions along which blended learning scenarios may vary. Using the review aims as a guide, we focused for the remainder of the review on scenarios which blend technology with face-to-face teaching, as the mode most representative of UK higher education. Within this still broad categorisation, which we refer to as 'blended e-learning', we noted three types of use:
2.5 The provision of online supplementary resources
First, we identified the provision of supplementary resources for learning programmes that are conducted along predominantly traditional lines, typically through institutionally supported VLEs.
The most pervasive use of technology to be adopted by HEIs in the last decade is
the VLE. Usage is also high and increasing within institutions (JISC, 2005a). A
survey of Scottish HEIs found that between 50 and 90% of courses now make use of
their institutional VLE (Ward, 2006). It is still a consistent finding that the most
common uses of institutionally supported VLEs are to supplement on-campus studies
and to do this by providing course information and access to web resources
(Bricheno, Higgison and Weedon, 2004; JISC, 2005a; Ward, 2006). Britain and Liber
called this ‘content + support’ (1999, p.9, drawing on Mason, 1998) and attributed it
to the most commonly used VLEs affording transmissive rather than constructivist or
collaborative pedagogies.
Many institutions recognise this characterisation. Eastonhall described it as ‘[the
VLE] is just basically a filing cabinet’ (Eastonhall 1) and noted that it was prevalent in
the ‘bedding in’ phase of VLE implementation (see Chapter 4). The University of
Glamorgan has called this ‘e-teaching’
where tutors teach in traditional ways through lectures and seminars, but
provide extra support to the students by placing lecture notes on the web
(Jones and Fitzgibbon, 2002, p.399).
University College Northampton calls this ‘VLE-resourced’ (Enjelvin, 2005). At Oxford
Brookes, this is referred to as Mode 1 and defined as ‘using the web to distribute
course information and carry out course administration' (Sharpe et al., 2006, p.139).
It is interesting to note that many institutions have developed their own typologies of
blended e-learning in response to its poor definition.
2.6 Transformative course level practices underpinned by
radical course designs
Second, we found a smaller but growing set of practices which are fundamentally
and radically transforming course designs. Evaluations of such courses have clear
principles underlying the changes they have introduced. Here the emphasis shifts
away from the VLE to extensive use of other technologies that are available to
enhance and perhaps even change how students study, interact and learn.
We suggest that course (re)design is a critical success factor and that transformation appears to result as much from applying principles of constructive alignment to the redesign as from the particular technological modalities of the new course (Georgetown University, 2002; Hanley, 2002; Talay-Ongan, 2003; Zemsky and Massy, 2004). In particular, ensuring assessment
strategies are constructively aligned with the learning objectives is important. For
example, Stubbs et al., (2006) in Box 2.1 below, and Boyle et al., (2003) in Box 5.2
both aligned blended learning activities with the assessment in computing courses.
Online materials included samples of code that could be assembled to form a
working model like the ones the students had to build for their individual assessment.
This alignment encouraged the use of the web materials. In the Boyle et al. example,
the course design explicitly took a spiral approach where each topic was introduced
early and revisited and elaborated on later.
Such an emphasis on design is probably not the norm for either traditional or blended
courses. E-learning requires even more planning and exposes the lack of design usually in operation (Stiles and Yorke, 2003). Yet redesign becomes more difficult when some staff are adamant that pedagogy does not need changing (e.g. University of Westington Staff Survey, 2003) and want to use the technology to reinforce or
supplement existing approaches. We have highlighted those implementations which
have clear pedagogical principles underpinning their design of activities and
assessment such as Biggs’ (2003) principles of constructive alignment, or choice of
technology appropriate to task (e.g. Laurillard’s 1993 typology).
Box 2.1
Creating a coherent blended learning experience in the Business School at
Manchester Metropolitan University
The Emerging Technologies and Issues first year module was redesigned to set
students’ expectations for university study. The course team used clear design
principles to make explicit their intended outcomes and to inform the activities and
assessment of the course.
One Design Principle was ‘the tutor as expert of last resort’. This was designed in by
allowing access to tutors only for those students who had engaged with the online
environment. This was reinforced by showing usage data for online materials during
the lectures. The students quickly grasped what was expected of them e.g. “Don’t
even bother asking – he knows you haven’t had a go yet.” Tutors noted that peer
support groups formed and that tutorials were ‘intense experiences’ of non-trivial
problems.
Another Design Principle was that students should engage regularly. It was felt that routine was important in establishing good study patterns in this first year course. The course team booked 10 hours of computer labs each week and scheduled each student for one hour per week, with a tutor on hand for half an hour of each session. This encouraged students to work on their assignments regularly.
Stubbs, Martin and Endlar (2006)
2.7 A holistic view of technology use to support learning
Third, some institutions were aware of a holistic view of technology and learning,
including the use of the learners’ own technologies (mobile phones, online
communities, instant messaging, etc) to support their learning, sometimes in
unexpected ways:
the general philosophy that our IT services has is in terms of more a managed
learning environment than a virtual learning environment -- is any device, any
place, any time, and we’re trying to work towards that. … It’s that sort of
philosophy that we have, to recognize that the whole thing will be blended
together, any device, any time, any place. (Kilderhill, 1)
Data emerging from recent learner-focused evaluations certainly supports this view. The JISC-funded LEX project has interviewed 55 students from a range of post-16
contexts about their use of technology in support of their learning. They note that
most learners do not distinguish between learning with or without technology and
offer the following quote from one of the learners interviewed.
“To me its just learning, the fact that it’s online as opposed to in a classroom is
irrelevant. It’s just another way of accessing it. It’s all just learning …. for me I
just think of it as learning and I don’t use the term [e-learning].” (Creanor et al.,
2006a, p.5)
While this third characterisation of blended learning might be for the most part
aspirational and inspirational rather than evident in institutional practice, our site visits
and interviews with practitioners in institutions where blended learning might be
considered to be embedded suggest that this holistic vision of a radically transformed higher education world is guiding their efforts.
3 Rationales for blended e-learning
3.1 Overview
The second research question was to identify the underlying rationales being used
for blended learning. Building on the three broad characterizations of blended e-
learning introduced in Chapter 2, we start with rationales for blended e-learning being
used at the institutional level. In the interviews we explicitly asked staff to describe
their institution’s rationale for adopting blended learning (see Interview Schedule,
Appendix 3). We found, as expected, that staff at each location were able to
articulate their institutional rationale and in some cases, the drivers behind them. The
themes expressed were: flexibility of provision, supporting diversity, enhancing the
campus experience, operating in a global context and efficiency. To hear five
different rationales from seven different institutions indicates a great deal of local decision making in developing policy and practice. We noted that rationales were
highly contextualised and specific to each institution and suggest this is a significant
factor in the success of these institutions as long standing implementers of blended
e-learning.
In the next section we summarise rationales reported by course evaluations of local
implementations. We were encouraged to find that most evaluations were able to
articulate their reasons for attempting a blended course redesign. A few related to
institutional strategy, particularly offering flexibility in time and place of learning. Most were responses to practical challenges faced by staff and/or to student feedback (loss of staff-student contact, large classes, inconsistency in quality and quantity of feedback between markers), as well as to the demands of professional bodies in vocational courses.
The rationale reported most frequently by local implementations was maintaining
quality in response to increasing cohort sizes. The challenges of teaching large
classes have been in the spotlight in higher education for some time, so it is of interest that staff are now applying some of the new possibilities offered by blended e-learning to this problem. Blends to tackle teaching large groups included offering
extension activities, creating more opportunities for feedback, structuring out of class
time, preparing students for practical work, promoting interactivity in class, and
creating opportunities for dialogue in small groups. Other course level rationales
which were not directly related to large classes were bridging the theory-practice gap,
developing computer literacy, and offering flexibility.
To understand blended e-learning holistically, it needs to be seen as part of learning,
so here we highlight the educational rationales for blended e-learning which most clearly express how they are grounded in a particular approach to understanding student learning. In reviewing studies which had a clear and specific educational
rationale, we used the framework provided by Mayes and de Freitas (2004) of
associative, cognitive constructivist, social constructivist and situative learning
theories.
3.2 Institutional rationales for blended learning
As explained in Chapter 1, the institutions visited were initially selected on the basis
of their reputation as relatively long standing and successful implementers of blended
learning. As well as prominence in the sector, the sample was constructed to
represent a variety of institutional agendas: a regional or metropolitan focus,
research intensive, widening participation, enhancing learning and flexibility of study.
The interviewees were asked to explain their rationales for adopting blended e-
learning. The SURF (2005) review of institutional policy choices in universities in the
Netherlands found that choice of approach is determined by the social and historical
context of the institution. Our results support this finding. The rationales expressed
were, in no particular order: flexibility of provision, supporting diversity, enhancing the
campus experience, operating in a global context, and efficiency.
3.3 Flexibility of provision
Eastonhall and Blackwater, both regional universities, stated their rationales as
flexibility of provision, within the context of recruitment and retention. Staff at
Eastonhall were explicit about their intention to provide opportunities for their
students to work from home:
It’s mainly flexibility of provision … Because student recruitment’s an issue; and
retention. We get a lot of students … who are working. So although they are
nominally full time here they’re actually part time. And we have a high
percentage of ethnic minority students who live at home. I think this year was
the first time our first year undergraduate recruitment was 53% from the local
region. (Eastonhall 1)
The context of these institutions as being in, belonging to, even serving a region, is
important. In reading Eastonhall’s e-strategy framework, there is an emphasis on
flexibility of access with a strong campus focus. The intention is that this is achieved
through the use of mobile and wireless technologies to improve access. There is a
vision of placing the University at the centre of the city and region and of supporting
home based learners within the region. In terms of how they are implementing this
vision, Eastonhall gave the examples of large interdisciplinary nursing courses and
the foundation engineering programme where flexibility is essential as students are
on campus infrequently and at different times.
Blackwater has a long history of offering flexible learning opportunities to students in
the region. The agenda is the same, and blended learning is being used as one of
the ways to appeal to today's students, along with, for example, keeping university
facilities open for longer hours. In practical terms, the current e-learning strategy is
integrated with the flexible learning agenda and has been approved by the flexible
learning committee.
3.4 Supporting diversity
Metroville gave as their rationale for blended learning the need to support a diverse student body with high proportions of mature students, home students from ethnic minorities and international students. The challenge for Metroville is to provide
support systems which cater for their students’ diverse needs:
Given that sort of student background, how do you provide a structure which
can cater for individual needs? How do you provide individualised learning in
the context of massification of higher education and a diverse student body?
We see the blended approach as being one route in, because people ought to
be able to find, if they’ve got the right skill set, to find a route through to finding
their own support. (Metroville 1)
3.5 Enhancing the campus experience
Longside is now focusing on using technology to enhance existing teaching and
learning practice. In their e-learning strategy, they state that:
In its sustaining role it enables us to enhance and support what is already being
done with existing technologies. For example we can add communication
possibilities, distribute teaching materials more easily, track student
performance in more detail and so on. Sustaining technologies have the ability
to improve existing product and service offerings along the mainstream
dimensions of performance that customers value
(University of Longside e-Learning Strategy, 2005, p. 5)
Similarly Deepshire responded to interview questions about their rationale by
referring to enhancing the existing experience of students and gave some examples
of discipline based projects such as supporting nurses at geographically dispersed
placements and using e-portfolios to help students manage the evidence created
through practice placements.
Kilderhill also gave enhancement as their primary rationale. Here the focus is clearly
on enhancing the experience of students while on campus. Kilderhill has done this
explicitly through the development of classrooms on a city-based campus which has little opportunity for physical expansion. They have designed technology-rich classrooms for different teaching and learning activities, e.g.:
The first kind of classroom environment we built was what you would call an
interactive classroom, which has group seating and electronic voting systems in
it and all the other stuff you’d expect in a smart classroom. But a lot of that for
example is to do with teaching styles, because it’s all group work, it’s all active
collaborative learning, even in a large classroom. (Kilderhill 1)
3.6 Operating in a global context
Both Metroville and Eastonhall referred to the wider context of being an institution operating globally. Metroville did so in two ways. First, there is the
notion of marketing courses globally. Both Metroville and Deepshire gave examples
of new fully online distance programmes being developed to access small, global,
niche markets.
First there is this notion that the rest of the sector is moving this way and you
need to be there if you’re going to market courses globally. Which is
increasingly what you’re doing; you need to be competitive, you need to be
seen to be competitive. So there’s that dimension to what we’re trying to do.
(Metroville 1)
Second, Metroville see a future role in developing global citizens with e-literacy skills.
To be competitive they consider that they need to produce graduates ‘who are
capable of functioning in an e-age’:
We would expect our students to be able to have the skills to be able to find
their route through to manage their own learning; to be able to work face to
face, with what goes on in a classroom setting, however we define that, and
also find support online and be able to engage online with the increasing range
of resources that are out there. (Metroville 1)
Eastonhall also expressed competitive advantage as a major driver in their e-strategy
framework document: ‘If we don't provide and support facilities and resources such as
these, other institutions will do so and we will inevitably lose market share.’
3.7 Efficiency
Although none of the interviewees gave financial savings as a primary rationale,
there are clearly continuing pressures on higher education staff to work efficiently.
Kilderhill had explored how adopting blended learning on a programme would impact
on staff time. They talked of freeing up staff time, or at least making it more flexible:
We did a cost benefit analysis and if you do this, if you teach in studios, if you
do active collaborative learning in class, if you properly blend in the online
component of that, it can release staff members’ time. Or to put it a different
way, it allows them to have more control over the time. They don’t have to turn
up at a certain lecture theatre at nine o’clock every Friday morning, so they can
manage their time in different ways. (Kilderhill 1)
Longside described how early developments in the School of Business had been
‘economically driven up to a point’. The school explored how they could
deploy the technology in a way to help us square the incessant pressure of the
time, driven by HEFCE and others, of more and more students and fewer and
fewer staff. (Longside 1)
3.8 Course rationales for blended e-learning
In this phase of the review we were interested to see how the promotion of blended
learning occurring nationally and internationally in educational circles, and being
seen in some institutional strategies, is being interpreted by higher education staff
who are making decisions about how best to teach their students.
Building on the findings from Chapter 2 of a broad characterisation of blended e-
learning around transformative course level practices underpinned by radical course
designs, we collected course evaluations from the published literature and received a number of unpublished internal evaluations through our interviews, calls on email lists and the research team's professional networks. Given the rhetoric around technology and the often repeated criticism of implementing technology for technology's sake, we were reassured by how easily we found course evaluations with
clearly expressed rationales. As explained in Chapter 1, this review aims to be
grounded in practice and to make visible good examples. Here we summarise the
rationales being used and highlight some of those key studies which met our
inclusion criteria.
3.9 Blended course designs for large group teaching
It was clear that higher education staff are developing a creative range of blended
course designs to tackle problems created by large group sizes. For example,
Dickinson (2005), in an internal review of an accountancy module at Newcastle
Business School, noted that as the cohort size approached 600, weekly seminars
had been reduced to fortnightly meetings. The course team then made use of the
VLE to supplement the remaining class time, including the presentation of weekly
discussion questions with discussion forums available and timed release of answers.
The intention was to help students structure and pace their study between the
seminars.
Other responses to teaching large groups have been developing learning objects for
difficult topics (Boyle et al., 2003), offering extension activities for some students
(Oliver, 2006), creating additional opportunities for feedback (Catley, 2004),
preparing students for practical work (Davies et al., 2005), promoting interactivity in
class (Boyle and Nicol, 2003), and creating opportunities for dialogue in small groups
(Condron, 2001).
Box 3.1
Preparing for clinical placements in the Physiotherapy degree at the University
of Birmingham.
The course was redesigned in response to an almost doubling of student numbers
(from 48 to 88) over two years which led to difficulties giving students access to
patients. To support the development of observational skills with limited access to
patients, the neurology module has introduced video clips of patients in combination
with traditional group based and practical classroom sessions. Over three successive
years of course delivery the blend has been refined so that now the video clips are
presented through WebCT and available on CD-ROM. Observational skills are
assessed by multiple choice and short answers presented through WebCT.
The evaluations used an end of module questionnaire and focus groups to elicit
student feedback. It is reported that 72% of the survey respondents thought that the use of computer-based material had been useful in developing their observational skills and 83% thought their IT skills had improved as a result. In focus groups,
students identified the benefits of the video resources as visualisation (seeing real
patients and their movements in a real clinical setting) and flexibility (having CD-
ROMs to play at home or at the university in their own time). They raised concerns
about the delivery of some of the summative assessment via computers, suggesting
that those who were not proficient in using computers or had slow typing speeds
were disadvantaged.
The authors conclude that the project has added value to the students' experiences
by offering a ‘more student-centred, interactive, and flexible approach to learning
whereby students were able to practise their skills frequently, in their own time, and
in greater detail than they would have done by traditional methods alone where
patient contact is limited.’ (p. 847-848).
Davies, Ramsay, Lindfield & Couperthwaite (2005)
3.10 Engaging students out of class
A second grouping is of those course evaluations which describe course designs aiming to support students during periods of little staff-student contact. The reasons that students might not have easy access to staff are varied, and are location and discipline specific (they include the large classes discussed previously).
Clouder and Deepwell (2004) report on the findings of an initial phase of an action
research project where online discussion boards were provided for physiotherapy
students while they were away from Coventry University on placements. Whilst on
placement the students are expected to conduct critical incident analyses and write
these up for inclusion in their portfolio. The discussion boards were created for
students to share their incidents and experiences during the placement period which
is often characterised by rapid personal and professional development. As in many
other discussion based implementations, the students and the course team found the
discussion more challenging than they had expected. The clear rationale remains one of
providing peer support and fostering reflective capabilities in these student
healthcare professionals.
In quite a different kind of blend, Clarke et al. (2004) describe a small pilot study at the University of Oxford, where there is often a long period of time without student-staff contact between the end of teaching and the exam period. This study created
sets of multiple choice questions to supplement first year undergraduate
management lectures. The aim was to provide additional ways for students to learn
and judge their progress during such periods of low staff-student contact time.
Even where there are no prolonged periods without staff contact, some courses have aimed to better support and structure what students do outside of class time. Enjelvin (2005) reports on how the French Department at the University of Northampton
have been progressively developing their use of technology to provide interactive
tasks for students to use outside of class time. This use was driven by student
feedback from the normal quality assurance processes where students were
requesting more opportunities for reinforcement activities that could provide
immediate feedback. Over the years these have progressed from being entirely
optional to forming part of the summative assessment. Students are expected to
complete all the weekly tasks and the best five grades count towards the overall
module result. A survey of student perceptions reported generally positive responses
from students about the use of these interactive tasks: 74% of students agreed that the tasks are a good idea because they can spread their workload, and 87% agreed that they were useful to follow up/reinforce class activities. Interestingly for the discussion of the nature of the blend, 73% agreed that, as a result of the integration of web-based tasks, more interactive activities take place in class. Finally, 91% agreed that
having the formally assessed tasks had improved their overall coursework grade for
this module.
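To make the 'best five grades' marking rule described above concrete, a minimal sketch in Python is given below. It is our own illustration rather than code from Enjelvin (2005); the grade values and the 30% module weighting are hypothetical.

def best_five_contribution(weekly_grades, weight=0.3):
    # Illustrative only: the best five weekly task grades (0-100) count
    # towards the module result; the 30% weighting is a hypothetical figure,
    # not taken from the study.
    best_five = sorted(weekly_grades, reverse=True)[:5]
    task_average = sum(best_five) / len(best_five)
    return task_average * weight

# A hypothetical student with ten weekly task grades
print(best_five_contribution([55, 70, 0, 80, 65, 90, 40, 75, 60, 85]))  # 24.0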
3.11 Developing professional skills
There were examples of blended course redesigns in nursing programmes, prompted by the disciplinary body's emphasis on inter-professional learning, at Oxford Brookes University (Sharpe et al., 2006), Eastonhall and Deepshire.
There are some disciplines which require use of IT as a competency and this has
been a driver for a redesign (Molesworth, 2004). The Business School at Oxford
Brookes University have identified online collaborative work as a key skill for modern
business practice and so have introduced a collaborative ‘virtual task’ into the Team
Challenges module. Evaluation using focus groups to collect student feedback is
showing that students are highly satisfied with the appropriateness and relevance of
this activity, and the module leader reports a higher standard of reflection on team
theory (Sharpe et al., 2006).
Box 3.2
Using Excel tasks to develop computer literacy in accountancy at the
University of Limerick
Once qualified, professional accountants are required to demonstrate their
competency in accountancy related IT skills. This was one of the drivers for the
development of a blended ‘Principles of Accounting’ module for 600 students at the
University of Limerick, Ireland.
The weekly lectures and twice weekly tutorials (held in a computer lab) have been
supplemented by online course content, quizzes and Excel-based tasks. The Excel tasks are optional but provide up to 10% of the module mark if completed. Three
quizzes are presented for one week each during the semester.
Student feedback has been collected over two years from web server logs, a survey
of student views and focus groups. The analysis draws out a number of emerging
themes important to the student experience including the important influence of the
peer group, the marking scheme, study patterns and technical support. Of relevance
here is the finding that students who held negative attitudes towards computers were
least likely to complete the optional computer based tasks. In terms of developing
professional competencies, it is important to support these students to locate and use
online resources.
Concannon, Flynn and Campbell (2005)
In Australia, Ellem and McLaughlin (2005) give a different reason for wanting
students to develop their IT skills. Rather than professional competency, they
required students to submit all assessed work electronically to the Turnitin plagiarism
detection software. Along with the mention of large classes, this was the driver for the
course team to undertake a major course redesign to supplement face to face
teaching with online resources, discussions and quizzes.
3.12 Educational rationales for blended e-learning
When we try to pin down the meaning of any modification of the term ‘learning’ such
as e-learning, blended learning, distance learning or work-based learning, we will
ultimately have to address what is understood by learning. If learning is the process
of acquiring new knowledge, skills and awareness and taking on new habits, we still
have to ask how that new knowledge, skills and awareness is acquired. Mayes and
de Freitas (2004) organised learning theories that have impacted on e-learning
developments into broad groupings: associative, constructivist (individual/cognitive or
social) and situative (drawing on Greeno, Collins and Resnick, 1996). We have drawn on a useful summary of these models and their implications for teaching and assessment provided by the JISC Effective Practice with e-Learning Guide and Workshops (JISC, 2005b).
The Tavistock Institute’s review of pedagogic research in post-compulsory education
observed that ‘priority should be given to understanding and unpacking the
pedagogic models underpinning the design of learning VLEs; unpacking the ‘grand
visions’ and ‘cultural logics’ that shape design’ (Cullen et al., 2002, p.16). We
identified course evaluations that were clear that they aimed to improve learning and
were able to explain how the expected learning occurred in relation to educational
theory. These are discussed below using the framework from Mayes and de Freitas.
3.13 Blends to promote associative learning
In associative models of learning, it is understood that people learn by association,
initially through basic stimulus-response conditioning, later through the capacity to
associate concepts in a chain of reasoning, or to associate steps in a chain of activity
to build a composite skill. In some learners, associativity leads to accuracy of
reproduction or recall. Here, improvements in learning outcomes would be expected
where there is a high component of acquisition of information and factual recall
required.
Rationales to enhance learning through associative learning often start with the
recognition that there are problem areas either in students’ achievement in specific
parts of the course (Boyle et al., 2003; Boyle, 2005) or overall course pass levels
(Catley, 2004; Morris and Walker, 2006). Each of these studies developed learning
resources and/or assessments and reported significant improvements in student performance, as in the case study reported in Box 3.3.
Box 3.3
Introduction of computer aided assessments into an introductory chemistry
course at the University of Dundee
The course team recognised that there was a problem in the course, with a 73% overall pass rate and 'weaker students failing to grasp some of the basics of the
subject matter and only scraping a pass mark’ (p.1). The students had complained
about the time lag for feedback on the 8 practical reports and the inconsistency in
quality of feedback and grading between the 10 markers.
The course was redesigned to include both high and low stakes computer aided
assessments. The low stakes assessments were made available for a week,
students were allowed unlimited attempts and their best mark was recorded. High
stakes assessments were unseen and conducted under examination invigilation
conditions in computer labs.
The pass rate improved to 93% and student feedback and analysis of logs identified
the low stakes assessments as being critical. Students completed each of the 5 low
stakes assessments on average three or four times and received instant feedback
which provided clues to the answer, but not the actual answer.
Student feedback was extremely positive and students identified the multiple
attempts with feedback as highly motivating and helpful:
“The ability to re-do tests and assignments again once you have already done it is
good. It enables you to continue learning the more you try it.” (p.5)
Out of a class size of 191, 91 responses to a feedback questionnaire were received.
99% of those who responded said the online tutorials improved their understanding
of the topics covered and 93% felt that the marks awarded for these were
appropriate. 90% of respondents said that the online assessments for the practicals
improved their understanding and 99% felt the marks awarded were appropriate.
Morris and Walker (2006)
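As an illustration of the low stakes marking rule described in Box 3.3, the short Python sketch below records the best mark achieved across a student's repeated attempts at each assessment. It is our own sketch, not code from the Dundee course, and the attempt histories shown are hypothetical.

def best_marks(attempts_by_assessment):
    # Illustrative only: for each low stakes assessment, keep the best mark
    # achieved across the student's (unlimited) attempts.
    return {name: max(marks) for name, marks in attempts_by_assessment.items()}

# Hypothetical attempt histories for the five low stakes assessments
attempts = {
    "assessment1": [40, 55, 70],
    "assessment2": [60, 65],
    "assessment3": [30, 50, 75, 80],
    "assessment4": [55],
    "assessment5": [45, 70, 90],
}
print(best_marks(attempts))  # best recorded mark per assessment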
3.14 Blends to promote constructivist learning
In constructivist models of learning, it is understood that people learn by active
construction of ideas and building of skills, through exploration, experimentation,
receiving feedback, and adapting themselves accordingly. Students will be engaged
in activities which focus on real world, authentic tasks and require collaboration with
their peers. Learning constructively leads to the integration of concepts and skills into
the learner’s existing conceptual or competency structures. In models of social
constructivism, other people and groups learn with the support of dialogue and in the
process of collaborative activity. Improvements in learning outcomes would be
demonstrated by more sophisticated understanding of complex issues or
demonstration of higher order thinking.
Recent reviews of the pedagogic literature in the UK and Australia both confirm that
constructivism is the dominant model of learning influencing school and post-
compulsory education (Cullen et al., 2002; Eklund et al., 2003). As Cullen et al. note in their review, 'many practices are tied to a "grand learning theory"' (p.16). We found
frequent mention of constructivism underpinning the course design, but few explicit
articulations of what this meant in practice.
Many of the earliest e-learning environments and activities involved collaborative group learning in asynchronous text-based discussion environments designed around principles of social constructivism (see for example Mason and Kaye, 1989).
In this model students are expected to create their own meaning from a variety of
different perspectives through participation in collaborative tasks. More recently,
Salmon’s widely adopted five stage model of computer mediated communications
describes a sequence of stages learners progress through, the fourth being
‘knowledge construction’. Early work initially reported positive results, although as
discussed in Chapter 5, more recent attempts to promote online communications and
collaborations with undergraduate students are not as consistent (see Box 3.4).
Given the inconsistent findings for the success of collaborative groupwork online, it
seems important that we rehearse here the principles of constructivist design for
learning.
Box 3.4
Discussion boards to support group project work in a final year module in
nursing management at the University of the West of England
Hughes and Daykin (2002) used the rationale of moving towards a ‘student-centred
constructivist learning approach’ to introduce discussion boards to support group
project work. The number of face-to-face sessions was reduced to accommodate the
online work. The evaluation drew on a content analysis of the messages posted and
two focus groups which sampled students who had read and posted to varying
degrees. The interactions were analysed by Salmon’s stages. There was no
evidence of knowledge construction as students were reluctant to criticise each
other’s work online. The important finding of this study in the context of this
discussion of rationale, is that Hughes and Daykin conclude by suggesting that online
tutors need more preparation for the facilitation skills required to scaffold the
knowledge construction phase.
Constructivism is a cognitive model based on an understanding of how human
attention, memory and language operate. One of the well-established findings is the
importance of relating incoming information to existing knowledge structures or
‘schema’. That is, learners are influenced by what they already know. Therefore,
constructivist pedagogies recommend that new information is carefully sequenced
and that learners are supported or 'scaffolded' by expert tutors and environments that present new material and questions at the appropriate time.
Clark and James (2005) present a coherent rationale for their blended design based
on principles of ‘guided construction’. They describe the redesign of an introductory
soil science module at the University of South Australia. Here weekly online readings
with question prompts replaced the course textbook. There were two lectures every
week. The first lecture was of a traditional type, at the end of which that week’s
course readings and questions were released. Students were expected to work on
their answers before the next lecture which was run in a question and answer format
with no new information presented. Students were encouraged to use the online
discussion forum to work collaboratively on their answers, although few did. Student
feedback on the use of questions was overwhelmingly positive, with 63 of the 65 responses to the end-of-module questionnaire agreeing or strongly agreeing with 'the
online readings and associated questions helped my learning’. In focus groups it
emerged that students had some difficulties accessing course readings, but found
the questions very useful e.g. “If you did the study questions, the class provided
immediate feedback to see if you are on the right track.”
3.15 Blends to promote situative learning
In situative models of learning, it is understood that people learn through participation
in communities of practice, progressing from novice to expert through observation,
reflection, mentorship, and legitimate peripheral participation in community activities.
Situativity leads to the development of habits, values, identities and skills that are
relevant to and supported by that community. Improvements in learning outcomes
would be demonstrated by improvement in learning the skills and knowledge of a
particular discipline in the culture of its use in a working organisation.
There are examples of courses with a clear rationale to develop the skills, attitudes
and behaviours of practitioners in the subject’s profession. The professional skills
might be quite specific to the discipline, including developing differential diagnosis in
veterinary science (Ellis, Marcus and Taylor, 2005), writing guidelines for users in
computing (Oliver, 2006), negotiating and bargaining in world trade economics (Carr
et al., 2004) and legal skills in a personal injury claim negotiation scenario (Maharg,
2001). This model of learning is particularly suitable to postgraduate and professional
development work-based programmes, most of which fall outside the scope of this
review of undergraduate teaching.
In undergraduate teaching, Ellis et al. (2005) and Ziegenmeyer and Kupetz (2005)
both used case studies to create opportunities for their students to behave as novice
practitioners. Ellis et al. investigated the experience of veterinary science students in an Australian university. The course blended face-to-face lectures, tutorials and
laboratory sessions with detailed cases presented as online resources (e.g. history,
laboratory test results, images, relevant research papers). This case based learning
is described as ‘learning through being situated in an authentic context that requires
the learner to acquire skills and knowledge in order to solve a problem or manipulate
a solution’ (p. 240). Students accessed the case based resources via their VLE and
developed lists of differential diagnoses for further investigation with diagnostic tests,
requiring them to think and act like practitioners. Ziegenmeyer and Kupetz also used
case studies, here developing hypertext multimedia-based studies to support student teachers in linking the theoretical and practical aspects of their learning experiences and gaining the "experience of acting as a teacher in an authentic teaching
context” (p. 179).
4 Evaluating and monitoring blended e-learning
4.1 Overview
The third research question was to identify the monitoring and evaluation strategies
being adopted to ensure and enhance the quality of blended learning. Our approach
throughout this review is practice-focused, so we limited this section to reviewing examples of approaches to evaluating technology in use in blended contexts.
Following this approach, we build on the three characterisations of blended e-
learning in use described in Chapter 2, reviewing institutional, course and then
holistic evaluations. In each section we have drawn together our findings from the
literature and the transcripts and documents obtained from the institutional visits.
All seven institutions we visited described current plans for initiating institutional
monitoring and evaluation strategies to assess their students' experiences. All were finding it problematic to establish institutional-level practices. We suggest this was due
to the pressure to implement rather than evaluate, the low status of pedagogic
research, and poorly defined measures of institutional success in embedding blended
e-learning. All institutions welcomed the opportunity to share approaches both
through this review and the Academy’s Benchmarking e-Learning project (HEA,
2006). In response to the requests for support around evaluation, we have taken the
opportunity to highlight examples of suitable approaches and techniques and make
recommendations for those wanting to undertake their own evaluations.
Despite the difficulties around establishing institutional level monitoring and
evaluation strategies, we identified and described a number of effective practices at various levels of operation in universities: regular module evaluations being used to
inform departmental action plans, triangulated evaluations of students’ VLE use and
institutional support for the collection and dissemination of case studies. We also
suggest the promotion of pedagogic research both within institutions and for
assessing the impact of course redesigns. We draw on the pedagogic literature to
make recommendations about conducting such research: triangulation of data,
collection of rich data and planning longitudinal and ethnographic studies. Finally, we
note the importance of making use of the findings of evaluations in course planning
and redesign and note examples where this had been achieved.
4.2 Approaches to institutional evaluations
It is clear both from the literature and from our institutional visits that systematic,
institutional evaluation of blended e-learning is problematic. Bricheno et al. (2004, p.7), in a major review of the impact of large-scale networked learning in HE and FE institutions, say that 'the evaluation of staff and student experiences appears to be an
area that institutions are just beginning to address and one that would benefit from
further investigation’. There are several reasons for this. Where a central e-learning
unit is expected to monitor and evaluate, the pressure is to maintain and deliver
services rather than to judge their effectiveness, i.e. to implement rather than
evaluate. Another factor is the relatively low status of pedagogic research in higher
education, which means that for many institutions measuring anything other than
some level of e-learning ‘activity’ is a low priority.
As this review was underway, the Academy also funded 12 institutions to pilot five
different methodologies and toolsets for benchmarking e-learning. Preliminary
findings from the e-Benchmarking project suggest that even with institutional support
for conducting evaluations, the measures of institutional success in blended e-
learning are often not clearly defined and relevant data may not be available. “One of
the key lessons from the pilot phase of the exercise is that many institutions just did
not have the data to hand and so far from e-benchmarking being a collation exercise
it required considerable data generation effort.” (Morrison, 2006).
We found that institutional evaluation largely relies on gathering data that comes
easily to hand. In the main these are things like VLE usage data, evaluations of pilot or pump-priming projects that include dissemination outcomes as a condition of
funding, and small case studies by enthusiastic innovators of e-learning. While all of
these methods provide useful insights into aspects of blended e-learning, they are
rarely drawn together in such a way that they can answer the question, ‘what is the
impact of blended learning on the student experience in this institution?’ Indeed the
e-Benchmarking project is increasingly referring to students' experiences rather than 'the student experience', in recognition that students will experience the same
institutional context and programme in different ways.
All seven of the institutions that we visited described current plans for initiating
institutional evaluation research. They all perceived gaps in their current evaluation
methods and often these concerned understanding the student experience. As one
participant said to us:
One of the things that’s been seriously lacking over the last five years at least,
has been a lack of interest in evaluation, proper evaluation, not lip service
evaluation like the student satisfaction surveys (Blackwater 2).
At Longside, despite widely disseminated case studies and publications about e-
learning going back at least five years,
We don't have any systematic, institution-wide, sufficiently detailed research
into the student experience in my view (Longside 1).
With these perceptions in mind and a generalised concern about scarce resources
for conducting evaluation work, it is not surprising that several of the institutions we
visited welcomed the Academy’s forthcoming e-Benchmarking project as providing
vital assistance with institutional evaluation.
4.3 Institutional evaluations in the bedding in phase
The common experience following the introduction of an institutional VLE is an initial
period in which the dominant strategy is encouraging staff and students to use it. This
‘bedding in period' typically seems to last three to five years. Salmon (2005, p. 208)
notes that ‘most HEIs are still struggling to engage a significant percentage of
students and staff in e-learning’ and most institutions within the sector are still in this
‘bedding in’ stage.
Because of the way we selected institutions to visit, our institutions were beyond this
phase, but could reflect back on it. The institutions we visited described a variety of
evaluation methods that they used during their ‘bedding in phase’. At one end are
highly formalised processes such as external audits. Blackwater, for example,
mentioned an audit of the management of their institutional VLE conducted by a
prominent accounting firm. Such external audits of institutional VLEs are increasingly
common, being mandated by the business processes of universities.
I think it is that notion of mission critical. People now regard it in terms of things
like the risk assessment, you know, which systems are considered to be the
most important, from a risk assessment point of view. The VLE is now well up
the list, because of the amount of stick we get if things fall over. (Eastonhall 1)
Such business-style audits are ‘foreign’ to, and distant from, the academics charged with
managing the incorporation of learning technology into the daily learning and
teaching processes of their institutions. It is not clear whether or how they are used to
shape forward development of blended e-learning within their institutions.
At the other end are quite informal techniques that are closely associated with
institutional change management processes. Alvesson (2002) uses the term
‘symbolic effectiveness’ to emphasise the importance of how change processes are
perceived by institutional members. Symbolic effectiveness is likely to be measured
by things like attendance at staff development events, the tenor of discussions at
internal seminars and debates, the dominant positions taken in internal discussion
papers and so on. For example, a Kilderhill representative said:
It’s really just all down to activities; it’s just individuals talking about it, it’s the
institution talking about it, it’s things going on, it’s events going on. And the
success factor is that people get involved in that. As I said earlier, [our] one day
awareness event on e-learning had a hundred and fifty people at it. And that’s a
lot of people for Kilderhill. So a measure of success is that people came, and
they were engaged for the whole day. (Kilderhill 1)
Another informal measure is the absence of evidence of harm.
Blackwater 2: We also don’t have any real evidence of not effectiveness.…
Blackwater 1: We’re not seeing students suddenly achieving hugely better,
we’re not seeing students suddenly achieving worse, we are
seeing a lot of change in the day to day practice.
Between the two extremes mentioned above — informal indicators that show that
something is going on and high level business audits — sit a range of formal
evaluation techniques. Almost all of our visit institutions had found ways to embed
regular institutional-scale student satisfaction and/or VLE use surveys into their
practice.
For example, Longside, Deepshire, and Westington administer a survey mirroring the
National Student Survey to their final year students but with an additional question
seeking a rating of the impact of the VLE on their learning. At Deepshire the item has
two parts. The first part — ‘did you use [the VLE] in your studies?’ — separates
respondents into those who can and cannot validly respond to the main question,
a five-point Likert item: ‘did you find it useful to your studies?’ There was
dissatisfaction with the limitations of this approach. In at least one case this was
explicit — it was ‘a very poor question as it turned out’ (Westington 1) — and in the
others it was implied in discussions about how little can be learned from a single
item.
At Longside a variety of surveys are used at various stages of the undergraduate
experience. We were told that
The biggest one is the induction week cycle to get them involved and engaged
in it and we get them in the system right from the very beginning. (Longside 2)
This is followed up at Longside with VLE satisfaction surveys that occur during each
academic year. Metroville and Westington also use VLE satisfaction surveys. The
VLE itself seems rarely used to administer these surveys, usually because of the
poor rate of return when this was tried. At Metroville ‘where we've tried in the past the
rate of returns would be even less on online surveys than it was in questionnaires
handed out in class, so that’s the worry.’ Students there are emailed a link to an
online survey hosted at an external site.
As discussed in Chapter 5, students do value the access to course materials that
VLEs provide. Moreover, at an institutional level the surveys themselves convey
meaningful information about the institution’s progress with embedding blended e-
learning, especially where they have been conducted regularly over several years. At
Westington they are used to inform departmental action plans. The case study in Box
4.1 illustrates a range of ways that the University evaluates blended e-learning in
order to inform future development.
Box 4.1
Systematic module-level feedback supplemented with large-scale surveys
The University of Westington uses a variety of questionnaires at different scales with
different foci.
Systematic background data …
Regular module evaluations are used in a systematic way to inform departmental
action plans. Module feedback questionnaires include items on the use of the
institutional VLE. These module questionnaires are centrally analysed and the results
fed back to heads of department and programme leads. Programme tutors need to
refer to the results in their annual evaluation reports. The cycle is completed when
these evaluation reports inform departmental and programme action plans.
… punctuated by large student and staff surveys
Along with this regular, systematic collection and reporting of VLE use at the module
and programme level, Westington has performed large, triangulated studies of both
staff and student experiences and attitudes to the VLE. These provide institutional
level data that are used to inform development work on the VLE and approaches to
providing staff and student support. They are seldom conducted more frequently than
triennially and correspond with major shifts in strategic planning and development.
The student survey involved questionnaires and focus groups in which students’
usage patterns and satisfaction ratings were gathered. They surveyed approximately
10 percent of first and final year undergraduate students, approximately one
thousand learners. This was delivered on Optical Mark Reader (OMR) sheets at the
end of lectures to maximise returns. The questionnaire data was complemented by
focus groups of students representing all the faculties.
This was preceded by a survey of staff: a substantial six-month effort,
conducted by a project team of five members and two research assistants and
producing a 61-page internal report (University of Westington 2003). Questionnaires
were sent to all (over 800) academic contract staff and follow up interviews were
conducted with over 30 staff categorised as VLE users, non-users and faculty
champions. It gave the institution a very detailed snapshot of how the VLE was being
used, including distribution of use by various classifications of staff (e.g. teachers,
managers, researchers, by faculty and department) as well as the VLE tools and
facilities that tended to predominate in each staff category. Not surprisingly, it found
that the VLE was used at that time primarily as ‘an information source and
administrative tool’ (University of Westington 2003, p 3).
This staff survey revealed a variety of important issues that planners needed to
attend to. For example, there was widespread concern about inequity of student
access to IT, especially for mature, off-campus and non-standard entry students. A
variety of staff development and technical needs emerged, including too-frequent
upgrades to the VLE, a need for hands-on support for staff users and disciplinary
differences in perceptions of the value of the VLE.
Perhaps one of the most important findings was a need to ‘commission some follow-
up research into [Westington] students’ views and experiences’ (University of
Westington 2003, p 4). As a result a survey of new students that elicited their initial
views on the VLE was conducted in 2005.
4.4 Institutional evaluations of VLEs in use
Development within the institution can often be marked by changes in the survey
instruments themselves or how they are used. The institutions that we visited, having
already spent three or more years promoting the use of their VLEs, all saw
themselves as entering a new, more mature phase of embedding blended e-learning.
Metroville, for example, articulates the goal of using its VLE as a learning space
rather than a repository. Asked how they were monitoring progress towards this, the
response was:
We aren’t at the moment. And this is one of the things which will come in the
next iteration of the plan. At the moment the pressure and the emphasis has
just been getting staff familiar with it by and large, getting them comfortable with
it, exploring the notion of just how it might be used. (Metroville 1)
One element of Metroville’s new approach is to adapt analysis of the VLE survey to
try to identify module tutors who are using the VLE ‘in an exciting and innovative way'
(Metroville 1).
At Blackwater, a substantial research effort into the student experience, involving
questionnaires and focus groups and also investigating staff perspectives, is planned
for this year. Underlying this effort, and similar plans in other institutions, is a
perception that, having achieved initial 'penetration' of the VLE into institutional life,
the next period involves more sophisticated uses of educational technologies and
more complex monitoring and evaluation processes.
In a very real sense the purpose of the big evaluation this year is to set a
baseline, and say alright, we’re five years into this, four or five years into this,
we’ve got reasonably deep penetration across the university in terms of breadth
of usage of some kind… but in terms of having real evidence of real
effectiveness, the answer is probably not. (Blackwater 1)
Westington is exploring a variety of ideas for gathering more detailed information
about the student experience. For example, ‘things like video diaries and trying to get
alternative approaches. To give us another view on how students react’ (Westington
1). They are also aiming to capture more sophisticated evidence about staff use of
the VLE:
There were no tools that could allow you to analyse the detail of how someone
has used the [VLE]. We now have agreement with our Vice-Chancellor and the
students union that we can start doing that…. We’re looking at ways we can
start getting data on the exact use of the functionality in [the VLE]. How are staff
using it for learning and teaching? (Westington 1)
Deepshire is characterised by ‘the extent to which the schools basically run their own
affairs. I mean we’re really eight universities’ (Deepshire 3). Therefore they have
attempted to address evaluation at the school level:
There have been particular studies that some schools have undertaken to look
at courses that use e-learning. Three schools are involved in that at the
moment, and we’re also planning . . . to do a questionnaire for schools that
don’t have the time maybe to undertake a full evaluation as these three have
done. To at least have a questionnaire that they can use that will help process
and feed the details back to the school [and] will inform us as the institution
what’s happening, but also will inform the individual module leaders about how
their course went and what the students thought about it. (Deepshire 2)
Satisfaction surveys tend not to reveal ‘how’ or ‘with what’ a given application helped
students to learn. They offer little or no insight into changes in the student learning
experience. Of the surveys we saw institutions using, the most detailed could
differentiate between students on the basis of module enrolment and, occasionally,
attributes such as gender, but they did not differentiate between students with diverse
needs and preferences, e.g. disability, age, prior educational experience or work
status. Yet a variety of studies show that these are meaningful and important
variables affecting the impact of technologies on student learning (see Sharpe et al.,
2005 for a review).
A further problem with satisfaction surveys is that they can carry the implication that
the service might be withdrawn if responses are strongly negative. Few students
would want that, whether or not the VLE was important to them.
Triangulation — using another method or source of data to confirm or refute
observations or analysis from a single source — is a problem for all the institutions
we visited. At the least, there are practical difficulties in drawing together, comparing
and contrasting the variety of data that are collected in the normal routines.
This appears to be due to a lack of resources dedicated to institutional research into
blended e-learning. We were told a variety of stories that illustrate this. For example:
I did a very quick and dirty bit of research myself on some of the data — and it
was quick and dirty which is why I've not used it and published it, I just did it for
my own interest — but what I tried to do was look at the correspondence
between subject areas, courses and group and student satisfaction with what
was labelled [the VLE] rather than e-learning…
It did produce a kind of module by module result, which is good, that where
there was more use students tended to be more satisfied…. And I think on a
triangulation basis I suppose, although every bit of individual evidence doesn't
stand up really to a great deal of scrutiny, if you've got enough separate ones of
them then it does begin to look like a duck at some point. (Longside 1)
Ellaway, Dewhurst and McLeod (2004) argue for a holistic and/or situated approach
to evaluating VLEs in use where ‘the question which should be asked about a VLE is
not ‘what can it do?’ but rather ‘what is it doing?’ thereby focusing on its function and
role in the situated educational context’ (p. 127). Taking such an approach at the
University of Edinburgh, they developed an evaluation instrument based on Wenger’s
(1998) Learning Architecture Framework (LAF) for communities of practice. This
instrument is designed to evaluate how successfully a VLE integrated into an existing
course supports nine dimensions of the community of practice that can be presumed
to exist for that course. If the presumption of an extant community of practice cannot
be validly made, for example for a short course or in a modular programme of study,
the instrument cannot be used. The application of this tool in a long-standing
medical programme at Edinburgh University revealed areas for improvement in the
LAF dimensions of ‘course coordination (e.g. timetables), jurisdiction (e.g. rules and
authority) and exploration (e.g. secondary learning materials) while other aspects are
relatively strong’ (p. 142).
4.5 Use of course review in institutional evaluation
In this section we move from a focus on satisfaction surveys at the institutional level
to course and module level evaluation. Evaluations at this level can serve a variety of
purposes. They provide institutions with a means to map blended e-learning
applications and pedagogies and they have the potential to gather rich data about the
student experience of e-learning.
All of our visit institutions described module and course evaluations as key sources of
evaluation data. Existing module evaluation instruments need to be adapted
however, as they tend not to gather appropriate data. Deepshire 1 explained:
One of the questions for our school is, how do you actually evaluate the
students’ experience of this when they’re out in the practice setting? So we just
thought, oh well, that’s not going to be a problem, we’ll just use our ordinary
module evaluation. Well it wasn’t going to work was it? It’s not asking the right
sort of questions.
Few had systematic ways of drawing module/course level data upwards for
institutional level analysis. As described in Box 4.1, Westington did this using
standard module evaluations that were used to inform department action plans.
Eastonhall was just about to accomplish this with ‘a question or two being added’ to
‘module questionnaires and end of stage questionnaires' (Eastonhall 2). Longside
has incorporated a single item into their module evaluation form asking students to
rate the impact of online learning on their work. We were told:
We find a very strong correlation between ‘yes, it has very much enhanced my
work’, and those that use it. Of course there will be twenty five, thirty percent of
‘don't knows’, or ‘can't answer’, because they're not using it. But when it is
answered, it is always, yes it is enhancing my work. We never get the other
response. (Longside 2)
As mentioned above, Westington is investigating technologies that will help it gather
data about how its VLE is being used for learning, but none of the other visit
institutions are doing this. Sharpe, Benfield and Francis (2006) refer to a system
being used at Oxford Brookes University to classify the ‘Mode of Engagement’ of
each VLE module by its primary learning uses. Mode 1 VLE use means providing
basic course information online, e.g. module handbooks, lecture notes, and similar
downloadable resources. Mode 2 use, blended learning, emphasises use of the VLE
as an environment for active learning. It is divided into four sub-categories – enhanced
content, communication, assessment and feedback, and collaboration. Mode 3 refers
to a fully online module or course. An audit database holds these descriptors about
each VLE site. This database should allow the institution to gather more detailed
information about the pedagogical enhancements being provided by the VLE. It can,
for example, allow an institutional evaluation to locate all the modules using the VLE
communication tools for more detailed investigation.
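To make the structure of such an audit database concrete, the following minimal
sketch shows how descriptors of this kind could be stored and queried. It is purely
illustrative: the table name, column names and module codes are our own
assumptions and do not describe the Oxford Brookes system itself.

import sqlite3

# Purely illustrative sketch of an audit database of VLE module descriptors.
# Schema and example module codes are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vle_module_audit (
        module_code  TEXT PRIMARY KEY,
        mode         INTEGER,   -- 1 = basic information, 2 = blended, 3 = fully online
        sub_category TEXT       -- for mode 2: 'enhanced content', 'communication',
                                -- 'assessment and feedback' or 'collaboration'
    )
""")
conn.executemany(
    "INSERT INTO vle_module_audit VALUES (?, ?, ?)",
    [("U101", 1, None),
     ("U202", 2, "communication"),
     ("U303", 2, "assessment and feedback"),
     ("U404", 3, None)],
)

# Locate all modules using the VLE communication tools for more detailed investigation.
rows = conn.execute(
    "SELECT module_code FROM vle_module_audit WHERE mode = 2 AND sub_category = ?",
    ("communication",),
).fetchall()
print([code for (code,) in rows])   # ['U202']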
The function of regular course and module reviews in relation to blended learning at
our visit institutions was mainly quality assurance. For example, at Deepshire
there is an annual review of each program. And so within that annual
programme review, where there has been, in inverted commas, what I might
call modification or development, then you would anticipate that [feedback on e-
learning] would be mentioned. (Deepshire 1)
This approach rarely captures information about impact on students’ learning
experiences, but aims primarily at ‘problem catching’. For example,
we have student questionnaires which all students on all modules do … that
would say well, actually there’s something wrong with this module, and we
could then unpack what that is. We look at it in terms of student attainment, and
modules which are way out of line with other modules get looked at and people
get asked questions like, ok, ‘well how are you using [the VLE] help and support
to structure the student learning here?’ (Metroville 1)
Both Metroville and Blackwater explained that they saw their five-yearly course
review process as the most important and potentially valuable quality assurance
mechanism. Metroville had implemented a process ‘for the last two or three years’ in
which course teams are annually required to report upwards on the ‘basic level of
take up’ of the VLE. But
we probably need to go beyond that. The monitoring process won’t be as useful
for [quality assurance] as the review process. So the five year review when we
go into each subject area and really unpack things in detail will be an
opportunity just to see where they are with [the VLE]. (Metroville 1)
These more rigorous five-year review processes gather higher quality evidence than
regular module evaluations and, because they are embedded in a course renewal
process, are more likely to lead to blended learning innovations. Still, they are limited
by the long period between successive reviews, and validation systems need to
be adapted to make blended e-learning reporting an explicit element of the process.
4.6 Use of case studies in institutional evaluations
Case studies are the primary way that institutions put together a picture of the impact
of their blended learning innovations on the student learning experience.
The examples range from very large, externally funded projects, through to
internally funded through self generated income in the law school, through to
activity at a local level with people implementing different techniques and so on.
(Kilderhill 1)
Particularly in the early, ‘bedding in’ phase of institutional e-learning, development
typically involves pump-priming projects to stimulate development and provide
exemplars for the academic community (Stiles, 2003). Such projects almost always
involve a formal dissemination requirement, and usually the larger the price tag the
more rigorous the evaluation that is required.
Eastonhall and Longside’s schools of Health exemplify this. The professional bodies
for health care practitioners are currently driving the development of flexible, inter-
professional, pre-registration programmes and so both institutions have embarked on
substantial blended e-learning innovations involving large, multi-professional cohorts.
The ambitious scale of these projects means that both schools are allocating
resources to evaluating their impact and case studies are likely to be published in a
year or two.
Most of the institutions we visited used some form of incentives to establish
innovative blended e-learning projects and these ‘development projects have inbuilt
evaluation’ (Westington 1). The case studies in Boxes 4.2 and 4.3 illustrate two
contrasting approaches.
Box 4.2
Internal project funding in an old, research-focused institution
Eastonhall is an ‘old’ university with a focus on research. It has no tradition of
internally funding major teaching and learning projects. It does, like many other UK
HEIs, have a teachers’ prize that it calls the Chancellor’s Award, modelled on the
National Teaching Fellowship Scheme that ‘gives five thousand pounds just for a
person to spend on a project plus an automatic salary increment’ (Eastonhall 1). This
award will sometimes involve a blended learning innovation whose evaluation is
disseminated.
It is a significant innovation at Eastonhall to have established a bidding process for a
set of sizeable e-learning grants that lead to funded projects. Five projects of at least
£25,000 each were funded in 2006. Although it was not a criterion for selection, the
projects are distributed evenly across the University’s academic schools. The key
selection criteria were impact on the student experience and evaluation of that
impact. Each of these projects will lead to a major evaluation output widely
disseminated across the institution and probably nationally as well.
The university’s small team of educational development staff are integral to this
process. They helped to formulate and implement the bidding and selection process
and will play a key role in providing support for effective dissemination of the results
of these projects.
There is a ‘widespread demand among practitioners for evaluation studies'
(Beetham, 2005b, p 85). A national project to help address this need was launched
by the JISC, which funded a set of case studies that are freely available to
practitioners. An associated programme of national workshops aimed at ‘advanced
practitioners, e-learning champions, staff developers and other people supporting
effective practice with e-learning’ disseminated tools such as an Effective Practice
Evaluator and a case study template developed in association with the Academy
Subject Centres (JISC, 2005b). These templates explicitly aim to assist institutions
with the resource-intensive work of collecting case studies that document the lessons
of their educational innovations.
Collecting case studies and then finding ways to disseminate them effectively is a
major staff development effort for most institutions. Westington hosts case studies
and exemplars within its VLE site. Most, however, house their case studies within
web sites of their educational development or e-learning units. Longside has an
extensive collection of institutional case studies and papers on e-learning but there
has been a noticeable drop in new additions to these in the last two years. Longside
3 explained that ‘we haven’t got a resource … we’re very understaffed’.
Although practitioners find case studies to be informative, there is little evidence of
teaching staff accessing resources for their own use without mediation from others
(Beetham, 2002). So, educational developers, learning technologists and others with
a role in the development of blended e-learning often devote considerable time and
energy to gathering good examples to use in their work with staff. The impact of such
efforts is maximised by robust institutional mechanisms for sharing the examples.
Regular internal conferences and prominent internal journals were often cited by our
visit institutions as key methods.
Case studies present complex issues to do with making them accessible to
practitioners. Many factors influence how ‘accessible’ they are. A familiar or
consistent style can be helpful, but if they are too ‘uniform’ they may lose some
‘fitness for purpose’ in adequately describing the important features of the innovation,
especially those that are transferable to other contexts or disciplines. Practitioners
frequently prefer to see examples from their own disciplines, rather than generic
ones, and they do not want them unduly ‘sanitised’. In short, as well as the
successes they want to know what failed or was hard to accomplish. Consequently
active institutional networks of educational developers, learning technologists and e-
learning practitioners have an important role to play as ‘social repositories’ and
disseminators of such institutional case studies.
Box 4.3
Promoting pedagogic research in a new university
Longside is a ‘new’ university with a focus on teaching quality. It has invested
significant resources into embedding and documenting e-learning innovations. It has
gathered and disseminated an impressive collection of case studies of institutional
blended e-learning innovations over many years and it has well-established methods
of internal and external dissemination. The university has achieved this with effective
use of its educational development unit and Teaching Fellows.
Support for generating case studies
Longside has funded internal Teaching Fellows at Principal Lecturer level who
support blended e-learning and it has three National Teaching Fellows involved in e-
learning. Additionally Longside has a sizeable and prominent educational
development unit that has played a key role in promoting, collecting and
disseminating case studies. Members of the educational development unit offer
consultancy and support to academic staff in developing blended e-learning
innovations. They actively seek opportunities to document this work, sometimes as
relatively short narratives of the project and sometimes as more formal research papers.
Some are authored by the educational developers, some by the module teachers,
others are collaborative efforts.
Dissemination
The educational developers and teaching fellows are involved in externally funded
national and international e-learning projects leading to dissemination outputs and
they attend conferences that disseminate examples of blended learning innovations
from their institution. They also play a key role in the user network associated with
their institutional VLE.
Dissemination occurs in both formal and informal ways. Formally there are events
like academic courses, workshops and seminars and the educational development
unit hosts an extensive set of case studies on its web site. A well-attended annual
internal learning and teaching conference provides a ready-made avenue for staff to
disseminate their work. Day-to-day links between
educational developers, teaching fellows and academic staff involve informal, often
word-of-mouth dissemination of case studies and exemplars from both inside and
outside the institution.
This high profile case study collection work is supplemented at Longside by its
PGCert programme for new teaching staff. This course incorporates action learning
and work-based learning and is assessed by a portfolio of work. In one of the
modules within the certificate participants need to demonstrate an innovation applied
within a teaching and learning context. Often this takes the form of a small case
study of blended e-learning innovation and evaluation.
Strategic alignment
A recent strategic initiative by the University to increase research activity gives this
work added impetus. This policy aims to promote active research by all academics, if
not in their discipline then in pedagogic research. A workload allocation model that
includes explicit time allocations for such research is being implemented to resource
the initiative.
4.7 Recommendations for institutional evaluation
In response to the interest in conducting institutional level evaluations, we offer the
following recommendations based on our review:
For large scale surveys, careful design of the sampling strategy is critical.
Self-selecting surveys might lead to samples that do not allow conclusions to
be drawn for the general student population. Haywood et al. (2004) at the
University of Edinburgh found that their self-selecting respondents were
generally representative of the undergraduate population for age and gender.
On the other hand, the University of Eastonhall (2004) found that females and
just two academic schools dominated their sample in a survey of nearly 700
self-selected student users of their VLE.
Purposive sampling can be an effective way of addressing specific
institutional questions. For example, Haywood et al. (2004) used VLE activity
logs to sample from students designated as high users of the VLE. Alternatively,
an institution might want to focus on a demographic group that is important
to its strategy, e.g. students with disabilities, international students or
mature students (a minimal illustration of sampling from activity logs is
sketched after these recommendations).
Survey presentation and delivery methods should be constructed to maximise
rates of return and avoid biased samples. Haywood et al. (2004) reported
better response rates to their online survey than paper ones but their
sampling strategy targeted ‘high’ level users of the VLE. To improve on poor
rates of return for online surveys Metroville used email links to an online
survey while Westington delivered surveys in lectures using paper-based
forms suitable for optical mark reading. Offering incentives such as small
prizes to randomly selected respondents can also improve response rates.
Saunders and Pincas (2004) offered their survey both online and face-to-face
so as to prevent bias towards highly IT-literate respondents.
It can be worth trialling survey items first using follow-up interviews to
ascertain how respondents interpret them. This will help avoid using
meaningless or potentially leading questions, as for example, where a
‘satisfaction’ item might unwittingly carry an implication that the service could
be withdrawn.
Ask some questions which offer students the opportunity to give ‘new’
information. For example, Haywood et al. (2004) asked students to list three
things they chose to do with the VLE and three they were compelled to do.
These items are open-ended and so need more time and expertise
to code and analyse than standard ‘tick the box’ items.
Make sure the results are used. Evaluation reports can easily gather dust on
a shelf. A good example of responding to evaluation results is offered by
Weyers et al. (2004) at Dundee. They reported staff finding student
comments ‘a real eye-opener’. Amongst generally high levels of satisfaction
with their VLE, a complaint was inconsistency in how staff used the VLE. The
evaluators then wrote to heads of departments giving advice on departmental
approaches to the VLE, including the use of standardised templates and guidance
on communication methods.
Triangulate surveys with other data. For example, Aspden and Helm (2004)
asked nine self-selecting students who were registered on four or more VLE
sites to keep diaries for two weeks, after which they were interviewed. This
small amount of rich data provided insights into the results of an earlier
evaluation of how students use the VLE.
Consider carrying out institutional surveys on three- or four-yearly cycles
rather than annually: less frequent surveys can gather richer, more varied
data, be better aligned with strategic planning and be better resourced than
annual student satisfaction surveys.
Provided these less frequent surveys are conducted against a background of
effective course/module level evaluation, they can provide results that
institutions find easier to act on than the more frequent, less thorough variety.
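As a purely illustrative sketch of the purposive sampling recommendation above
(the file name, column names and ‘top quartile’ threshold are our own assumptions,
not details taken from any of the studies or institutions reviewed), an evaluator
might identify ‘high users’ from VLE activity logs as follows:

import pandas as pd

# Purely illustrative: one row per VLE login, with hypothetical column names.
logs = pd.read_csv("vle_activity_log.csv", parse_dates=["login_time"])

# Count logins per student over the survey period.
logins_per_student = logs.groupby("student_id").size()

# Define 'high users' as, say, the top quartile by number of logins.
threshold = logins_per_student.quantile(0.75)
high_users = logins_per_student[logins_per_student >= threshold].index

# Invite a random sample of up to 200 high users to the survey or interviews.
invited = pd.Series(high_users).sample(n=min(200, len(high_users)), random_state=1)
print(invited.tolist()[:10])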
4.8 Approaches to course evaluations
In Chapter 2 we identified transformative course level practices, underpinned by
radical course designs, as a small but growing characterisation of blended e-learning.
This type of blended e-learning is well represented in the pedagogic literature which
is dominated by course level evaluations. Again, we found a great deal of interest in
the approaches, methods and tools employed by such evaluation research. Since
any research methodology is driven by the aims and purposes of the research, here
we summarise approaches to course evaluations according to their primary purpose.
We use Chelimsky and Shadish’s (1997) framework for understanding the purposes
of evaluation: evaluation for accountability (to find evidence of improvement in student
performance), evaluation for knowledge (to better understand the student
experience) and evaluation for development (to lead to course improvements).
4.9 Evaluations of the impact on student performance
Evaluation for accountability requires some criteria against which the evaluation may
be measured, such as student performance in educational contexts. Evaluations of
blended e-learning which have attempted to show correlations between student
performance and students’ use of technology are relatively infrequent. In part this arises
from the difficulty of isolating the variable ‘use of technology’ in course designs, or
indeed of isolating any single variable in dynamically complex educational activity
systems. For example, Davies (2004) reports improved development of higher order
thinking skills by higher performing students through an online peer marking
exercise. No doubt the computerised marking process facilitated the design and
delivery of the exercise, but it is unclear to what extent peer marking, the use of
technology, or the combination of both led to improved outcomes. Kennedy and
Cutts (2005) investigated the impact of electronic voting systems (EVS) in a first year
computing subject in an Australian university. They found a significant correlation
between performance on the module and level of accurate responses using EVS
during lectures. They were unable to adequately account for the association, which
may have been connected with motivation, ability, attendance in lectures, or even
ways of thinking in response to use of the technology.
Davies and Graff (2005) looked for a correlation between level of interaction in online
discussion groups and final grades. Despite some associations between levels of
online activity and grades, they concluded that higher levels of participation did not
necessarily result in better grades. A variety of methodological issues emerge from
this example. An association between online participation and grades might have
been more likely if the variable had been quality rather than quantity of participation.
The difficulty with designing such an investigation is arriving at a measurable and
widely agreed definition of quality of interaction. Other factors that might impact on
outcomes from online discussions concern the level of compulsion to participate and
with whom the interaction occurs, especially whether it includes the tutor.
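To make the methodological point concrete, the following minimal sketch shows the
kind of correlation analysis such studies carry out. The participation counts and
marks are invented for illustration; they are not the data reported by Davies and
Graff (2005).

from scipy.stats import spearmanr

# Invented illustrative data: messages posted per student and final module marks (%).
posts_per_student = [2, 15, 7, 0, 22, 5, 11, 3, 18, 9]
final_marks = [52, 68, 61, 48, 70, 55, 64, 58, 66, 60]

rho, p_value = spearmanr(posts_per_student, final_marks)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

# A count of posts measures only the quantity of participation; assessing quality
# would require coding the content of contributions, which is far harder to agree on.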
A second reason for relatively few studies showing improved outcomes as a direct
result of blended e-learning concerns evaluation design. Alexander (1999) reports
on a two year national study of the impact of ICT projects on student learning in
Australian higher education institutions. This study found that while 87% of the 111
projects reported intentions to improve student learning outcomes, only 37% could
actually report such an outcome. Flawed evaluation methodology was a major
contributor to this. She found that frequently project evaluations looked at student
reactions to and satisfaction with the project as measures of success and they failed
to gather meaningful evidence of improvement in student learning outcomes.
Alexander (pp 179-180) reported that using one or more of the following methods
was effective in evaluating impact on student outcomes:
comparison of performance of students who used the project with those who
did not use it;
a comparative study with control and treatment group and pre- and post-tests;
comparison of students' solutions to problems in examinations, with those of
students from other universities;
pre- and post-tests combined with student interviews;
review of students’ learning journals;
review of students' responses in examinations and overall performance in
assessment;
assessment of content and retention of learning;
questionnaire concerning students' experience of the project as well as their
reaction to it;
questionnaires concerning students' perceptions of learning outcomes;
questionnaires given to students before and after use of the project;
interviews with students about changes in their conceptions;
focus groups;
experts’ reviews; and
observation of students' use of the project.
Two recent examples showing improved student performance in module results
concern the use of computer aided assessment (CAA). Morris and Walker (2006) at
the University of Dundee show improved student outcomes in the form of
course/module results. Their project involved the radical redesign of a first year
Chemistry course that incorporated formative and summative use of CAA. Regular
and frequent formative online quizzes were introduced with multiple attempts allowed
on each and feedback on incorrect answers was provided. The pass rate rose from
73% to 93% between successive cohorts with the introduction of CAA. Comparison
of the entry qualifications of the cohorts showed a slightly less qualified CAA cohort
than their predecessors. Similar effects of CAA were found by Catley (2004), who
introduced optional formative quizzes in a compulsory first year law module and kept
all other aspects of the teaching and learning programme the same as before.
Students who did the optional quizzes performed significantly better than those who
did not and there was a rise in the pass rate and in the number of firsts in the module
as a whole.
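As an illustration of how such a rise in pass rate between successive cohorts might
be checked for statistical significance, the sketch below uses a simple two-by-two
test. The cohort sizes are hypothetical assumptions of ours; the raw numbers from
the published studies are not reproduced here.

from scipy.stats import chi2_contingency

# Hypothetical cohort sizes; pass rates of 73% and 93% as reported above.
pre_caa  = [146, 54]   # 200 students: passed, failed (assumed cohort size)
with_caa = [186, 14]   # 200 students: passed, failed (assumed cohort size)

chi2, p_value, dof, expected = chi2_contingency([pre_caa, with_caa])
print(f"chi-square = {chi2:.1f}, p = {p_value:.4f}")

# A low p-value suggests the change is unlikely to be chance alone, but it cannot by
# itself attribute the improvement to CAA rather than, say, differences between cohorts.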
4.10 Evaluations to understand the learner experience
Evaluation for knowledge aims to improve the understanding of the phenomenon.
Kember (2003) argues that experimental research designs are difficult to achieve in
naturalistic settings as control groups are difficult to establish practically or ethically.
He argues that evidence for the impact of teaching innovations should be derived
from triangulated designs. Triangulation, or the use of multiple methods, is an
approach that aims at reaching in-depth understanding of complex phenomena. It is
a strategy that ‘adds rigor, breadth, complexity, richness and depth to any inquiry’
(Denzin and Lincoln, 2003, p 8). Triangulation might involve any combination of
gathering different data sources, using different researchers, applying different
theories or perspectives to the same set of data, or using multiple methods to study
the same problem (Robson, 2002).
The Students’ Online Learning Experiences (SOLE) project (Timmis et al., 2004b)
designed a triangulated approach to provide an in-depth set of case studies based on
course modules across a range of subjects, with a wide range of data and collection
methods. The main elements, assuming a 10 week unit, are as follows.
Student questionnaires (weeks 1 and 9)
Student diaries (weeks 3 and 8)
Transaction logging (throughout)
Recording of interactions (throughout)
Interviews with tutor (weeks 1 and 9)
Interview with students (week 9)
Box 4.4
A triangulated evaluation of the introduction of Personal Response Systems
(PRS) in a first year engineering mechanics module at the University of
Strathclyde
Boyle and Nicol (2003) used a multi-method approach to evaluating the introduction
of the PRS electronic voting system. The evaluation methodology involved:
- five focus groups (each with 6 students) that met twice
- a critical incident questionnaire in the form of an A4 sheet with five questions and
spaces for comments to record immediate experiences
- a 36 statement Likert scale survey derived from issues that emerged from the focus
group data
- a focus group discussion with 6 staff.
As well as showing high levels of student satisfaction with this introduction of
technology into the classroom, the gathering of a variety of rich data allowed the
project team to understand the ways in which the technology made a difference to
students. For example, students commented on increased levels of motivation to be
actively engaged in classroom sessions because they knew they might be called
upon to respond; that the new structure of lectures allowed time for reflection and
discussion; and that they valued the time given to exploratory talk about new
concepts.
In America, Cook, Cohen and Owsten (2003) evaluated technology enhancements to
a large (444 students) introductory macroeconomics course. They triangulated data
from server log files, a survey, a focus group and the text of an online discussion
thread that sought students’ reactions to the use of technology in the course. They
also found significant positive correlations between students’ number of hits and
overall course mark, and between number of sessions and overall course mark. The
qualitative data acquired from the focus groups, open-ended survey items and online
texts helped illustrate why certain usage patterns had emerged from the log files. For
example, log files showed that peer interaction on the course was limited. The
qualitative data showed that students nevertheless valued such interaction and that
use of the technology could be redesigned to improve student-student interaction.
Clarke et al. (2004) also used a mixed method approach, illustrating the value of
multi-method designs even in a small study. They
combined online tracking data about students’ use of multiple choice questions
(MCQs) with focus groups that helped them to interpret this data. Students self-
selected into control, paper-based or online MCQ groups. The evaluators
were surprised to find that students’ study patterns were entirely governed by the
teaching timetable and that lecture notes were seen as vitally important. Usage data
showed that students were accessing links embedded in the MCQs, although in the
focus groups they denied following them up. This suggested that they were
predominantly using a ‘glance and bookmark’ strategy, saving resources for possible
use at a later time rather than following them up immediately.
4.11 Evaluations leading to course improvements
Evaluation for development implies an original aim of improvement and we noted
studies in the literature that were explicit about how the data collected had informed
the course team’s decision making process. These examples highlight the
importance of not just course redesign, but iterative course redesign which makes
use of evaluative feedback, as a critical success factor in blended e-learning.
Weyers et al. (2004) conducted an institutional survey at the University of Dundee
and reported staff finding student comments ‘a real eye-opener’. Amongst generally
high levels of satisfaction with their VLE, a complaint was inconsistency in how staff
used the VLE. The evaluators then wrote to heads of departments giving advice on
departmental approaches to the VLE, including the use of standardised templates and
guidance on communication methods.
O’Toole and Absalom (2003) in an Australian university noted that lecture attendance
had dropped with the introduction of full text notes and recommended readings to
mirror the lectures. In this example of evidence informed teaching, they state at the
outset of the study that the results will inform what they do next: either to move
towards virtual tutorials or remove the existing material from the internet and
encourage lecture attendance. They found that students who attended lectures
outperformed all other groupings of students by about 20% in a surprise quiz in the
last tutorial of term. They propose not to require attendance but to change the type
of material provided to supplement the lecture. Rather than providing an
independent-study alternative to the lecture, they plan to use the web to prompt
interaction, e.g. lecture summaries with questions.
Hughes and Daykin (2002) evaluated an undergraduate module in nursing
management with 220 students, where students and staff had minimal ICT skills. They used content
analysis of student and staff online communications and two focus groups,
purposively sampled from those who had read and posted the largest number of
contributions, those who had posted a moderate number and those who had hardly
posted at all. They found high levels of student anxiety associated with not initially
understanding expectations of them and over assessment. Course teams are able to
respond directly to findings such as these. This study exemplifies the potential for a
direct impact on the design and delivery of a course.
4.12 Evaluating the holistic learner experience
The third characterisation of blended e-learning was a holistic view of the learner
experience that includes the use of technology by the student to support their
learning. Such studies are extremely rare, yet we note that as students entering
higher education become more experienced in their use of technology, it will be vital
that we develop experience in evaluation approaches and techniques which will help
us understand and empathise with their experience.
Aspden and Helm (2004) describe a qualitative study that builds upon earlier
institutional evaluation work and uses a smaller scale, focused study to complement
the earlier one. Although their study was not conducted at the course level, it would
apply equally well in such a context. Their approach was designed to investigate the
student experience of using the VLE in a holistic way, i.e. to shed light on the
complex interplay of the virtual and the physical and the choices that learners make
in finding pathways to successful outcomes. The first phase of their research used
interviews and observation to investigate student experiences of using the
institutional VLE. They purposively sampled from high level users of the VLE. To
extend this work in the next phase they recruited nine learners from the first phase
investigation to keep diaries for two weeks. In this way they sought more insights into
how the blended environment facilitated interaction and making connections. For
example, the diaries were able to highlight how for students on placements, ‘the
virtual presence of the institution facilitates the maintenance of connections’ (p. 249).
They illustrated how an asynchronous discussion board can provide the time and
flexibility for a student to reflect on a problem, develop confidence and become either
more engaged or re-engaged with the class.
The Learner’s Experience of e-Learning (LEX) research study conducted interviews
and/or focus groups with 55 learners (Creanor et al., 2006b). In order to help elicit
learner beliefs, intentions, motivations and feelings during the interview, they
presented students with an artefact from their learning experience (e.g. a learning
diary, blog, discussion transcript or e-portfolio) and used it to prompt discussions.
The interview questioning and analysis of transcripts used interpretative
phenomenological analysis, which supports interviewees in telling their own
detailed story. The methodology employed has enabled this project to provide a rich
dataset of student experiences of e-learning across a range of sectors and
technology uses.
4.13 Longitudinal evaluations
Robin Mason recommended in her keynote at the 2001 Improving Student Learning
conference (Mason, 2001) that large scale longitudinal studies of e-learning were
urgently required. Partly this is to do with how the student experience is likely to differ
in embedded as opposed to innovative implementations. The incorporation of
learning technologies that make a real impact on student learning often involves
radical course redesign and it is possible to evaluate such projects too early, when
the real impact has yet to become measurable or stable. Stubbs, Martin and Endler
(2006) conclude from their evaluation of such a radical redesign that evaluation
should be improved through ‘careful study of rich, longitudinal data’ (p. 174).
Longitudinal studies may also be concerned with the student experience through a
number of different courses. Arbaugh (2004) followed students in an American
university over four years as they participated in online courses and reports positive
changes in student satisfaction and perceptions over this period. Students developed
more positive attitudes towards interaction with others and the ease of use of the
environment, and this difference was most marked between their first and second
experience. This is clearly an under-researched area.
4.14 Ethnographic evaluations
McConnell (2005) illustrates the potential importance and power of the ethnographic
approach to understanding the student experience of technology-mediated learning
environments. Although it analyses a postgraduate course and is therefore not strictly
part of this review, we refer to it here because it offers an accessible recent example
of this approach. McConnell followed the work of three online groups working in
parallel. He analysed their interactions over the course of the activity, which, since
they were online groups, amounted to many hundreds of pages of texts of
asynchronous and synchronous online discussions. The paper offers a rich
description and analysis of a range of complex team dynamics to which teachers
rarely gain access. It details how members reacted to delayed responses to
messages, levels of anxiety in individuals and how this impacted on team
performance, the role of strong personalities and the decision-making processes
adopted in the groups, and the ways in which tutors’ interventions affected the teams
both positively and negatively.
5 The learner experience of blended e-learning
5.1 Overview
The fourth research question aimed to address the impact of blended e-learning on
the student experience.
There is general agreement that the student perspective is under-represented in e-
learning research generally (Sharpe et al, 2005; Saunders and Pincas, 2004;
Beetham, 2005b) and in reviews of specific technologies such as the use of
interactive whiteboards (Smith, Higgins et al, 2005) and e-portfolios (Tosh et al.,
2005). Moreover, where students have been asked to share their perspectives, there
are examples of student experiences being markedly different from those reported by
staff (Williams, 2002) or students giving reasons for supporting the blend that were
different from why the blend was introduced (Timmis et al, 2004a; Clark and James,
2005). The Enhancing Teaching-Learning Environments in Undergraduate Courses
project has found differences between the perspectives of students and
practitioners when looking at the whole teaching and learning experience, and we
see no reason why blended e-learning should be any different:
Besides all the complexity created by marked differences across subject areas
and myriad individual differences among both staff and students which prevent
simple patterns emerging, there are additional crucial differences between the
idealized world described by research and the actual world experienced by the
participants. (Entwistle et al., 2002)
Following the three characterisations of blended e-learning introduced in Chapter 2,
this chapter starts by asking how students experience the provision of course
information and supplementary course resources. In this section, we draw mostly on
the findings from institutional evaluations of VLE use. We find that the student
response is overwhelmingly positive and students are making regular and frequent
use of electronic resources with few reported problems of access. Students value
flexible access from home, and mention course notes as the most useful resource.
The impact of the provision of course notes is discussed in relation to support for
students with disabilities, and the impact on attendance. Students are concerned
about the costs associated with downloading and printing and are critical of
inconsistent use between staff and modules.
We assess next the impact of technology on the student experience of
communication and interaction with course content and with others. Here we draw
mainly on evaluations of courses which have been redesigned with clear principles
underlying the changes they have introduced. Here the emphasis shifts away from
the VLE to take account of other technologies that are available to support and
perhaps even change how students learn. There are many case studies from
dedicated and innovative lecturers available and we have been selective in those we
have chosen to report. We find that while students recognise the value in the blend of
face to face and technology supported activities, there are large individual differences
in how they experience the blend. We have seen that students vary in how they
conceive of their own, and their tutors’, roles in the learning process and this may be
critical to their success.
There is an increasing recognition that students are making use of their own
technologies as well as those provided for them, and that they are doing this in ways
that are not planned for, are difficult to predict and may not be immediately visible to their
teachers and researchers. Taking a holistic view requires an understanding of the
individual and here we review the impact of individual differences. Where there is
difference such as disability or culture, we find that this dominates the experience for
the individual, although not in ways that are stable or predictable. The individual
difference which seems to have most importance is the combined influence of prior
experience and attitudes towards using computers within learning.
5.2 The student experience of online supplementary course
resources
We’ve never done any surveys, ever, that have given anything other than the
students want more of it, wider and deeper. (Longside 1)
Studies which have measured student use of online resources show regular and
frequent usage (Boyle et al., 2003; Dickinson, 2005; Codone, 2004; Hibberd, 2006;
Nachmias and Segev, 2003). It appears that most students do choose to access
resources that are made available to supplement their courses even when they are
not required to do so. Molesworth (2004) was keen to find out the role of student
choice in this. In a marketing course at Bournemouth University he was careful not to
force students to use the supplementary resources through pressure from tutors or
assessment. He reports that all students accessed the resources, posted at least one
message and read several.
In an American study appropriately titled ‘If you build it, will they come?’ Cook, Cohen
and Owsten (2003) undertook a detailed analysis of the log files from WebCT from
the cohort of 444 students taking an introductory macroeconomics course. They
noted high levels of activity with students active every day. Their detailed analysis
showed that most logins were quick (60% of visits less than 1 min in length) and
speculated that students were checking for new announcements or postings. It is
likely that there are individual differences in usage hidden within these averages. For
example, Boyle et al. (2003) report that their online environment was used extensively by 304 students, with an average of 418 accesses per student over the semester. However, this average masks wide variation, from two students who used the system just five times each to one student who made 1469 visits.
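As an illustration only (not the analysis used in any of the studies cited), the kind of log-file summary described above could be produced with a short script such as the following; the record format and values are hypothetical:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical VLE access records: (student_id, session_length_in_minutes).
# The format and figures are illustrative, not data from the studies cited.
log = [
    ("s001", 0.5), ("s001", 25.0), ("s001", 0.8),
    ("s002", 0.7), ("s002", 12.0),
    ("s003", 1.5),
]

sessions = defaultdict(list)
for student, minutes in log:
    sessions[student].append(minutes)

visit_counts = [len(v) for v in sessions.values()]
durations = [m for v in sessions.values() for m in v]

print("Mean visits per student:", mean(visit_counts))
print("Fewest and most visits:", min(visit_counts), max(visit_counts))

# Proportion of very short visits, e.g. quick checks for new announcements
short = sum(1 for m in durations if m < 1.0) / len(durations)
print(f"Visits under one minute: {short:.0%}")
```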
There is a potential issue that the institutional surveys reviewed here have tended to
be returned by self-selecting students, with response rates of 5-14%. It is possible that the patterns of activity reported in them come from frequent users; indeed, Haywood et al. (2004) purposively sampled high users. The exception is the
Saunders and Pincas (2004) study which used a survey completed both online and
face to face so as not to bias results in favour of those students who might be
technologically competent. They found, though, that the results from the face to face completion supported those of the online survey.
5.3 Flexible access from home and campus
“Being able to complete IT skills/website evaluation sessions from home and in
my own time means that I can work when I want to. This means when I do
come to sit down and do the tasks, I put in more effort and therefore obtain
more benefit from the session than I would do under non-optional timetabled
classes”. (Student quoted in Weyers et al., 2004, p.14)
Saunders and Pincas (2004) found that full time undergraduate students at the
University of Westminster reported spending an average of 14.5 hours studying
outside the classroom each week, falling to 11.5 hours for part time students. This is clearly a significant amount of time over which students make choices about where and how to study. In addition, they report that full time students are also
working in paid employment for 5-40 hours a week (average 15). With such high
demands on student time, it is unsurprising that students are taking advantage of the
flexibility offered by technology to study at different times and places.
In their institutional VLE survey of 745 students, Weyers, Adamson and Murie (2004)
asked students at the University of Dundee about flexibility. They reported that 91%
of respondents agreed that the VLE enabled them to learn at a place and time of
their own choosing. Similarly, an internal evaluation conducted at the University of Deepshire reported that 91% of students learning a language in blended mode
agreed with the statement that ‘I liked the fact that I could access the on-line activities
any time.’ One student said:
“For me, it helped with the continuity in between face-to-face lessons in the
classroom. I like the fact that every day, or whenever I felt like it, I could just go
in and practice [sic].” (quoted in University of Deepshire internal evaluation,
2006, p.23)
This flexibility is demonstrated clearly in the times students choose to access their
VLE. Reports of significant student use both day and night and weekdays and
weekends are common (Dickinson, 2005; Haywood et al., 2004; Hibberd, 2006;
Jones and Fitzgibbon, 2002; Lugeba and Williams, 2004; Molesworth, 2004;
University of Eastonhall student evaluation of the VLE, 2004).
All surveys found that most students use a mixture of on and off campus computing
facilities. Even where student computer ownership is high, there is still a strong
dependence on university-provided computing facilities (Weyers et al., 2004). At the
University of Wales, Bangor, Smart and Holyfield (2004) report that 79% of students
used on campus computing labs and a similar number (72%) used home computers.
Similarly, Eastonhall’s VLE survey found that 73% of respondents accessed the VLE
on campus and 65% accessed from off campus (University of Eastonhall, 2004).
When asked to express a preference, students’ top preference was for their own
computer (62%) with a university computer as second choice (58%). It was noted
that there was still a significant minority (30%) using a university computer as their
first choice (Haywood et al., 2004). Gender differences in access have reduced over
time although women are still more likely to report using a ‘family resource’ at home
rather than their own computer (Kirkwood and Price, 2005).
As we might expect, facilitating flexible access is particularly important for multimedia
implementations. Davies et al. (2005), evaluating a course which provided videos to physiotherapy students, found from focus groups that students appreciated having copies of the videos on CD-ROM so that they could play and replay them at home or on university facilities in their own time. Clark and James (2005) provided weekly online
readings with questions to prompt discussions. Students preferred this way of
accessing readings, saying they found textbooks expensive and libraries
inconvenient.
Aspden and Helm (2004) from Sheffield Hallam University offer a glimpse of the student experience behind all these statistics. Following a small
number of students, they relate a story of a student living a considerable distance
from the campus who only attended on days she had scheduled teaching time. Her
log and diaries showed that she was using online facilities to plan her on campus
time more effectively. Searching online meant that she spent less time on campus
searching for information and more time interacting with peers. Considering the
degree to which students are requesting and making use of flexible access, it would be worthwhile to conduct further research like this to give a fuller picture of the reality of the student experience of flexible study.
5.4 Access to course notes and note taking
“It makes it easier to take notes in class and so I tend to learn more when I only have to write down the extras instead of the basics as well.”
(student quoted in Smart and Holyfield, 2004, p. 9)
This section summarises the findings of studies which have looked at the types of
resources students are accessing in their frequent logons. From an educational
viewpoint the provision of course notes and information is frequently played down as
being ‘transmissive’ and not enhancing learning. In the USA, Caruso (2004)
undertook a survey of nearly 4500 undergraduates, asking them about their
experience of using IT in conjunction with their learning. Caruso reports that most
use of technology by students was associated with VLEs. The benefits that students
perceived around this use were to do with access to materials and readings rather
than discussions and quizzes. She concludes that ‘today’s information technology
use in instruction is primarily for convenience and not for the higher goals of
improving learning’ (pp 7-8).
Surveys in the UK show that students choose to access lecture notes when offered online supplementary resources and that they are overwhelmingly positive about it.
Smart and Holyfield (2004) asked students what they used the VLE for. The top results were course documents (92%), course information (81%) and announcements (72%), followed by emailing the tutor (40%), submitting assignments (33%), quizzes (31%), following external links (32%) and discussion boards (23%). When asked what they liked about it, 41% said access to module documents and information. Similarly, the University of Eastonhall (2004) reports in its VLE survey that the main reasons students give for accessing the VLE are to obtain resources such as lecture notes (74%), followed by lecturer encouragement (66%), course management and administration e.g. calendars and announcements (60%), ease of use (53%), saving time (52%), ease of access to information (51%), and then assessment (28%), communication (19%), email (27%) and developing IT skills (24%). Metroville University (2004) asked students
how useful they found a variety of activities on Blackboard. As with the other surveys,
the top result is for lecture notes with 94.7% of students finding it useful or extremely
useful to download lecture notes.
These findings are typical; however, they could be influenced, at least in part, by the type of resources made available to students. Asking a slightly different question, which did not rely on current provision, Haywood et al. (2004) purposively sampled
576 high WebCT users and asked them, in a free text response question, what
activities and resources they would like to see more and less of. The students were
clear that they wanted more information online in the form of lecture notes and
presentations, information for learning, past exam papers and journals. A smaller
number of students, interestingly, wanted more quizzes and tests, more discussion and communication, or simply more use of WebCT across their courses.
When asked why they access lecture notes to such an extent, students report that
they sometimes unavoidably miss sessions due to illness, family circumstances or
employment commitments and it is useful to have the materials available to catch up.
“For some people who suffer from disabilities and have no choice but to take
time out, it is an enormous benefit in order to keep up with what is happening in
lectures and what areas to read up on. Brilliant!!!”
(student quote from University of Eastonhall VLE Survey, 2004, p.19)
Making lecture notes available before lectures reduces the disadvantages faced by
students with disabilities, particularly where the notes are made available in a flexible format that allows for customisation (Ball and Campy, 2002). Indeed many students
say that having the notes reduces their reliance on note taking and allows them to
focus on the content of lectures (Codone, 2004; Smart and Holyfield, 2004). Grabe and Christopherson (2005), in an American paper, make the useful suggestion that more research should focus on the types of notes we offer in this format. In their work they have tried outline notes made available before the lecture, and full notes taken during class by a paid note taker and made available afterwards. There is already a large literature on lecture notes from before the rise of the
VLE which would be relevant here (see McMullin and Munro, 2003 for review). Given
the positive student reaction to lecture notes, and the role of student feedback in
quality assurance, it is likely that they are here to stay. It would be beneficial then to
conduct research into the impact of different forms of notes and the times when they
best support learning.
Clarke et al. (2005), in a study which was actually about multiple choice questions, found out about their students’ study patterns and attitudes to study from the focus groups. They explained that for students, lecture notes are a vitally important resource for study; indeed the lecture handout ‘played a talismanic role in the students study process’ (p. 258).
Knowing how students feel about lecture notes, it is perhaps unsurprising that
they make an effort to have and hold them. Many surveys reported student
complaints about the time and expense associated with downloading and
printing notes (Concannon et al., 2005; Haywood et al, 2004; University of Eastonhall VLE Survey, 2004; Weyers et al, 2004; Williams, 2002).
“I strongly believe that students have enough of a hard time obtaining lecture
notes, tutorial tasks set from [the VLE] just for one module. It’s time
consuming, costly and frustrating to have to print off lengthy documentation
every week, documentation which could be made available to all students at
the beginning of each semester.” (student quote in University of Eastonhall
2004b, p.34)
5.5 Impact on patterns of attendance
Staff express a common concern that making lecture notes available will reduce
attendance. The surveys of VLE use show that a minority of students are honest
enough to admit that they are tempted to miss a few lectures because notes are
there (Smart and Holyfield, 2004). More systematic studies confirm that attendance is
an issue and that the provision of online information plays a role in decision making
for a minority of students. Newland (2004), in a study of the VLE at Durham University, reports that 2% of students state that having access to lecture notes in
advance would be a reason for them not attending lectures. Matheos et al. (2005)
report a much higher proportion of 26% of a cohort saying that they are less likely to
attend lectures when notes are made available electronically.
As online provision becomes less about retrieving notes and more about learning activities, the impact on attendance may increase.
Dickinson (2005) made extensive use of the VLE in a large accounting course.
Online notes and resources were placed on the VLE, along with weekly questions for
discussion and directed study tasks to complement the existing weekly lectures,
workbook and fortnightly seminars. In an end of course survey the students were
asked about their attendance in a question ‘Do you think having extensive
Blackboard support made you attend taught sessions more or less over the year?’.
67% of the 201 respondents reported that their attendance had not been affected,
23% attended less and 7% thought they had attended more.
O’Toole and Absalom (2003), in an Australian university, noted that lecture attendance had dropped with the introduction of full text notes and recommended readings on the VLE. They state explicitly at the outset of the study that the results will inform what they do next: either to move towards virtual tutorials or to remove the existing material from the internet and encourage attendance. They found that students who attended lectures outperformed all other groupings of students by about 20% in a surprise quiz in the last tutorial of term. They propose not to require attendance, but to change the type of material provided to supplement the lecture. Rather than providing an independent study alternative to the lecture, they plan to use the web to prompt interaction, for example by providing lecture summaries with questions.
Where attendance is considered essential, there are ways of tackling this. Stubbs,
Martin and Endlar (2006) provided lecture slides on their VLE and expected students
who missed an odd lecture to be able to use them. In addition, they rewarded regular
lecture attendees with assignment hints and tips. They report that the face to face
tutorials were characterised by more intense and in depth discussions as students
got to grips with the fact that they needed to complete online problems to gain access
to tutors and face to face support.
Scott (2004) distinguished between the value of attending lectures, where lectures
were mirrored in online PowerPoint presentations and the study guide, and
attendance at workshops. The value of attending workshops was confirmed by the tutor’s reactions to non-attendance: students who did not attend were contacted by email asking for their reasons for non-attendance and offering support, and if they did not respond they were followed up by phone or letter.
It is likely that lecture attendance is a more complex issue than the provision of
notes. The findings previously reported on the amounts of paid employment students
are undertaking, and the individual case studies, show that many students are
organising complex lives around multiple commitments and concurrent modularised
classes. The reality is that students will be choosing which sessions to attend. It is important that staff set clear expectations so that students can make informed and beneficial choices about attendance that best support their studies.
5.6 Inconsistency in use between staff and modules
“It was a useful source of information, so you could find out things like when
things were due in or what to hand in. I could always find that… I could go on
to Blackboard and have all the information at my fingertips… most of my
courses are paper based, and I defy anyone not to lose bits of paper. It’s very
useful having it all online, tied together like that”
(student quoted in Durkin, 2003)
Students reported one of the benefits of supplementary course resources as having
all the information together in a ‘one stop shop’ (Weyers et al., 2004; Codone, 2004)
and that they would like all modules to have an online presence:
“When we asked students if they had any other comments or suggestions, we
were pleased to read that students’ most common request was to have all their
modules supported through Blackboard.”
(Papachristou, 2003, p.5).
However, they qualified this by saying that the one stop shop was only effective if all
staff/modules used it in the same way and that lack of consistency made it difficult to
know where and how often to look for new material. Disabled students and those with slow reading speeds or difficulties with organisation are particularly disadvantaged by inconsistent use:
“I think each one has a different layout of the order of the buttons and what
actually is there, so I sit there for ages just looking, thinking which one do I
want”
(ALERT student reported in Newland, Pavey & Boyd, no date)
Students report that different staff, modules and departments use different facilities
as the official means of communication. These include VLE announcements, emails
or noticeboards and students have to check them all (Smart and Holyfield, 2004;
University of Eastonhall, 2004; Weyers et al., 2004). Both the University of Dundee and Longside have called for standardisation on the basis of student feedback:
The pro Vice-Chancellor of teaching and learning is getting so annoyed with
the students reporting to him that why don't all lecturers use the VLE, that he's
issued this version of the minimum standards of use. (Longside 4)
5.7 The student experience of blended courses redesigned to
promote interaction and communication
We have seen so far that the most common use of technology is the provision of
course materials to supplement face to face teaching, usually through a VLE. It is
clear from surveys of use and satisfaction that students value having flexible access to course materials and do make use of such provision. For some students, such as those with disabilities or those who are not able to attend all the face to face sessions, this provision should enhance their learning experience. It is not clear that, on their
own, provision of course information enhances learning for all, despite this often
being a primary rationale for the development of technology supported learning and the institutional adoption of VLEs (Alexander, 1999; JISC, 2005a). It is more likely
that enhancements will come from blends of face to face teaching with other types of
technology supported learning activities. It is the evaluations of these types of blends
which will inform how institutions develop their blended provision in the coming years.
It is clear that while students value face to face teaching and say that they do not
want technology to replace it, they also recognise the benefits of the integration of
the two (Enjelvin, 2005; Felix, 2001; Spicer and Stratford, 2001):
In each evaluation over 5 semesters, a majority of students have consistently
supported this blended approach suggesting they thought the two components
were complementary and well integrated.
(University of Deepshire internal evaluation, 2006, p. 20).
As outlined in Chapter 1, our aim is to give an insight into the lived experience of the
learners in embedded and evaluated blended situations. Our intention is that our
findings and recommendations are based on such practices in order to reduce the
impact of extraneous variables such as innovative course leaders and exceptional
online tutors. Our interest here is in how blended courses are experienced by
students of the majority of staff, not students of the early adopters. Also, it was clear
from our interviews that institutions felt that such evaluations were something that
was missing from their own institutions (see Chapter 4). We have been selective
about which studies to include and, as in previous chapters, have highlighted some
of our key studies.
5.8 Experience of online communication and collaboration
The potential of communicative technologies is well established in the e-learning field
(Laurillard, 1993). There is a large literature around computer mediated communications (CMC) and computer supported collaborative learning (CSCL) that
promotes the benefits of asynchronous text based discussions and makes
recommendations on how to use online discussions effectively (e.g. McConnell,
2000, 2005; Salmon, 2002, 2004). CMC was one of the first uses of the technology and was taken on board by early adopters, who reported in their evaluations with undergraduates that students valued the flexibility of the asynchronous nature of
online discussions and the chance to post detailed, reflective contributions (e.g.
Boyle and Cook, 2001; Lockyer et al., 1999, 2001).
Higher education staff continue to try to use technology to promote communication
but are not reporting the same consistently positive results. In the surveys of courses
making use of multiple features of the VLE, discussions frequently appear as the part
which is least used and valued by students (Ausburn, 2004; Dickinson, 2005;
Molesworth, 2004) and as being something that is difficult to engage undergraduate
students with (Metroville VLE report, 2004; O’Leary and Cai, 2004; Westington,
2003). Even in courses where discussion is well integrated into the course design,
there are still reports of students having difficulties in making good use of both
asynchronous and synchronous discussions (Clouder and Deepwell, 2004; Cox et
al., 2004; Dron et al., 2004; Heinze and Proctor 2004), simply choosing not to use
them (Grund et al., 2004) or choosing email over a discussion forum (Concannon et
al, 2005; McHugo et al., 2004; Schmidt, 2005).
Specifically, there are difficulties with students engaging in the level of dialogue expected, especially where it requires giving feedback to peers or reviewing and critiquing others’ work (Kear, 2004). Clouder and Deepwell (2004) set up discussion forums for
physiotherapy students while they were on placements. They were expected to note
critical incidents, relay them on the discussion forum and comment on each other’s incidents in order to promote reflective learning. They found that most students did post but were
not prepared to comment on each other’s work. The staff observed that ‘we were
clearly trying to foster something novel and therefore uncomfortable to students’.
Hughes and Daykin (2002) blended online group work with face to face teaching and found that students did critique each other’s essays. However, what tended to happen was that the stronger students posted first, and the rest of the group praised them. Like Clouder and Deepwell, they asked whether this was just because the activity was online, or because the students had never been asked to undertake this kind of activity before.
One of the SOLE case studies (O’Leary and Cai, 2004) demonstrated the lack of
confidence that students feel with online work and saw this as distinct from general IT
skills. The Economics case study reports that prior to the start of the module 92% of
the 216 students were confident using the Internet and 95% were confident obtaining
information from WebCT. In comparison, only 65% were confident taking part in
online discussions and confidence in using discussions actually decreased by the
end of the module. The SOLE case studies also identified genre confusion as a
possible barrier to communicating online. It was noted that students and tutors were
using different writing styles without any consistency or understanding of what was
appropriate (Timmis et al., 2004a).
It has also been suggested that discussions are more difficult to achieve in
undergraduate blended e-learning contexts because students have more
opportunities to meet face to face (Ausburn, 2004) or that this generation of students
are more familiar with synchronous communicative technologies like online chats and
texting (Timmis et al., 2004a). Perhaps the role of the tutor is so influential in online learning that, whilst the first proponents of e-learning tutored well, the next tranche will need more training and support to develop their online tutoring skills (Dron et al., 2004). Russell (2003) found only about 10% of students participating in
discussion forums to support a first year Fluid Mechanics and Thermodynamics
module at the University of Hertfordshire and suggested this might be due to a lack of
confidence of first year students in attributing their messages to themselves. He found that where students were allowed to adopt an alias, most of them chose to do so, particularly when they were starting a thread (asking a question) rather than responding to a previous posting.
A closer examination of studies which have sought out the student perspective
shows large individual differences in the students’ reactions to online communication
(Flynn et al., 2005; Molesworth, 2004). Sweeney et al. (2004), working in Australia,
evaluated student reactions to online collaborative groups. Some students
appreciated the shift in emphasis from tutor led face to face discussions to more
collaborative discussions with peers, others expected to have a model answer from
the tutor and were frustrated when it didn’t arrive. Some students appreciated that
working online allowed more considered responses, whereas others expressed
concern at the time needed to contribute effectively to online discussions.
Box 5.1
Communication as a challenge and an enabler on a part-time BSc in
Information Technology at the University of Salford
In this redesigned blended course, students attended one evening session each
week and were supported between sessions with online discussion based tasks. The
discussion based activities were brought in to build a sense of community in the
cohort with the aim of improving retention.
Data was collected, over a two year period, through interviews and focus groups with
staff and students and analysis of messages on student discussion boards.
In the first year, the discussion based activities were based on Salmon’s five step
model, moving through progressively more complex tasks starting with a face to face
induction session, an online icebreaker sharing prior experiences, and culminating in
two assessed discussions around research papers. About a quarter of the 40
students participated in the introductory discussion and these were generally the
students who engaged with the assessed discussion.
There were large individual differences in the use, perceptions and impact of the
discussion activities. While some students were comfortable with the technology
affording detailed and reflective posts, others felt daunted by the lengthy posts of
their peers. For some, the impact was so severe that it was cited in their reasons for withdrawal from the course:
“Due to recent events in my personal life and the frustration of not being able to
connect to the internet at an earlier date, I have decided not to return to the course
this year. I have already achieved [a qualification] but seeing what my fellow students
were contributing online with all their experience in IT where mine is mainly
educationally based, quite frankly, scared me and made me realise that I could be
letting my ‘team’ down.” (p.8)
In year two, the number of online communications expected was reduced. Students shared experiences online but completed their assessed work individually. The result was a loss of any sense of community in the group, and drop outs remained the same as in previous years.
Heinze and Proctor (2004)
Ellis and Calvo (2004) have attempted to explain such individual differences in a
phenomenographic study. They gave questionnaires designed to assess approaches
to and conceptions of learning to Australian students on a blended e-commerce
course. They found that ‘students who conceive of discussions as a useful way of
learning about the subject tend to engage in online discussions in a reflective and
meaningful way’ (p. 272). Students with such conceptions of learning were able to
adapt their approaches dependent on the medium. In face to face discussions they emphasised learning through the experience of others; in the online part of the course, their approach emphasised reflecting on problems discussed from a variety of perspectives. It is possible that how students conceive of their own learning could
be a crucial factor in predicting their success in blended learning environments.
Students who have a good understanding of their role and the tutor’s role may be
less likely to suffer from frustrations of communicating online. However, Goodyear et
al. (2003) report no differences in networked learning experiences which could be
attributed to conceptions of or approaches to studying.
5.9 Experience of classroom communication systems
Technology has also been used, in quite a different way, to improve communication
in the classroom through the use of electronic voting systems (also known as
personal or audience response systems) (Banks, 2006; Draper and Brown, 2004;
Judson and Sawada, 2002; Simpson and Oliver, 2002).
Boyle and Nicol (2003) used a voting system to promote in-class discussion in the
engineering department at the University of Strathclyde. Students were presented with multiple choice questions and gave their answers via their handsets. The responses were immediately collated and presented back on screen. In this case study the answers given by the group were used to promote in-class discussion as students defended their answers and challenged others. Feedback was collected from
the students for the evaluation through focus groups, in-class critical incident
questionnaires and a Likert scale survey completed at the course end. The majority
of students (74%) agreed that the teaching methods were more effective than other
classes in helping them understand engineering concepts. Interestingly, in relation to
the previous discussion about understanding their own learning, students in the focus groups attributed their learning to being active in class, having time to think and reflect (rather than take notes) and discussing with their peers. Over half the
students reported that knowing that they might be asked to explain the thinking
behind their answer encouraged them to formulate their explanations. It seems that
the way the technology has been integrated into the class and explained to the
students has helped them reach a sophisticated and beneficial understanding of their
role in their own learning.
As with online communication, a more complex picture of mixed student reactions
emerges from a review of literature on electronic voting systems with student groups
providing a mixture of supportive and sceptical comments (Simpson and Oliver,
forthcoming).
5.10 Experience of interaction with content
The studies in this section report on courses which blended face to face with
technology supported interaction with content. This particular type of blend often
stems from the teaching of complex or unfamiliar topics which students find
particularly problematic.
Introductory programming has been identified as one of these difficult topics (Sayers,
Nicell and Hagan, 2004; Scott, 2004). The case study in Box 5.2 offers one possible
solution through the creation and use of learning objects. This example highlights the
course redesign as a critical success factor. Whilst outside the scope of this review,
there is a large and growing literature on the development, use and reuse of learning
objects such as the ones developed for this case study (see for example Littlejohn,
2003; Koper et al., 2004). The challenges for the wide scale use of learning objects
include their integration into VLEs and the changes to staff practices around
curriculum development.
Box 5.2
A redesigned blended course for introductory programming at London
Metropolitan University and University of Bolton
The introductory programming courses were redesigned to support the learning of
difficult topics through interaction with learning objects. This was in response to poor
pass rates and a 50% increase in class sizes. Animated and textual learning aids
were developed to demonstrate some programming constructs or operations.
The course retained lectures and computer labs to provide contact with subject experts and explanation of some theoretical topics. Between each session, students were
expected to use the learning aids to help them complete the weekly task sheets. In
the first half of the semester, students used a specially created graphics library which
allowed them to construct simple programs that produced and manipulated simple
shapes. This basic design was delivered on three modules at two institutions
affecting about 600 students in total.
The course was evaluated using student questionnaires at various points during the
course as well as recording final grades.
The mid point questionnaire was significant as this is often the point where some
students are not progressing well and show low motivation. Although 22% did report
being disappointed with their progress, they were better motivated at this point than
usual and able to keep working. Pass rates increased by 12-23% on the previous
year in the modules which followed this blended approach.
Boyle et al. (2003) and Boyle (2005)
Holbrook and Devonshire (2005) designed computer aided tutorials to simulate
scientific thinking and facilitate understanding of abstract concepts for Australian
students studying atmospheric science and oceanography. Again the authors report
the importance of course redesign and particularly of constructive alignment of
learning objectives, learning process and assessment tasks. Staff perceived that
students engaged with the task ‘at a more rigorous and deeper level’ (p.209) and
overall student performance was better for this task than for other assessed tasks in
the course.
Boyle et al. (2003) noted in their study in Box 5.2 that there was a wide variation in
students’ responses to their preferred learning aid (lecture, text aid, graphic aid) with
the only common theme being the unpopularity of the text book. Brown and Leidholm
(2004) also found that students demonstrated individual differences in their preferred
choice of format. Having provided students with a variety of formats including
textbook, PowerPoint slides, streaming video and quizzes, they found that each
course resource was preferred as the first point of contact with the subject matter for
at least some of the students.
5.11 Experience of online assessment and feedback
As mentioned in Chapter 4, there is evidence of correlations between the introduction
of online assessment and improvements in learning outcomes and exam
performance (Clarke et al, 2004; Morris and Walker, 2006; Russell and Bullen, 2005),
at least for those students who use it (Catley, 2004). There is also evidence from
embedded studies that making at least some part of the assessment summative is a
useful way of encouraging student use (Enjelvin, 2005; Meredith and Newton, 2004).
A likely reason that computer aided assessment (CAA) improves outcomes is that it offers students timely feedback to support their learning. Students often request feedback, and so CAA implementations typically show very high levels of student satisfaction.
“Computer aided assessment is definitely a good revision and learning tool. It
shows where you are going wrong and teaches you to think for yourself, and
also shows you the areas that need more revision. It makes things stick in your
head for future reference.”
(student quote in Morris & Walker, 2006, p.4)
Perhaps because of the relative success of computer aided assessment, in
comparison to say online discussions, there is scant literature on how learners
experience online assessments. Studies of the learner experience have shown that the spacing of tests does encourage students to work more consistently across a course and, as suspected, that students sometimes converge in computer labs to take online assessments (Concannon et al., 2005), which, depending on the status of the assessment, could be seen as peer learning or cheating.
Some practitioners have modified computer aided assessments to reduce the
opportunities for plagiarism. Ellem and McLaughlin (2005) made quizzes available for
one week and students were only allowed one attempt. They found that students
were sitting together to take quizzes, or passing on the questions to other people. In
their second year of implementation, they deterred plagiarism with fairly simple
changes: the quizzes were only available for an hour and presented randomised questions from a dataset. Russell and Bullen (2005) developed student-unique tutorial sheets for use in engineering courses. Students are then free to collaborate on the method, but their data are different and so their answers will be different.
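The general approach can be sketched as follows (an illustration under assumed details, not Russell and Bullen’s implementation): each student’s data set is generated deterministically from their student number, so the method is common to the cohort but the numbers, and therefore the answers, differ.

```python
import random

def tutorial_data(student_id: str, n_values: int = 5) -> list:
    """Generate a student-unique data set for a tutorial sheet.

    Seeding the generator with the student ID keeps the data reproducible
    for marking while ensuring each student receives different values.
    (Illustrative sketch only; the function name and ranges are assumptions.)
    """
    rng = random.Random(student_id)  # deterministic per student
    return [round(rng.uniform(10.0, 100.0), 1) for _ in range(n_values)]

# Two students share the same method but work with different numbers
print(tutorial_data("s1234"))
print(tutorial_data("s5678"))
```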
Clarke et al. (2005) report on a pilot of making multiple choice questions available
online or in print. Usage was purely formative and low. During focus groups, students
were asked about their reasons for using the quizzes. The authors state that “The
lack of repeated use of the quizzes was largely due to the difficulty students had in
perceiving how they would form a cemented part of their studies, as it did not fit into
the rigorous essay writing and examination process” (p.255). Again, we are seeing
here that it is students’ understanding of their learning, and their role within that, that
influences their behaviour.
Technology supported assessment does not have to use automated responses.
Scott (2004) describes a course redesign at University College Chester on an
introductory programming course. In response to poor student performance, the plan
was for students to submit their workshop tasks by email to the course tutors in the
early weeks of the course. Tutors made a commitment to provide feedback within 12
hours of receipt. Although this required some dedicated time to be set aside, the tutors noted that they did not have to arrange as many individual tutorials with students having difficulties as they usually would. As the course progressed, students and tutors
continued to communicate efficiently and conveniently via email.
There have been a number of recent reviews of computer aided assessment which
raised issues about its uptake and integration in higher education (Conole and
Warburton, 2005; Sim, Holifield and Brown, 2004) and the challenge of developing
standards compliant assessment content for delivery across different VLEs (Sclater
and MacDonald, 2004).
5.12 The student experience of emerging technologies
Blended e-learning will increasingly include the use of student owned technology
such as laptops, personal digital assistants, mobile phones, personal digital
audio/video players and tools that cut across course boundaries such as e-portfolios.
However, these are not yet mainstream.
Undergraduate students in the UK already report a high level of ownership and use of their own technology. Computer ownership has risen sharply from 52%
in 1999 (Breen et al., 2001) to a level close to saturation of over 90% at the
University of Dundee (Weyers et al, 2004). Of these, an increasing number are
laptops. Breen et al. noted that laptop ownership doubled between 1997 and 1999
and more recently Haywood et al. (2004) found that 56% of students owned laptops
compared to 35% choosing desktop machines. In addition 60% have an Internet
connection at home (Weyers et al., 2004) and 72% have used the Internet before
coming to university (Saunders and Pincas, 2004). Students are using these
computers for course related study. Haywood et al. (2004) asked students about their use of computers for university studies and report that 87% spend more than 3 hours a week doing so, with 21% spending 10 or more hours. The main uses of these computers are to research and create assignments (including accessing library resources) and to download lecture notes.
We could find very few studies about emerging technologies. Where we did find studies, they did not satisfy the inclusion criteria explained in Chapter 1, in that they were not yet able to tell us much about the student experience of embedded implementations. For example, e-
portfolios are an emerging technology which aims to provide a tool to encourage
students to view their learning within different contexts in a holistic way. Currently
there is very little information available on the undergraduate experience of e-
portfolios as part of blended experience in UK higher education, although there is
work emerging in FE and postgraduate contexts as part of the Enhancing Learner
Progression (ELP) project (Higgison et al, 2006). Tosh et al. (2005) report on data
collected from students at two Canadian institutions. 78% of students had not used
an e-portfolio before and needed to see examples and hear about its value before
they were convinced. Students had difficulty interpreting what was needed in the
reflective commentaries and felt they put in more time than was recognised in the
proportion of assessment marks allocated to the task. Much of this feedback is typical of reactions to almost any innovation and is likely to relate to the novelty of the situation. We await evaluations of embedded examples of the use of
emerging technologies such as from the ELP and MyWorld portfolio project (JISC,
2005c).
5.13 The holistic experience of blended e-learning
The third broad characterisation of blended e-learning introduced in Chapter 2 is a
holistic view of the student experience of blended e-learning where the focus is not
on a particular technology, like the VLE, or the experience of studying a particular
course. Studies reviewed in this final section have examined the experience from the
viewpoint of the learner and some have attempted to identify the impact of individual
differences on the student experience.
Box 5.3
The experience of blended learning at Sheffield Hallam University
Researchers at Sheffield Hallam University have been conducting detailed,
qualitative studies into the experiences of individual students who are taking four or
more courses which make use of the institutional VLE.
Nine students kept diaries and logs for two weeks and were then interviewed.
Maintaining a connection with the university is a recurring theme in the student
diaries and the virtual presence of the VLE facilitates the connection between the
student and the university when students are off campus for employment or
placements.
“I have contact with the university even though I am not there.” (p.249)
Students felt that connections with the institution could be damaged for a number of reasons, such as non-attentiveness of staff, timetabling changes or repetition of content in sessions. In the blended environments, students were able to exploit opportunities
to maintain the connection such as by interacting with peers or preparing for the next
session. The authors suggest that maintaining this connection could be an important
factor in student retention.
Aspden and Helm (2004)
It is worth noting that attempts to determine the relative importance of different
individual differences are enormously complex. Concannon et al. (2005) report that they were interested in acceptance or rejection of technology at the individual level but found it difficult to draw out themes because of a lack of consistency between, or even within, individuals. In a previous review of e-learning in post-16 sectors (Sharpe et al.,
2005; Sharpe and Benfield, 2005) we examined individual differences that might
impact on the learner experience and concluded by characterising those learners
who were effective in e-learning environments as those who could: cope with the
emotionality of the experience, reconstruct their approaches to time management
and develop the necessary e-learning skills.
The subsequent Learner Experience of E-learning (LEX) study interviewed 55
learners from all sectors of post-16 education and analysed the interviews using
interpretative phenomenological analysis, which allowed the themes to emerge solely from the students’ views and words. They suggested that, for those who will learn well in an e-learning context,
confidence in their ability to cope with life, learning and technology; the capacity
to network with others through a variety of communication channels; highly
effective time management skills; and most crucially, the skill to integrate and
balance learning with work, leisure and family commitments are key.
(Creanor et al., 2006b, p.9)
5.14 International students
Most of the literature on international students’ experiences of blended e-learning
comes from the USA and Australia and from studies which have tracked student
behaviour in online environments. These have observed that Chinese students posted fewer messages than Australian students, particularly when discussions moved on from socialisation to making intellectual contributions to the debate (Smith, Coldwell et al., 2005), and that Chinese students posted more conservative and less critical postings than American students (Thompson and Ku, 2005).
We were not able to identify any UK studies which set out to explore the experiences of undergraduate international students in blended e-learning environments. One of the SOLE case studies did, however, note that although overall participation in discussions was low, it was higher for students whose first language was not English:
“I think almost everything was easier with WebCT, because my mother
language is Finnish… so it is much easier to read everything and get new
information.” (student quote in O’Leary and Cai, 2004, p. 9)
Thompson and Ku (2005) give some insight into the experience in a well designed
study of Chinese graduate students in the USA. Although this study does not meet
our inclusion criteria, because of the paucity of similar UK studies of undergraduates
we have included it. The experiences of seven Chinese students taking a VLE based course while studying on campus were explored. They used multiple
data collection methods: interviews with the students and their tutors, focus groups,
and analysis of message transcripts. Language was mainly seen as a difficulty.
Some students found writing in English daunting and time consuming. They worried about spelling and grammatical errors and made intensive use of dictionaries and of friends as proof readers. Other students felt more comfortable expressing their opinions
online than face to face:
“In the online environment, I can say whatever I want. They don’t know who I
am or what I look like so I don’t feel so embarrassed.” (Thompson and Ku,
2005, p.41)
Thompson and Ku conclude that no two Chinese students’ experiences are the
same.
5.15 Gender
While there has been a great deal of interest in the different behaviour of males and
females in fully online discussions, many studies of mainstream blended e-learning
environments report no significant gender differences (Haywood et al., 2004;
Atkinson, 2004; Fletcher, 2005).
Gunn et al. (2002) provide a useful reflection on gender issues across a variety of blended courses and institutions. They report a closing gap over time in gender differences in computer ownership, confidence, and expectations of use. Indeed both this article and McSporran and Young (2001) suggest that women may outperform men in blended environments because they are better at scheduling their time. There was evidence that men had lower participation, not just in discussions, but also in quiz attempts, viewing of web pages and assignment submissions.
5.16 Disability
We saw in the previous section on VLE use that disabled students are positive about
the provision of supplementary course information. They report the benefits as being able to catch up when they unavoidably have to miss sessions, having materials available in flexible formats, and a reduced emphasis on note taking in class (University of
Eastonhall, 2004; Smart and Holyfield, 2004).
Some disabled students also experience the anonymity of online interactions positively, for example:
“Nobody online had to know that I had a disability, whereas in a classroom
environment it stands out like a sore thumb.”
(student quote in Creanor et al., 2006, p.5)
Some disabled students also point to difficulties with work in a blended environment.
As with other classifications of individual differences, these are highly specific and
contextualised. For example, Hughes and Lewis (2003) relate the experience of a dyslexic student trying to complete a timed online test:
“I had no problems with the concept, the problem that I did have was that it was
timed. You don’t try and put dyslexics under a timed environment because they
don’t perform and when you’ve only got a minute to answer a question,
although it’s only a tick box and you have to pick, say, one out of three, one out
of two, one out of five… with us, we’re trying to get our head around what
they’re on about for a start – well that’s 45 seconds gone … I don’t like it, it’s
unfair and it disadvantages people.” (p.12)
The ALERT project makes recommendations for supporting students with course
notes which would seem to be of benefit to many students, not just those with
disabilities (Newland et al, no date). This view of an inclusive approach that takes
into account the needs of all learners is growing in popularity (see for example
Draffan and Rainger, 2006; Lessner and DeCicco, 2006; Phipps and Kelly, 2006).
5.17 Learning style
Although there has been a good deal of research, particularly in the USA, attempting
to find links between students’ use of technology and their learning styles, recent
reviews of this research have raised serious questions about its validity and
relevance to practitioners (Coffield et al., 2004a, 2004b; Mayes, 2004).
What we can learn from studies of the student experience is that learners do interact
with blended courses in many different ways. For example, Brown and Leidholm
(2004) found differences between individuals in their choice of entry points to
learning materials. It is not clear that these reflect any predictable or stable
constructs. Indeed Concannon et al. (2005) remind us of the variation not just
between but also within learners. We suggest that our energy would be more usefully
focused on developing environments in which all learners are encouraged to learn
actively and deeply. As Goodyear et al. (2003) report, having found no differences in
the networked learning experiences which could be attributed to conceptions of or
approaches to studying:
A practical implication of this study is that it is reasonable to expect all students
to have positive experiences on well-designed and well-managed networked
learning courses – not just those students with more sophisticated conceptions
of learning or deep approaches to study. (p.17)
5.18 Experience of and attitudes towards technology
There is evidence that both prior experience and attitudes influence students’
experiences of blended learning.
In terms of computing experience, the usual finding is that this is an advantage.
Students have commented that slow typing speeds disadvantaged them while taking online quizzes (Davies et al., 2005) and in synchronous chats (Carr et al., 2004), and that IT skills affected their use of Computer Aided Learning (CAL) (Atkinson, 2004). Where IT skills are already high, these differences become less influential; here Concannon et al. (2005) report that attitudes are more important than experience.
Accounting students who held positive attitudes towards computers were more able
to locate and make use of online resources. It is highly probable that experience and
attitude are linked. In a rare longitudinal study, Arbaugh (2004) reports that student
attitudes towards online working became more positive as they took more online courses.
In the future, student experiences are likely to be greatly influenced by the prior
exposure to technology of the ‘net generation’ and there is work underway to explore
the impact of this on learners and their learning (see for example Oblinger and
Oblinger, 2005). A quote from an undergraduate Business School learner from the
LEX study illustrates the ubiquity and integration of technology for some students.
“it’s the first thing I turn on in the morning before I even wake up … I think in
future people can’t cope without their laptops. My main use of it is I guess social
networking. It would be MySpace and Messenger and e-mail things like that..”
(Creanor et al., 2006b, p.13)
The use of technology for social networking is already being observed in school children who will become the higher education students of the future (Kent and Facer, 2004). The rise in the use of computers at home for social networking is likely to deepen any digital divide. Kirkwood and Price (2005) suggest that it will be increasingly important for institutions to monitor, rather than assume, their incoming students’ prior experience, particularly where they have a rationale concerned with widening participation.
6 Conclusions and recommendations
Technology use has now become so prevalent in the lives of undergraduates that
decisions about whether to use technology or not no longer seem relevant. The
pertinent questions now are around how we should use, and evaluate the use of,
technology. This illustrates a maturing of the design and application of blended e-
learning. In previous chapters we have explored some of the choices being faced by
higher education staff in the discussion of the dimensions of blended learning,
reasons for blending technology with face to face teaching, monitoring and evaluating
technology use and enhancing the learner experience.
As staff are called upon to build blended e-learning environments, design activities
and tutor and support students through them, it will become increasingly important
that we have evidence on which to draw as we make these decisions. There are
already recommendations that will inform our decision making such as the work on
the affordances of media (Laurillard, 1993), the burgeoning field of design for
learning (see Beetham and Sharpe, forthcoming) and strategies for online tutors
(Salmon, 2002; 2004). We hope that the review of learner experiences in Chapter 5
will complement these and encourage higher education institutions to move from an
‘inside out’ approach where those on the inside know what is best, to an ‘outside in’
position where we research and evaluate students’ perceptions and attitudes to
learning (Lea, Stephenson and Troy, 2003) and use them to inform our decision
making.
As Garrison and Anderson advise:
It is imperative that those involved in higher education come to grips with the
reality that technology is an increasingly important element of the educational
environment and represents opportunities and constraints for interaction that
can significantly influence students’ perceptions. (Garrison and Anderson,
2003, p. 18)
We have argued throughout this review that we need to understand the impact technology use is having on the learner experience in order to make these decisions about the application of technology and to inform the development of new pedagogies.
6.1 Critical success factors for blended e-learning
The final research question concerned the success factors for blended e-learning.
Institutions were asked to identify any success factors at the end of their interview
(see Appendix 3). Although some interviewees interpreted this question as success
in embedding blended e-learning into their institution, this is not our interest here and
it has been investigated elsewhere (e.g. Beetham, 2001; Bricheno et al, 2004;
Normand and Littlejohn, 2006). Our interest is in the factors that may lead blended e-learning implementations to be effective at the level of the institution, the course, its tutors and, importantly, the learners. The following factors emerged as being critical to the success of blended e-learning in practice.
6.2 Use the term ‘blended learning’
We found in Chapter 2 that although the term 'blended learning' may be difficult to define, it is gaining in acceptance. Here we go further and suggest that using the term may in itself be important to the success of blended implementations of e-learning. The lack of a consistent definition may be part of the term's strength, as it allows staff to negotiate their own meaning for it within the context of their institution, course or student group. In terms of the dimensions of blended e-learning, protecting face to face teaching, designing for active learning and responding to institutional context are all features of effective implementations.
6.3 Work with and within your context
In Chapter 3 we found that institutions that we had identified as successful
implementers of blended e-learning had highly contextualised and specific rationales
for their adoption. The issue of contextualisation was also identified in the interviews. Deepshire identified a "school-based adoption strategy rather than a blanket central strategy" as a critical success factor and went on to clarify the importance of working in a way that is appropriate for each institution's context:
“it does seem sometimes you look at other universities with a much stronger
top down influence maybe you could get more done. But having said that you
live with the institution you’re in, you have to understand that culture and work
with it as best you can” (Deepshire 2)
It may be that this is related to the well-established finding of the importance of visionary people in institutional change. Charlier et al. (2004), synthesising the learning from institutions involved in the EQUEL project, noted that visionary people were crucial at the inception stages of innovative processes in higher education, but we do not yet know enough about the visions of these people. Perhaps they are highly contextualised visions. Similarly, we found that many course-specific rationales were contextualised by practical teaching problems driven by large groups, the characteristics of the student group and/or the demands of professional bodies.
6.4 Use blended learning as a driver for transformative course
redesign
The importance of transformative course level designs was identified in Chapter 2 as
one of three characterisations of blended e-learning. Throughout the review, studies
repeatedly identified engaging in course design or redesign as critical to their
success. This was particularly notable where studies described a blended course
which had been developed in response to a real and relevant problem at the course
level. Within the discipline of e-learning, the importance of designing teaching
strategies appropriate to technologies has long been recognised (e.g. Ehrmann,
2003). It is reassuring to see this emphasis on course design being expressed by
practitioners based on their real experiences.
Most helpful are those studies which have described what their course design
actually involved. From these the valuable features of the course design appear to
be:
• Undertaking an analysis of the successful and less successful features of the current course, including student feedback (Boyle, 2005). For example, Morris and Walker (2006) was one of the few studies which engaged in an honest appraisal of the current course, identifying problems and targeting their use of technology in response to this (see Box 3.3).
• Undertaking the design as a team, ensuring that staff have the time to properly integrate face to face and online material (Aycock, Garnham and Kaleta, 2002), such as by allowing staff to develop only part of a module in depth (Boyle et al., 2003).
• Designs which make explicit their underlying principles. As discussed in Chapter 3, these might be based on established pedagogical principles, e.g. being sensitive to the needs of learners as individuals (Graff, 2003), active learning (Hinterberger, Fassler and Bauer-Messmer, 2004), repetition and elaboration (Boyle et al., 2003), the requirement for prompt and frequent feedback (Morris and Walker, 2006), or design principles related to the course outcomes, e.g. 'attention to detail' (Stubbs, Martin and Endlar, 2006).
• Developing the course iteratively over a number of years. We started the review looking for studies which were not reports of initial innovations (see Chapter 1). The studies which discussed course design as a success factor suggest that as many as three or four iterations of course design, development and implementation may be needed to complete the transition from a traditional to a blended e-learning course (Trevitt, 2005; Danchak and Huguet, 2004; Ellem and McLaughlin, 2005).
6.5 Help students develop their conceptions of the learning
process
In Chapter 5 a common theme emerged from the studies of the learner experience: it seems to be important that students understand the role of technology in their learning and the implications for their study strategies and engagement in learning activities. We have seen that students vary in how they conceive of their own, and their tutors', roles in the learning process, and this may be critical to their success. In order to support students through this, it is vital that we are consistent and transparent in communicating our expectations to students, whether this is about, for instance, revised attendance patterns or how to engage in purposeful dialogue in asynchronous discussions.
6.6 Disseminate and communicate results of evaluations
The need to promote and disseminate results from evaluations was identified as a
crucial aspect of institutional monitoring strategies in Chapter 4. When asked about
critical success factors, three institutions suggested issues around communication.
The implementation of blended e-learning at Kilderhill has had an emphasis on
communication and collaboration and they have worked to promote and sustain a
supportive community.
“A success factor is that we do have a nurturing community who do help each
other, whether it’s between academics, and academics and support staff, and
they all trust each other.” (Kilderhill 3)
Kilderhill highlighted the importance of maintaining communications between schools
in a devolved institution and described communication in terms of networks of
practitioners.
“it’s about making sure that the right people are talking to each other obviously. And it’s absolutely crucial if we’re going to be successful. And just the fact that people are communicating in this way is for us a very important factor of success” (Kilderhill 3)
At other institutions, communication was described as being largely dependent on a
single individual:
“I think for me it's the uptake in two schools and being able to report back on
that through university committees, things like learning teaching and
assessment conference, and staff seeing where some of .. the other schools
were going” (Eastonhall 1)
“From my perspective I think … it’s maybe just where I sit, it’s actually being
in a position where you can co-ordinate all of these areas…….being able to
see the connections between PDP and BlackBoard and their electronic
portfolios, and diagnostic assessment. And that’s partly happened only
because I’ve only recently moved into this role from various other roles
around the university, and then having the time to be able to link these things
together.” (Metroville 1)
6.7 Recommendations for policy and practice
It is clear from the uptake of technology by institutions, the rise in the use of the term
blended learning and the number of evaluative studies identified in this review, that
many institutions and practitioners are attempting to engage with blended learning
and are doing so successfully.
The following recommendations highlight issues which institutions in our sample, and published practitioners and researchers, have themselves identified as being important to the institution:
• Terminology is value-laden and worthy of negotiation at the institutional level.
• Institutional rationales for blended e-learning should be contextualised and specific.
• There is a need for more systematic institutional monitoring and evaluation of blended e-learning.
• Establish institutional systems for dissemination of good practice, such as internal conferences and journals. Such systems need to be complemented by less formal, social dissemination practices. Institutions should attempt to establish and nurture communities of e-learning developers and practitioners to act as social repositories and disseminators of case studies of institutional practice.
• Staff need support in designing blended courses for diverse groups, with a focus on whole-course redesign to embed the innovations of individuals.
The purpose of eliciting and highlighting the learner experience of blended e-learning is to improve that experience for our future learners. It is clear that students need support in learning effectively in technology-rich blended courses. This might mean, for example, advice on managing their time during courses that mix timetabled sessions and independent study, and/or developing skills to access and evaluate online resources. Much of this will be specific to the particular blend or use of technology. There are, however, some recommendations arising from the studies of learner experience which are likely to be more generally applicable to practitioners at the course level:
• The standardisation of courses supplemented by online resources, including consistent expectations about how students will be contacted, what materials are provided in print, when materials will be made available, in what formats, etc.
• The provision of course notes prior to lectures and in formats which students can customise, with more advice and guidance on how to personalise electronic materials.
• Clear expectations and guidance for students in structuring their study time and making good choices about attendance.
6.8 Recommendations for future research
We have also made recommendations for future research in the following areas:
• Considering the degree to which students are requesting and making use of flexible access, it would be worthwhile to conduct further research into the reality of the student experience of flexible studying.
• Given the positive student reaction to lecture notes, and the role of student feedback in quality assurance, it is likely that they are here to stay. It would be beneficial to conduct research into the impact of different forms of notes and the times when they best support learning.
• There is very little research which follows the learner experience of blended e-learning over time, such as over a whole degree programme.
• More purposive sampling is needed of specific groups which are currently under-represented in the literature, including disabled, mature, working, part-time and international students.
• We suggested that student conceptions of the learning process and their role within it could be crucial to their experience, and we suggest further research in this area.
References
Alexander, S. (1999) An evaluation of innovative projects involving communication
and information technology in higher education. Higher Education Research &
Development. 18 (2), 173-183.
Allison, J. (2004) Continuity and community: Supporting placement learning through
a virtual learning environment [online], ASET Annual Conference, 7-9
September 2004. Available from
http://www.ncl.ac.uk/cad/placementVLE/resources/continuityCommunity.doc
[25 August 2006].
Alvesson, M. (2002) Understanding organizational culture. London: Sage.
Arbaugh, J. B. (2004) Learning to learn online: A study of perceptual changes
between multiple online course experiences. Internet and Higher Education.
7, 169-182.
Aspden, L. & Helm, P. (2004) Making the connection in a blended learning
environment. Educational Media International. 41 (3), 245-252.
Atkinson, S. (2004) Student behaviors in the context of computer aided learning in
design and technology teacher training. Journal of Technology Studies. 30
(4), 54-65.
Ausburn, L. J. (2004) Course design elements most valued by adult learners in
blended online education environments: An American perspective.
Educational Media International. 41 (4), 327-337.
Aycock, A., Garnham, C. & Kaleta, R. (2002) Lessons learned from the hybrid course
project [online]. Teaching with Technology Today, 8. Available from
http://www.uwsa.edu/ttt/articles/garnham2.htm [25
August 2006].
Ball, S. & Campy, D. (eds.) (2002) Accessible curricula: Good practice for all, Cardiff:
UWIC Press.
Banks, D. (2006) Audience response systems in higher education: Applications and
cases. London: Information Science Publishing.
Beetham, H. (2001) Steps to success: Institutional embedding of learning technologies (seven strategies for development). EFFECTS/Net Culture Seminar.
Beetham, H. (2005a). e-Portfolios in post-16 learning in the UK: developments,
issues and opportunities. [online]. JISC. Available from
www.jisc.ac.uk/uploaded_documents/eportfolio_ped.doc [30 May 2006]
Beetham, H. (2005b) E-learning research: Emerging issues? ALT-J, Research in
Learning Technology. 13 (1), 81-89.
Beetham, H. & Sharpe, R. (eds) (forthcoming) Rethinking pedagogy for the digital
age: Designing and delivering e-learning. Oxford, RoutledgeFalmer.
Biggs, J. (2003) Teaching for quality learning at university. 2nd ed. Buckingham: Society for Research into Higher Education & the Open University Press.
Bogdan, R. C. & Biklen, S. K. (1982) Qualitative research for education: an
introduction to theory and methods. Boston: Allyn & Bacon.
Bonk, C. (2006) Blended learning: situations and solutions. Presentation to Oxford
Brookes University [online]. Available from
http://www.trainingshare.com/pdfs/SFX1D28.pdf [25 August 2006].
Bonk, C., Kim, K.-J. & Zeng, T. (2006) Future directions of blended learning in higher
education and workplace settings. In Bonk, C. & Graham, C. R. (eds) Handbook of Blended Learning: Global Perspectives, Local Designs. San Francisco: Pfeiffer.
Boyle, T. (2005) A dynamic, systematic method for developing blended learning.
Education, Communication and Information. 5 (3), 221-232.
Boyle, T. & Cook, J. (2001) Online interactivity: Best practice based on two case
studies. ALT-J. 9 (1), 94-102.
Boyle, J. T. & Nicol, D. J. (2003) Using classroom communication systems to support
interaction and discussion in large class settings. ALT-J. 11 (3), 43-57.
Boyle, T., Bradley, C., Chalk, P., Jones, R. & Pickard, P. (2003) Using blended
learning to improve student success rates in learning to program. Journal of
Educational Media. 28 (2-3), 165-178.
Breen, R., Lindsay, R., Jenkins, A. & Smith, P. (2001) The role of information and
communications technologies in a university learning environment. Studies in
Higher Education. 26 (1), 95-114.
Bricheno, P., Higgison, C. & Weedon, E. (2004) The Impact of Networked Learning
on Education Institutions [online]. Bradford: UHI Millenium Institute & Bradford
University - INLEI Project. Available from http://www.sfeuprojects.org.uk/inlei/
[2 February 2006]
Britain, S. & Liber, O. (1999). A framework for pedagogical evaluation of virtual
learning environments [online]. JISC. Available from
http://www.jisc.ac.uk/jtap/htm/jtap-041.html [23 Dec 2002]
Brown, B. W. & Liedholm, C. E. (2004) Student preferences in using online learning
resources. Quality in Higher Education. 11 (1), 56-67.
Browne, T. & Jenkins, M. (2003) VLE Surveys: a longitudinal perspective between
March 2001 and March 2003 for HE in the UK [online]. UCISA. Available from
http://www.ucisa.ac.uk/groups/tlig/vle/index_html [16 August 2006]
Carr, T., Cox, G., Eden, A. & Hanslo, M. (2004) From peripheral to full participation in
a blended trade bargaining situation. British Journal of Educational
Technology. 35 (2), 197-211.
Caruso, J. B. (2004). Key findings: ECAR study of students and information technology,
2004: Convenience, connection, and control [online]. Educause Center for
Applied Research (ECAR). Available from
http://www.educause.edu/ir/library/pdf/ecar_so/ers/ERS0405/ecm0405.pdf
[31 Jan 2006]
Catley, P. (2004) One lecturer's experience of blending e-learning with traditional
teaching or how to improve retention and progression by engaging students
[online]. Brookes eJournal of Learning and Teaching. 1 (2), Available from
http://www.brookes.ac.uk/publications/bejlt/volume1issue2/academic/catley05
_1.html [25 August 2006].
Chapelle, C. (2004) Learning through online communication: findings and
implications from second language research. University of York Seminar
Series on Researching Dialogue and Communities of Enquiry in E-Learning in
Higher Education.
Charlier, B., Platteaux, H., Bouvy, T., Esnault, L., Lebrun, M., Moura, A., Pirotte, S.,
Denis, B. & Verday, N. (2004). Stories about innovative processes in higher
education: Some success factors [online]. Available from
http://equel.net/mod/resource/view.php?id=166 [25 August 2006].
Chelimsky, E. & Shadish, W. R. (eds.) (1997) Evaluation for the 21st century: A
handbook. Thousand Oaks, CA: Sage.
Clark, I. & James, P. (2005) Blended learning: An approach to delivering science
courses on-line [online]. UniServe Science Blended Learning Symposium
Proceedings. Available from
http://science.uniserve.edu.au/pubs/procs/wshop10/index.html [25 August
2006].
Clarke, S., Lindsay, K., McKenna, C. & New, S. (2004) Inquire: A case study
evaluating the potential of online MCQ tests in a discursive subject. ALT-J,
Research in Learning Technology. 12 (3), 249-260.
Clouder, L. & Deepwell, F. (2004) Reflections on unexpected outcomes: Learning
from student collaboration in an online discussion forum [online]. Networked
Learning Conference. Available from
http://www.networkedlearningconference.org.uk/past/nlc2004/proceedings/ind
ividual_papers/clouderanddeepwell.htm [25 August 2006].
Codone, S. (2004) Reducing the distance: A study of course websites as a means to
create a total learning space in traditional courses. IEEE Transactions on
Professional Communication. 47 (3), 190-199.
Coffield, F., Moseley, D., Hall, E. & Ecclestone, K. (2004a). Learning styles and
pedagogy in post-16 learning: A systematic and critical review [online].
London: Learning and Skills Research Centre. Available from
http://www.lsda.org.uk/files/pdf/1543.pdf [31 May 2006]
Coffield, F., Moseley, D., Hall, E. & Ecclestone, K. (2004b). Should we be using
learning styles? What research has to say to practice [online]. London:
Learning and Skills Research Centre. Available from
https://www.lsneducation.org.uk/user/order.aspx?code=041540&cookie_test=
true [31 May 2006]
Cohen, L., Manion, L. & Morrison, K. (2000) Research methods in education.
London: Routledge Falmer.
Concannon, F., Flynn, A. & Campbell, M. (2005) What campus-based students think
about the quality and benefits of e-learning. British Journal of Educational
Technology. 36 (3), 501.
Condron, F. (2001) Using electronic resources to support dialogue in undergraduate
small-group teaching: The Aster project. ALT-J. 9 (2), 39-46.
Conole, G. & Warburton, B. (2005) A review of computer-assisted assessment. ALT-
J, Research in Learning Technology. 13 (1), 17-31.
Cook, K., Cohen, A. & Owsten, R. (2003). If you build it, will they come? Students
use of and attitudes towards distributed learning enhancements in an
introductory lecture course. [online]. Institute for Research on Learning
Technologies. York University, Canada. Available from
www.yorku.ca/irlt/reports/techreport2001.htm [30 March 2006].
Cox, G., Carr, T. & Hall, M. (2004) Evaluating the use of synchronous communication
in two blended courses. Journal of Computer Assisted Learning. 20 (3), 183-
193.
Creanor, L., Gowan, D., Howalls, C. & Trinder, K. (2006a) The learner's voice: A
focus on the learner experience [online]. Networked Learning Conference.
Lancaster, UK. Available from
http://networkedlearningconference.org.uk/abstracts/pdfs/P24%20Creanor.pd
f [25 August 2006].
Creanor, L., Gowan, D., Howalls, C. & Trinder, K. (2006b) LEX final project report.
Cullen, J., Hadjivassiliou, K., Hamilton, E., Kelleher, J., Sommerlad, E. and Stern, E.
(2002). Review of current pedagogic research and practice in the fields of
post-compulsory education and lifelong learning [online]. The Tavistock
Institute, TLRP, ESRC. Available from
http://www.tlrp.org/pub/acadpub/Tavistockreport.pdf [30 March 2006]
Danchak, M. & Huguet, M. P. (2004) Designing for the changing role of the instructor
in blended learning. IEEE Transactions On Professional Communication, 47
(3).
Davies, P. (2004) Don't write, just mark: The validity of assessing student ability via
their computerized peer-marking of an essay rather than their creation of an
essay. ALT-J, Research in Learning Technology. 12 (3), 261 - 277.
Davies, J. & Graff, M. (2005) Performance in e-learning: Online participation and
student grades. British Journal of Educational Technology. 36 (4), 657-663.
Davies, A., Ramsay, J., Lindfield, H. & Couperthwaite, J. (2005) A blended approach
to learning: added value and lesson learnt from students' use of computer-
based materials for neurological analysis. British Journal of Educational
Technology. 36 (5), 839-849.
Denzin, N. K. & Lincoln, Y. S. (2003) Introduction: The discipline and practice of qualitative research. In Strategies of qualitative inquiry. 2nd ed. Thousand Oaks, CA: Sage.
Derntl, M. & Motschnig-Pitrik, R. (2005) The role of structure, patterns, and people in
blended learning. The Internet and Higher Education. 8 (2), 111-130.
Dickinson, J. (2005) Enabling e-learning in higher education. Newcastle Business
School.
Draffan, E. A. & Rainger, P. (2006) A model for the identification of challenges to
blended learning. ALT-J, Research in Learning Technology. 14 (1), 55-67.
Draper, S. W. & Brown, M. I. (2004) Increasing interactivity in lectures using an
electronic voting system. Journal of Computer Assisted Learning. 20 (2), 81-
94.
Driscoll, M. (2002) Blended learning: Let's get beyond the hype [online]. LTI
Magazine. Available from
http://elearningmag.com/ltimagazine/article/articleDetail.jsp?id=11755 [31
May 2006]
Dron, J., Siedel, C. & Litten, G. (2004) Transactional distance in a blended learning
environment. ALT-J, Research in Learning Technology. 12 (2), 163 - 174.
Durkin, C. (2003). University of Bath Blackboard pilot excerpts [online]. Available
from http://www.bath.ac.uk/e-learning/vlecmap/Resources/bbstviews.htm [25
August 2006].
Ehrmann, S. (2003). Asking the right question: What does research tell us about
technology and higher learning [online]. learner.org. Available from
www.learner.org/edtech/rscheval/rightquestion.html
Eklund, J., Kay, M. & Lynch, H. (2003) E-learning: Emerging issues and key trends.
Australian National Training Authority.
Ellaway, R., Dewhurst, D. & Mcleod, H. (2004) Evaluating a virtual learning
environment in the context of its community practice. ALT-J, Research in
Learning Technology. 12 (2), 125 - 145.
Ellem, G. K. & Mclaughlin, E. A. (2005) Tales from the coalface: From tragedy to
triumph in a blended learning approach to the teaching of 1st year biology
[online]. UniServe Science Blended Learning Symposium Proceedings.
University of Sydney, Australia. Available from
http://science.uniserve.edu.au/workshop/2005/index.html [25 August 2006].
Ellis, R. A., Marcus, G. & Taylor, R. (2005) Learning through inquiry: Student
difficulties with online course-based material. Journal of Computer Assisted
Learning. 21, 239-252.
Enjelvin, G. (2005) Investigating VAT (value-adding technologies) and e-
effectiveness in a French department. Journal of Further and Higher
Education. 29 (2), 155-167.
Felix, U. (2005) Analysing recent CALL effectiveness research - towards a common
agenda. Computer Assisted Language Learning. 18 (1-2), 1-32.
Fletcher, K. M. M. (2005) Self-efficacy as an evaluation measure for programs in
support of online learning literacies for undergraduates. Internet and Higher
Education. 8 (2005), 307-322.
Flynn, A., Concannon, F. & Ni Bheachain, C. (2005) Undergraduate students'
perceptions of technology supported learning: The case of an accounting
class. International Journal on E-Learning. 4 (4), 427-444.
Garrison, D. R. & Anderson, T. (2003) E-learning in the 21st century: A framework
for research and practice. New York: RoutledgeFalmer.
Georgetown University (2002) Visible Knowledge Project [online]. Washington, D.C.:
Georgetown University. Available from http://crossroads.georgetown.edu/vkp/
[25 August 2006].
Goodyear, P., Asensio, M., Jones, C., Hodgson, V. & Steeples, C. (2003)
Relationships between conceptions of learning, approaches to study and
students’ judgements about the value of their experiences of networked
learning. ALT-J. 11 (1), 17-27.
Grabe, M. & Christopherson, K. (2005) Evaluating the advantages and
disadvantages of providing lecture notes: The role of internet technology as a
delivery system and research tool. Internet and Higher Education. 8, 291-298.
Graff, M. (2003) Individual differences in sense of classroom community in a blended
learning environment. Journal of Educational Media. 28 (2-3),
Greeno, J. G., Collins, A. M. & Resnick, L. (1996) Cognition and learning. In Berliner,
D. C. & Calfee, R. C. (eds) Handbook of educational psychology. New York:
Simon and Schuster Macmillan.
Grund, S., Grote, G. & Gerber, M. (2004) What do students actually use in a blended
learning course? In Kommers, P. & Richards, G. (eds) World Conference on
Educational Multimedia, Hypermedia and Telecommunications. Lugano,
Switzerland.
Gunn, C., French, S., Mcleod, H., Mcsporran, M. & Conole, G. (2002) Gender issues
in computer-supported learning. ALT-J. 10 (1), 32-44.
Hanley, L. (2002). Educational technology and academic labor [online]. Workplace
5.1. Available from http://www.cust.educ.ubc.ca/workplace/issue5p1/5p1.html
[30 May 2006]
Hart, C. (1998) Doing a Literature Review: Releasing the Social Science Research
Imagination. London: Sage.
Haywood, J., Macleod, H., Haywood, D., Mogey, N. & Alexander, W. (2004). Student
views of e-learning: A survey of university of Edinburgh WEBCT users 2004
[online]. Edinburgh: University of Edinburgh. Available from
http://www.flp.ed.ac.uk/webct/surveyresults.pdf [16 May 2006]
HEA (2006) Higher Education Academy e-Learning Benchmarking Exercise [online],
Available from http://www.heacademy.ac.uk/benchmarking.htm [25 August
2006].
HEFCE (2005) HEFCE strategy for e-learning [online]. HEFCE. Available from
http://www.hefce.ac.uk/pubs/hefce/2005/05_12/ [16 August 2006]
Heinze, A. & Proctor, C. (2004) Communication - a challenge and an enabler for
facilitating blended learning community. Internal report, University of Salford.
Hibberd, S., Litton, C. & Chambers, C. (2006) MELEES - reflective overview on use
of a VLE in supporting student learning. HELM Conference. Loughborough,
HE Academy Engineering Subject Centre.
Higgins, S. (2003) Does ICT improve learning and teaching in schools? British
Educational Research Association.
Higgison, C., Currant, N., Murray, C., Pellow, A., Taylor, J., Hairsine, J., Hennessy,
S., Raby, S. & Sykes, R. (2006) Enhancing learner progression:
Understanding the learners' perspectives [online]. Networked Learning
Conference. University of Lancaster. Available from
http://www.brad.ac.uk/acad/tqeg/resources/elp/JISC_ELP/media/NLN2006-
Symposium73Intro.pdf#search=%22higgison%20elp%22 [25 August 2006].
Hinterberger, H., Fässler, L. & Bauer-Messmer, B. (2004) From hybrid courses to blended learning: A case study. ICNEE, 27-30 September 2004, Neuchâtel, Switzerland.
Holbrook, N. J. & Devonshire, E. (2005) Simulating scientific thinking online: An
example of research-led teaching. Higher Education Research &
Development. 24 (3), 201-213.
Hughes, M. & Daykin, N. (2002) Towards constructivism: Investigating students'
perceptions and learning as a result of using an online environment.
Innovations in Education and Teaching International. 39 (3), 217-224.
Hughes, G. & Lewis, L. (2003) Who are successful online learners? Exploring the
different learner identities produced in virtual learning environments. In Cook,
J. & Mcconnell, D. (eds) Communities of Practice. Research Proceedings of
the 10th Association for Learning Technology Conference (ALT-C 2003). Held
8 - 10 September 2003. Sheffield: The University of Sheffield and Sheffield
Hallam University, UK.
JISC (2003) Managed learning environment activity in further and higher education in
the UK [online]. Bristol: Joint Information Systems Committee. Available from
http://www.jisc.ac.uk/uploaded_documents/mle-study-final-report.pdf [22 April
2006]
JISC (2005a) Study of environments to support e-learning in UK further and higher
education: A supporting study for the Joint Information Systems Committee
[online]. Joint Information Systems Committee (JISC): Bristol. Available from
http://www.jisc.ac.uk/uploaded_documents/e-learning_survey_2005.pdf [2
November 2005]
JISC (2005b) Planning and evaluating effective practice with e-learning: Web site to
support the JISC- Higher Education Academy regional workshops,
September 2005 - February 2006 [online]. Bristol: Joint Information System
Committee. Available from http://www.jisc.ac.uk/elp_practice.html [9 May
2006]
JISC (2005c) myWORLD - Wider Opportunities for Reflection, Learning and
Development [online]. JISC. Available from
http://www.jisc.ac.uk/index.cfm?name=myworld
[25 August 2006].
Jenkins, M., Browne, T. & Armitage, S. (2001). Management and implementation of
virtual learning environments: A UCISA funded survey [online]. Available from
http://www.ucisa.ac.uk/groups/tlig/vle/ [31 May 2006]
Jones, N. & Fitzgibbons, K. (2002) For better or worse? The marriage of key skills
developments and on-line learning. Journal of Vocational Education and
Training. 54 (3), 395-411.
Judson, E. & Sawada, D. (2002) Learning from past and present: Electronic
response systems in college lecture halls. Journal of Computers in
Mathematics and Science Teaching. 21 (2), 167-181.
Kear, K. (2004) Peer learning using asynchronous discussion systems in distance
education. Open Learning. 19 (2), 151-164.
Kember, D. (2003) To control or not to control: The question of whether experimental
designs are appropriate for evaluating teaching innovations in higher
education. Assessment and Evaluation in Higher Education. 28 (1), 89-101.
Kennedy, G. E. & Cutts, Q. I. (2005) The association between students’ use of an
electronic voting system and their learning outcomes. Journal of Computer
Assisted Learning. 21 (4), 260-268.
Kent, N. & Facer, K. (2004) Different worlds? A comparison of young people's home
and school ict use. Journal of Computer Assisted Learning. 20 (6), 440-455.
Kirkwood, A. & Price, L. (2005) Learners and learning in the twenty-first century:
What do we know about students’ attitudes towards and experiences of
information and communication technologies that will help us design courses?
Studies in Higher Education. 30 (3),
Koper, R., Pannakeet, K., Hendriks, M. & Hummel, H. (2004) Building communities
for the exchange of learning objects: Theoretical foundations and
requirements. ALT-J, Research in Learning Technology. 12 (1), 21 - 35.
Lave, J. & Wenger, E. (1991) Situated learning: legitimate peripheral participation.
Cambridge: Cambridge University Press.
Laurillard, D. (1993) Rethinking university teaching: A framework for the effective use of educational technology. New York: Routledge.
Lea, S. J., Stephenson, D. & Troy, J. (2003) Higher education students’ attitudes to
student-centred learning: Beyond ‘educational bulimia’? Studies in Higher
Education. 28 (3), 321 - 334.
Lessner, E. & De Cicco, E. (2006) Don't disable the learner: All technology is, or
should be, assistive. JISC Innovating e-Learning 2006. Online conference,
JISC.
Littlejohn, A. (ed.) (2003) Reusing online resources: A sustainable approach to e-
learning, London: Routledge Falmer.
Lockyer, L., Patterson, J. & Harper, B. (1999) Measuring effectiveness of health
education in a web-based learning environment: A preliminary report. Higher
Education Research & Development. 18 (2), 233-246.
Lockyer, L., Patterson, J. & Harper, B. (2001) ICT in higher education: Evaluating
outcomes for health education. Journal of Computer Assisted Learning. 17
(3), 275-283.
Lugeba, J. & Williams, S. (2004) Patterns of use when interacting with a managed
learning environment. In Kommers, P. & Richards, G. (eds.) World
Conference on Educational Multimedia, Hypermedia And
Telecommunications. Lugano, Switzerland.
Maharg, P. (2001) Negotiating the web: Legal skills learning in a virtual community.
International Review of Law, Computers & Technology. 15 (3), 345 - 360.
Manchester School of Engineering (2004) Problem based learning [online].
Manchester: Manchester University. Available from
http://www.eng.man.ac.uk/engineering/pbl.htm [31 May 2006]
Mason, R. (1998) Models of online courses [online]. ALN Magazine. 2 (2), Available
from http://www.sloan-c.org/publications/magazine/v2n2/mason.asp [25
August 2006].
Mason, R. (2001) E-learning: What have we learnt? In Rust, C. (ed.) Improving
Student Learning Using Learning Technology, proceedings of the 2001 9th
International Improving Student Learning Symposium. Edinburgh, Oxford
Centre for Staff and Learning Development.
Mason, R. & Kaye, A. (1989) Mindweave: Communication, computers and distance
education. Oxford: Pergamon Press.
Matheos, K., Daniel, B. & McCalla, G. (2005) Dimensions for blended learning
technology: Learners' perspectives. Journal of Learning Design 1(1), 56-76.
Mayes, T. (2004). JISC e-Learning Models Desk Study Stage 2: Learner-centred
pedagogy: Individual differences between learners [online]. JISC. Available
from
http://www.jisc.ac.uk/uploaded_documents/Stage%202%20Learning%20Style
s%20(Version%201).pdf [24 May 2005]
Mayes, T. & de Freitas, S. (2004). Review of e-learning theories, frameworks and
models. Stage 2 of the e-learning models desk study [online]. JISC. Available
from
http://www.jisc.ac.uk/uploaded_documents/Stage%202%20Learning%20Mod
els%20(Version%201).pdf [30 March 2006]
McConnell, D. (2000) Implementing computer supported co-operative learning. 2nd ed. London: Kogan Page.
McConnell, D. (2005) Examining the dynamics of networked e-learning groups and
communities. Studies in Higher Education. 30 (1), 25-42.
McGugan, S. & Peacock, S. (2005) Learning technology and its potential to support
student placements in hospitality and tourism education. Journal of
Hospitality, Leisure and Tourism Education. 4 (1), 15-29.
McHugo, C., Johnson, K. & Hall, T. (2004) An overview of learner's experiences
using blended learning techniques in engineering applications [online]. Paper
presented at EdTech Conference. Available from
http://www.ilta.net/EdTech2004/papers/mchugo.doc [25 August 2006].
McMullin, B. & Munro, M. (2003). Access to lecture notes: Review and best practice
[online]. Available from http://odtl.dcu.ie/wp/2004/odtl-2004-00.html [25
August 2006].
McShane, K. (2005). Issues in blended teaching and learning [online]. University of
Edinburgh. Available from
http://www.elearn.malts.ed.ac.uk/issues/news/ebreak13.phtml [12 October
2005]
McSporran, M. & Young, S. (2001) Does gender matter in online learning? ALT-J. 9
(2), 3-15.
Meredith, S. & Newton, B. (2004) Models of e-learning: Technology promise versus
learner needs case studies. The International Journal of Education
Management. 4 (1), 39-51.
Molesworth, M. (2004) Collaboration, reflection and selective neglect: Campus-based
marketing students' experiences of using a virtual learning environment.
Innovations in Education and Teaching International. 41 (1), 79-92.
Moore, R. & Bryant, B. (1989) Preparatory educational project, final report of a
collaborative research project undertaken in 1985 – 88 by Ruskin College,
Oxford, the Open University South Region and the WEA Thames Valley
District. Ruskin College: Oxford.
Morris, L. & Walker, D. (2006) CAA sparks chemical reaction: Integrating CAA into a
learning and teaching strategy. Evaluation of the use of the virtual learning
environment in higher education across Scotland. QAA Scotland.
Morrison, D. (2006). Supporting resource for 30 august benchmarking briefing
[online]. Available from
http://elearning.heacademy.ac.uk/weblogs/benchmarking/wp-
content/uploads/2006/08/BenchmarkingELearningDecisionMatrix0_6.doc [25
August 200