The impact of e-learning in
university education. An empirical
analysis in a classroom teaching
context
Jose Albors-Garrigos
Jose Carlos Ramos Carrasco
María-del-Val Segarra-Oña
Universidad Politecnica de Valencia (Spain)
Abstract
The goal of this chapter is to analyse the impact of e-learning technologies and tools, used as a
support for teacher-led courses, on teaching performance (efficiency) as well as on learner
acceptance. It also aims to analyse and determine the moderating factors which influence the
process. The chapter is based on data accumulated during one academic year at a large Spanish
university offering 2,500 courses and employing 1,800 professors.
INTRODUCTION
This chapter will try to answer various research questions. Is e-learning an effective tool as a
support for traditional face-to-face classes? What are the moderating factors which influence e-
learning adoption by teachers? Do e-learning tools have a positive impact on learning
performance and student satisfaction? If so, what are the most promising tools? Does e-learning
facilitate network learning?
To that end, this chapter presents the impact of e-learning as a support for traditional teaching
activities. It presents the results of the analysis of e-learning data for one academic year at the
Universidad Politecnica de Valencia (UPV) and cross-references the e-learning results with the
course performance and student satisfaction surveys at the UPV.
BACKGROUND
The relationship between IT and learning has been studied by Leidner and Jarvenpaa (1995) and
been associated with the principal learning schools and model theories. Figure 1 below relates
both schools and theories with the main IT-based learning models and tools.
Additionally, other authors have highlighted the utilisation of e-learning and of Internet
technologies, with their broad array of learning tools, for enhancing learners' knowledge and
performance. These authors support the evidence that these tools have an impact on the
effectiveness and acceptance of e-learning within the medical education community, especially
when combined with traditional teacher-guided activities in a blended-learning educational
experience (Ruiz et al, 2006).
Figure 1. IT-based e-learning models and tools (based on Leidner and Jarvenpaa, 1995)
Various studies have consistently demonstrated the satisfaction of students with e-learning
methods. The learners’ satisfaction rates increase with e-learning usage as compared to
traditional learning, together with a perceived ease of use and access, navigation, interactivity,
and user friendly interface design. Interestingly, students do not see e-learning as replacing
traditional teacher-led instruction but as a complement to it, forming part of a blended-learning
strategy (Gibbons and Fairweather, 2000; Chumley-Jones et al, 2002). Its comparison and
complementarity with classroom education has also been reviewed (Bernard et al, 2004;
Letterie, 2003), as has its constant and wide expansion (Martínez and Gallego, 2007).
On the other hand, little has been done to understand why, despite the crucial role e-learning
plays in current education methods, many users discontinue their online learning after their
initial experience (Martínez and Gallego, 2007; Marshall and Mitchell, 2003; De la Cruz, 2005;
Liaw, 2008). Nevertheless, as has been demonstrated, e-learning technologies represent a good
opportunity to achieve faster and stronger development (Campanella et al, 2008).
Sun et al. (2008) studied six factors affecting a user’s e-learning satisfaction within a wide range
of factors that include learners, instructors, courses, technology, design, and environment
aspects. They found that learner computer anxiety, instructor attitude toward e-learning, e-
learning course flexibility, e-learning course quality, perceived usefulness, perceived ease of
use, and diversity in assessments are the critical factors affecting a learner’s perceived
satisfaction.
Marshall and Mitchell (2003) studied the main problems found when introducing an e-learning
system. In their view, most implementations require a system personalized to each situation.
Despite this, they propose a general model composed of six basic steps, the so-called
E-learning Maturity Model (see Figure 2 below), whose levels are not concerned with how
particular tasks are done, but rather with how well the process is performed and controlled.
Figure 2. E-learning Maturity Model (source: Marshall and Mitchell, 2003)
Littlejohn et al. (2006) identified twelve key characteristics, as shown in Figure 3, of learning
resources that may promote changes in e-learning practice. These authors not only identified
what types of resources are effective in the e-learning process, but conclude that what is most
important is their use in context (Littlejohn et al., 2006, Littlejohn and McGill, 2004).
Figure 3. Factors likely to have a positive influence on the use of a resource (source: Littlejohn
et al., 2006).
Many companies and universities are using e-learning systems to provide improved results
but there are still various problems and barriers related to e-learning activities that require
solutions (Cook et al., 2009, Campanella, et al, 2008, Littlejohn et al., 2006). It is especially
relevant to show academic institutions how to improve learner satisfaction and further
strengthen their e-learning implementation (Sun et al., 2008).
OPEN COURSE PLATFORMS
Several universities have adopted e-learning platforms in response to the dynamism of ICT
(Correa and Paredes, 2009). Higher education institutions are involved in this “new
technologies” movement, which requires a complex blend of technological, pedagogical and
organizational components (McPherson and Nunes, 2008).
On the other hand, some universities, such as the Open University (UK) and the Universitat
Oberta de Catalunya (UOC, Spain), operate exclusively through e-learning platforms; they do
not need to change their resources and are not undergoing an adaptation process (Sclater, 2008).
Staff training courses, which try to prevent some of the main problems detected, are a common
tool in several universities (see Figure 4) (Littlejohn et al., 2006; Mahdizadeh et al., 2008; Cook
et al., 2009).
University                    Course name                                         Page link
University of Wisconsin       ADA statement: E-learning evaluation                http://www.uwstout.edu/static/profdev/elearning/syllabus.html
                              teaching course
University of Georgia         E-learning evaluation syllabus                      http://www.coe.uga.edu/syllabus/edit/EDIT_8350_ReevesT_FA08.pdf
The UNC School of Education   e-Learning for Educators - Designing a Virtual      http://www.learnnc.org/lp/pages/6501
                              Field Trip: Online course syllabus
Free University of Berlin     Using e-learning for social sciences:               www.elearningeuropa.info/files/media/media11894.pdf
                              practical lessons from the
Figure 4. E-learning staff courses (source: authors).
The financial effort to develop an e-learning platform is considerable, involving changes in
pedagogic methods, communication methods, technological processes, skills development, etc.
(Sclater, 2008). Therefore, the evaluation of the model and the evidence that the e-learning
experience is fulfilling all the goals become crucial for the institution as well as for the learners
involved.
Graf and List (2005) evaluated nine open-learning platforms: ATutor, Dokeos, dotLRN
(based on OpenACS), ILIAS, LON-CAPA, Moodle, OpenUSS (Freestyle-learning), Sakai, and
Spaghettilearning, of which Sakai and Moodle are currently the most popular. These platforms
were evaluated according to eight categories: communication tools, learning objects,
management of user data, usability, adaptation, technical aspects, administration, and course
management. These categories included several subcategories which were used in a survey to
evaluate the platforms. Moodle dominated the evaluation, achieving the best value five times.
The strengths of Moodle were the realization of communication tools and the creation and
administration of learning objects. Sakai was penalized since it had not been fully developed at
that time; today it has a worldwide community of 200 university users and developers. Moodle
clearly has a wider reach, with 50,000 users and a broader user profile that includes various
types of learning centres. In Spain, only three universities utilise Sakai, while Moodle has more
than thirty users.
According to Babo and Acevedo (2009), the most widely used platforms in universities are
Moodle, Blackboard, WebCT and Sakai (see Figure 5).
Figure 5. Open-course platforms as learning management systems (source: Babo and Acevedo,
2009)
Measuring e-learning efficiency in higher education.
In this new context, the profile of higher education professors should change, especially
given the expectation that education researchers will become more interdisciplinary:
maintaining awareness of the topics, frameworks, and techniques that characterize related
research in other disciplines; remaining open to sharing and learning from research outside their
domains; and reflecting collectively on their practices (Greenhow et al., 2010). Despite the
crucial importance of this issue, little research has been found concerning best practices and
teacher characteristics which would facilitate e-learning processes in universities.
Campanella et al. (2008) analysed the e-learning platforms of 49 Italian universities using an
evaluation model that considered five general aspects: system parameters, administration
facilities, interaction support, teacher services, and learner services. They compared the
Learning Management Systems (LMSs) adopted by each centre, but without considering the
learners' or teachers' abilities or competences.
In this respect, though learning processes have evolved from the post-industrial era to an
information technology and knowledge era, younger individuals born after the 1980s, also
labelled “digital natives”, learn in a different manner than older people. This digital divide has
had a clear impact on the evolution of e-learning (Prensky, 2001).
New environments, tools, and information input will have a stronger and more frequent impact.
The amount of information available to every human being, thanks to the Internet, would have
been considered science fiction twenty years ago, and it increases exponentially every year.
Instantaneity has arrived through search engines such as Google and repositories such as
Wikipedia (Sharples 2000).
The research question we are focusing on deals with these aspects and tries to understand how
determinants of a professor's profile, such as age, knowledge background or academic
seniority, affect the teaching-learning process in e-learning environments.
On the other hand, several authors have analysed the main problems that institutions have to
deal with when implementing e-learning platforms (Martínez and Gallego, 2007; Marshall and
Mitchell, 2003; De la Cruz, 2005; Liaw, 2008; Campanella et al, 2008; Sun et al., 2008), but we
have not yet found any reference to subject efficiency and student appraisal when comparing
e-learning and classical classroom teaching.
RESEARCH METHODOLOGY
Objectives. Research Hypotheses.
The study is the result of the collaboration between researchers and the UPV IT department
leaders.
Concluding our discussion, we can present our research hypotheses as follows:
H1. E-learning requires a new context for learning.
H2. The digital divide is a barrier for e-learning development; it is therefore influenced by
generational differences among professors and their context (age, knowledge background or
academic seniority).
H3. E-learning has a positive effect on learning impact and teaching efficiency.
Data
Five years ago the UPV adopted the Sakai platform, along with some Open Course Ware
models, to develop its Poliformat platform as e-learning support for classroom-based education.
The analysis is based on the results of 300¹ courses and relates the utilisation of various
Poliformat tools by professors to the teaching efficiency ratios as well as to the student
satisfaction reports for each subject. A number of moderating variables have been taken into
account, such as professor age, status and seniority, subject area knowledge, course level
(undergraduate, postgraduate or lifelong learning) and other context factors. The analysis has
been based on the UPV database corresponding to the 2008-09 academic year.
The data were selected for those courses, 288 cases, for which there was detailed information
on the professor's profile and the subject's circumstances.
Furthermore, the analysis has taken into account the results of the survey carried out during
2008, with replies from 315 teachers (out of 2,600) and 432 students (out of 36,012).
Survey and data analysis results.
Survey Results
It has to be mentioned that the majority of courses running at the UPV (97.5%) are based
on classroom learning methodology, with e-learning acting as a support tool, while only 2.5%
are partially based on online courses.
Accordingly, the first part of the survey addressed the opinions of university students
and faculty on the usefulness of e-learning tools. The replies are summarized in Figure 6.
The first block of six questions deals with the experience of students and staff with e-learning
tools (social networks, SMS, news/RSS, multimedia and websites) and their evaluation of their
usefulness. The first outstanding result is that students have clearly had more experience with
e-learning tools (an average of approximately 75%) than their professors (an average of 50%).
Among the students, SMS, websites and web tests are popular, while the relevance of social
networks must be emphasized. These results show a clear digital divide between students and
faculty.
¹ There are 2,500 courses in the database, but only 300 register one sole professor, whose
characteristics (age, seniority) can therefore be identified.
On the other hand, the second part of the survey addressed the use of the open course
Poliformat platform and its tools. Here the differences between students and faculty are smaller,
since this platform was set up five years ago by the university and various programs have been
offered to motivate students and professors to use it, as well as to reinforce its use for academic
evaluation. Still, students find Poliformat more useful than their professors do. PDAs, messages
and calendars are more popular among students than faculty. Finally, though Google Docs and
blogs are less popular among students, these tools are still found more useful by the students
than by their professors. We could conclude that although Poliformat is, in general, used by the
faculty, students find it more useful; there would therefore be more potential for the tools if the
faculty extended their usage.
Figure 7 shows the results of the survey concerning the opinions of students and
faculty on the impact of e-learning tools. It shows that a majority of professors (approximately
70%) think that e-learning tools clearly contribute to improving teaching. Curiously, the
opinions of students differ slightly, but this could be attributed to the fact that the teaching
orientation of the professor is not completely changed by e-learning. The opinions related to
learning are slightly different: 88% of students believe that e-learning contributes to improving
their learning, while this opinion is held by approximately 73% of faculty members. Again,
this could be a result of the expectations of the generations across the digital divide proposed by
Prensky (2001).
Teaching efficiency seems to be improved by e-learning tools and this opinion, again,
is more strongly held by students than professors. The students are more accustomed to digital
media and therefore, these tools are better adapted to their culture. The same applies to
improving communication, though again with some restrictions, since digital tools cannot
substitute for face-to-face communication.
[Bar chart: usage frequency (Never / Often / Very Often) reported by students and faculty for
blogs, Google Docs, teaching calendars, PDAs, Messenger, Poliformat, personal websites, web
tests, multimedia, news/RSS, SMS and social networks.]
Figure 6. Opinions of university students and faculty on the usefulness of e-learning tools.
Finally, and this is an outstanding conclusion, there are great differences between students and
faculty regarding the role of e-learning in improving team work. These results coincide with
the previous analysis of the opinions on the usefulness of e-learning tools regarding social
networks. Previous research has shown a certain reluctance within the academic context
towards social networking (Albors and Ramos, 2008).
[Bar chart: degree of agreement (Quite Agree / Totally Agree) of students and faculty that
e-learning tools improve teaching, learning, efficiency, communication and team work.]
Figure 7. Opinions of students and faculty on the impact of e-learning tools.
Database Analysis results
As has been pointed out, the researchers had access to a database of 2,500 subjects which
were taught with the support of Poliformat. Of these, only 310 were selected, since an
individual professor was identified as responsible for the subject and their profile could
therefore be analysed. Though meaningful data was lost, it would have been a very complex
task to identify groups by professor and filter all the relevant data. That exercise has been left
for future research.
Table 1 shows the data taken into account in the analysis. It has to be mentioned that,
for our purposes, the utilisation of electronic learning tools was an independent variable, while
teaching excellence (student satisfaction) and teaching efficiency (an indirect measure of
learning) were dependent variables. The rest of the variables, e-learning training, professor age,
professor category, seniority, subject type and number of registered students, have been
considered moderating factors.
Table 1. Utilised data and associated variables for the analysis.

Variable      Meaning
Teachexcel    Results of the student annual evaluation of faculty (range 1-10)
Training      Number of courses on e-learning followed by faculty
Age           Age of the professor
SubjType      1: IT Engineering and Science; 2: Engineering; 3: Social Sciences and Arts
TeachCateg    Professor category: from 4 (Full Professor) to 1 (Jr. Associate Professor)
Seniority     Years in the category
Efficiency    % of students passing the subject in the first call
Utilelteach   Utilisation of Poliformat tools (by number)
Matriculados  Number of registered students in the course
A first factor analysis grouped these variables as shown in Table 2, with an explained
variance of 70.7%. Thus, both independent and dependent variables have been grouped in
component 2, while the moderating factors have been included in components 1, 3 and 4.
Subject Type always has a negative sign, which is explained by the fact that IT subjects
(coded 1) are taught by faculty with a background in digital learning, while the highest code (3),
assigned to liberal arts and social sciences subjects, behaves the opposite way.
Table 2. Rotated components matrix

Variable      Highest loading
Teachexcel    0.726
Training      0.804
Age           0.891
SubjType      -0.891
TeachCateg    0.636
Seniority     0.810
Efficiency    0.799
Utilelteach   0.641
Matriculados  0.702

Varimax rotation with Kaiser normalization; KMO = 0.656. Variance explained: 70.7%.
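The varimax rotation step reported in Table 2 can be sketched as follows. This is a generic NumPy implementation of the rotation criterion, not the statistical package the authors actually used, and the loading matrix below is an illustrative example rather than the study's data.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a factor-loading matrix so that each variable loads
    strongly on as few components as possible."""
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD of the gradient of the varimax criterion
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
        )
        rotation = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):  # stop when the criterion no longer improves
            break
        d = d_new
    return loadings @ rotation

# Illustrative two-component loading matrix (not the study's data)
raw = np.array([[0.6, 0.6],
                [0.7, 0.5],
                [0.5, -0.6],
                [0.6, -0.7]])
rotated = varimax(raw)
# After rotation each row should be dominated by a single component
print(np.round(rotated, 2))
```

Because the rotation is orthogonal, the communalities (row sums of squared loadings) are preserved; only the distribution of loadings across components changes.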
A correlation exercise showed that Age is positively correlated with Professor Category and
Seniority, which is logical, but not with any dependent variable. Teaching Excellence is
positively correlated with e-Learning training, Efficiency and e-Learning utilisation but
negatively with Subject type.
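A correlation exercise of this kind can be sketched as follows. The data frame here is synthetic (random draws, seeded for reproducibility) and the variable names merely echo Table 1, so the resulting numbers say nothing about the study itself; only the Age-Seniority link is built in to mirror the relationship the text reports.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 288  # number of analysed courses, as in the study

# Synthetic illustration only: random data loosely mimicking the Table 1 variables
age = rng.integers(28, 65, n)
df = pd.DataFrame({
    "Age": age,
    # Seniority tied to age, reproducing the positive correlation the text reports
    "Seniority": age - 28 + rng.integers(0, 5, n),
    "Teachexcel": rng.uniform(1, 10, n),
    "Efficiency": rng.uniform(0, 100, n),
})

# Pairwise Pearson correlations, as in the correlation exercise above
corr = df.corr(method="pearson")
print(corr.round(2))
```

In the synthetic data only Age and Seniority correlate strongly; the remaining pairs hover near zero, which is the pattern one checks against the study's reported correlations.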
When trying to build a regression model that could explain both teaching excellence and
efficiency, the following models were obtained.
Table 3. Regression model explaining Teaching Excellence

Model        Standardized Coefficient (Beta)   Sig.
(Constant)                                     0.000
Utilelteach  0.289                             0.000
Efficiency   0.175                             0.005
SubjType     -0.157                            0.009
R² = 0.451
The model shown in Table 3 indicates that Teaching Excellence (student satisfaction) can be
explained by the degree of utilisation of e-learning tools, the efficiency of the subject taught and
the subject type, favouring subjects in the area of IT and engineering.
On the other hand, the model shown in Table 4 indicates that teaching efficiency can be
explained by the degree of utilisation of e-learning tools, the student satisfaction level and the
subject type, again favouring subjects in the area of IT and engineering. Additionally, the
teaching category appears as an influencing factor.
Table 4. Regression model explaining Teaching Efficiency

Model        Standardized Coefficient (Beta)   Sig.
(Constant)                                     0.000
Utilelteach  0.378                             0.000
SubjType     -0.277                            0.000
Teachexcel   0.158                             0.007
TeachCateg   0.151                             0.007
R² = 0.327
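Standardized (beta) coefficients of the kind reported in Tables 3 and 4 can be obtained by running ordinary least squares on z-scored variables. The sketch below uses synthetic data with known effects; the variable names mirror Table 4, but the values are illustrative and not the study's.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 288  # sample size, as in the study

# Synthetic predictors standing in for the Table 4 variables (illustrative only)
utilelteach = rng.normal(size=n)
subjtype = rng.normal(size=n)
teachexcel = rng.normal(size=n)
teachcateg = rng.normal(size=n)
# Response built with known effects, so the recovered betas can be sanity-checked
efficiency = (0.45 * utilelteach - 0.35 * subjtype
              + 0.2 * teachexcel + 0.2 * teachcateg
              + rng.normal(scale=0.8, size=n))

def standardized_betas(y, *predictors):
    """OLS on z-scored variables: the slopes are the standardized (beta) coefficients."""
    z = lambda v: (v - v.mean()) / v.std()
    X = np.column_stack([z(p) for p in predictors])
    beta, *_ = np.linalg.lstsq(X, z(y), rcond=None)
    return beta

betas = standardized_betas(efficiency, utilelteach, subjtype, teachexcel, teachcateg)
print(np.round(betas, 3))
```

Because every variable is z-scored, no intercept is needed and the slopes are directly comparable across predictors, which is the point of reporting betas rather than raw coefficients.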
Finally, following the Baron and Kenny (1986) paradigm for mediating variables, we built a
model which could explain the utilisation of e-learning tools. The model shown in Table 5
indicates that the utilisation of e-learning tools can be explained by training in e-learning
tools, the number of students registered in the subject, the subject type (favouring subjects in
the area of IT) and the professor's seniority. It is interesting to note that seniority appears with a
negative coefficient, pointing to the digital divide.
Table 5. Regression model explaining utilisation of e-learning tools.

Model         Standardized Coefficient (Beta)   Sig.
(Constant)                                      0.957
Training      0.220                             0.000
SubjType      0.239                             0.000
Matriculados  0.153                             0.010
Seniority     -0.129                            0.021
R² = 0.319
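The Baron and Kenny (1986) mediation logic behind Table 5 can be sketched as three regressions. In this illustration, training plays the independent variable, tool utilisation the mediator, and efficiency the outcome; the data are synthetic and constructed so that the effect of training runs through utilisation, which is only an assumed structure for demonstration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 288  # sample size, as in the study

# Synthetic data (illustrative only) in which training affects efficiency
# *through* utilisation of the e-learning tools
training = rng.normal(size=n)                                   # independent variable
utilisation = 0.6 * training + rng.normal(scale=0.8, size=n)    # mediator
efficiency = 0.5 * utilisation + rng.normal(scale=0.8, size=n)  # outcome

def ols_slopes(y, X):
    """OLS coefficients of y on the columns of X (intercept added)."""
    Z = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1:]

# Baron & Kenny (1986) steps:
c = ols_slopes(efficiency, training)[0]       # 1: X predicts Y (total effect)
a = ols_slopes(utilisation, training)[0]      # 2: X predicts the mediator
c_prime = ols_slopes(efficiency,              # 3: X's effect shrinks once the
                     np.column_stack([training, utilisation]))[0]  # mediator is controlled
print(round(c, 3), round(a, 3), round(c_prime, 3))
```

Full mediation is indicated when the direct effect c' falls to (near) zero while steps 1 and 2 remain significant, which is exactly what the synthetic construction above produces.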
CONCLUSIONS
As a consequence of our research, we can conclude by examining our hypotheses. We could
partially validate H1, since e-learning seems to develop to a higher level in a context where
information and communication technologies prevail, and training facilitates a new background
for e-learning.
We could not find enough evidence to validate H2, namely that the digital divide is a barrier for
e-learning development and is therefore influenced by generational differences among the
professors and their context (age, knowledge background or academic seniority). Though
the mediating model showed seniority as a barrier to the utilisation of e-learning tools, the
sample composition, excessively weighted towards senior professors, did not provide sufficient
information in this direction.
Finally, it has been demonstrated that e-learning has a positive effect on learning impact and
teaching efficiency, thus providing evidence for validating H3.
It has to be mentioned that e-learning seems to provide support for those courses with a higher
number of students registered by facilitating communication between professor and students.
Finally, the following Figure 8 shows the construct that we could draw as a conclusion for our
research.
Figure 8. Construct explaining impact of e-learning on higher teaching and moderating factors
influencing it.
The utilisation of e-learning tools has a positive effect on learning efficiency, as explained by
student satisfaction and teaching efficiency. A number of moderating factors, such as training in
e-learning, an IT context and the number of registered students, have been found to facilitate or
motivate students and professors to take advantage of e-learning tools. Finally, the research
points out that seniority appears as a barrier to the utilisation of e-learning tools. Further
research should analyse more deeply the characteristics of this apparent digital divide.
REFERENCES
Albors, J., J.C. Ramos and J.L. Hervas (2008), New Learning Network Paradigms:
Communities of Objectives, Crowd-sourcing, Wikis and Open Source. International
Journal of Information Management, 28, pp. 194-202.
Babo, R., Acevedo, A. (2009, November), Learning management systems usage on higher
education, paper presented at the 13th IBIMA Conference, Marrakech, Morocco, pp. 221-
232.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social
psychological research: Conceptual, strategic and statistical considerations. Journal of
Personality and Social Psychology, 51, pp. 1173-1182.
Bernard, R., Abrami, P. L., Lou, Y., Borokhovski, E. (2004). How does distance education
compare with classroom instruction? A meta-analysis of the empirical literature. Review
of Educational Research, 74, pp. 379-439.
Campanella, S., Dimauro, G., Ferrante, A., Impedovo, D., Impedovo, S., Lucchese, M. G.,
Modugno, R., Pirlo, G., Sarcinella, L., Stasolla, E., Trullo, C. A. (2008). E-learning
platforms in the Italian universities: The technological solutions at the University of Bari.
WSEAS Transactions on Advances in Engineering Education, 5 (1), pp. 1790-1979.
Chumley-Jones, H. S., Dobbie, A., Alford, C. L. (2002). Web-based learning: sound educational
method or hype? A review of the evaluation literature. Academic Medicine, 77, pp. 86-93.
Cook, R. G., Ley, K., Crawford, C., Warner, A. (2009). Motivators and inhibitors for University
Faculty in distance e–learning. British Journal of Educational Technology, 40 (1), pp.
149–163.
Correa, J. M., Paredes, J. (2009). Cambio tecnológico, usos de plataformas de e–learning y
transformación de la enseñanza en las universidades españolas: la perspectiva de los
profesores. Revista de Psicodidáctica, 14 (2), pp. 261–278.
De la Cruz, O., Olivares, M., Pagés, C., Ríos, R., Moreno, F. J., López. M. A. (2005) RED,
Revista de educación a distancia. http://www.um.es/ead/red/M2, pp. 1-15.
Gibbons A, Fairweather P. (2000) Computer-based instruction. In: Tobias S, Fletcher J (eds).
Training & Retraining: A Handbook for Business, Industry, Government, and the
Military. New York: Macmillan USA, pp.410–42.
Graf, S. and List, B. (2005, April), An Evaluation of Open Source E-Learning Platforms
Stressing Adaptation Issues, paper presented at the Fifth IEEE International Conference on
Advanced Learning Technologies ICALT, pp. 163-165.
Greenhow, C., Robelia, B., Hughes, J. E. (2010), Learning, Teaching, and Scholarship in a
Digital Age: Web 2.0 and Classroom Research: What Path Should We Take Now?
Research News and Comment. Educational Researcher, 38, pp. 246-259.
Leidner, D.E., Jarvenpaa, S.L. (1995), The Use of Information Technology to Enhance
Management School Education: A Theoretical View, MIS Quarterly, pp. 270-291.
Letterie G.S. (2003) Medical education as a science: the quality of evidence for computer-
assisted instruction. Am J Obstet Gynecol.188, pp. 849–53.
Liaw, S. (2008). Investigating students' perceived satisfaction, behavioral intention, and
effectiveness of e-learning: A case study of the Blackboard system. Computers &
Education, 51 (2), pp. 864-873.
Littlejohn, A., Falconer, I., McGill, L. (2006). Characterising effective eLearning resources.
Computers and Education, 50, pp. 757-771.
Littlejohn, A., McGill, L. (2004). Detailed report on effective resources for e-learning.
Effectiveness of resources, tools and support services used by practitioners in designing
and delivering E-Learning activities. JISC E-pedagogy Programme Project. Available
from http://cetis.ac.uk:8080/pedagogy.
Mahdizadeh, H., Biemans, H., Mulder, M. (2008). Determining factors of the use of e–learning
environments by university teachers. Computers and Education, 51, pp. 142–154.
Marshall, S., Mitchell, G. (2003). Potential Indicators of e–Learning process capability.
Educause in Australasia, pp. 99–106.
Martínez Caro, E., Gallego Rodríguez, A. (2007), El aprendizaje como ventaja competitiva para
las organizaciones: Estilos de aprendizaje y e-learning, Direccion y Organización, 33,
pp. 84-93.
McPherson, M. A., Nunes, J. M. (2008). Critical issues for e-learning delivery: what may seem
obvious is not always put into practice. Journal of Computer Assisted Learning, 24, pp.
433-445.
Prensky, M. (2001), Digital Natives, Digital Immigrants. On the Horizon, 9 (5), pp. 5-25.
Ruiz, J.G., Mintzer, M.J., and Leipzig, R.M. (2006), The Impact of E-Learning in Medical
Education, Academic Medicine, 81 (3), pp. 207-212.
Sclater, N. (2008). Large scale open source e–learning systems at the Open University UK.
EDUCAUSE, Research Bulletin, Is. 12, pp.1–13.
Sharples, M. (2000). The design of personal mobile technologies for lifelong learning.
Computers and Education, 34 (3-4), pp. 177-193.
Sun, P., Tsai, R., Finger, G., Chen, Y., Yeh, D. (2008). What drives a successful e-Learning?
An empirical investigation of the critical factors influencing learner satisfaction.
Computers and Education, 50, pp. 1183-1202.
Wills, G. B., Bailey, C. P., Davis, H. C., Gilbert, L., Howard, Y., Jeyes, S., Millard, D. E., Price,
J., Sclater, N., Sherratt, R., Tulloch, I., Young, R. (2009). An e-learning framework for
assessment (FREMA). Assessment and Evaluation in Higher Education, 34 (3), pp.
273-292.
ACKNOWLEDGEMENTS
The authors would like to thank the Vice-Dean of the Universidad Politecnica de
Valencia for facilitating the data which has made this research possible.