International Journal on Information Technologies & Security, 1, 2016
AUTOMATED EVALUATION OF STUDENTS’
SATISFACTION
Silvia Gaftandzhieva
University of Plovdiv "Paisii Hilendarski"
e-mail: sissiy88@uni-plovdiv.bg
Bulgaria
Abstract: The evaluation of students' satisfaction is a key indicator of the quality of learning. It allows the collection and analysis of students' assessments of the quality of courses and of the learning process, so that the quality can be improved. The paper examines the possibility of automated evaluation of student satisfaction in Moodle. A module for the automated generation of reports summarizing the results of student satisfaction evaluation has been developed and tested experimentally.
Key words: student satisfaction, automated evaluation of quality, e-
learning.
1. INTRODUCTION
Quality evaluation, being a key instrument of quality assurance and quality enhancement in higher education, is one of the most typical components of the Bologna process.
The Bulgarian Higher Education Law (art. 6, paragraphs 4 and 5 [1]) requires universities to develop internal systems for evaluating and maintaining the quality of learning. In conformity with the Law (art. 11), each university is subject to periodic external evaluation and accreditation by the National Evaluation and Accreditation Agency (NEAA) according to relevant criteria and regulatory procedures. According to the agency's criteria systems for the evaluation of distance learning in a professional field [2] and for programme accreditation of professional fields and majors from the regulated professions [3], each university must:
o periodically collect, analyze and use students' assessments and opinions of the learning process to enhance the quality of learning;
o conduct surveys on the feasibility and efficiency of the teaching and assessment methods;
o discuss and present the efficiency of the results related to learning quality management;
o conduct and analyze surveys on each course and teacher, as well as an annual survey of graduating students;
o conduct surveys of graduates in order to collect and analyze information about the career realization and development of those holding a degree in the professional field (or a major of the regulated professions);
o conduct surveys of employers about their satisfaction with graduates' preparation;
o conduct surveys of students before, during and after distance learning, to evaluate their level of technological preparation, the way distance learning is conducted and the access to virtual resources and activities.
Therefore, surveys of customers' satisfaction are an integral part of the internal systems for evaluation and enhancement of quality. Summarizing the survey results requires the collection, analysis and interpretation of a huge amount of data reflecting the attitude of students, experts and employers towards the courses, the software tools used, etc. Using automated tools for quality evaluation is the only feasible way to make effective and full use of all the data collected and stored during the organization and conduct of the learning. Interesting examples in this field are the tools developed for quality evaluation of technology-enhanced learning [4], quality of services [5], learners' satisfaction [6], etc.
The paper presents an experiment in automated evaluation of students' satisfaction with e-learning courses conducted in the Learning Management System (LMS) Moodle.
2. THE EVALUATION OF STUDENTS’ SATISFACTION WITH THE
CONDUCTED E-LEARNING COURSES
For the evaluation of students' satisfaction with e-learning, electronic forms for collecting, conducting and analyzing a survey of students involved in various e-learning forms are implemented in LMS Moodle for the experiment.
The experiment involves the integration of LMS Moodle with UBIS-Jaspersoft, a centralized system for generating queries and reports and for decision making (based on the Jaspersoft BI Suite). It aims to extract, analyze and interpret data about students' satisfaction with the e-learning conducted.
2.1. Survey
The primary source of information for evaluating students’ satisfaction is the
survey for evaluation of e-learning course quality. The proposed survey [7] is
created on the basis of the NEAA criteria system for evaluation of distance
learning. It contains around 50 questions with answers: 1 - bad (definitely 'no'), 2 -
satisfactory (rather 'no') 3 - good ('yes' and 'no'); 4 - very good (rather 'yes'); 5 -
excellent (definitely 'yes'), in 10 areas: course documentation and learning goals;
team for distance learning provision; infrastructure for distance learning;
preparation and conduct of distance learning; information support to distance
International Journal on Information Technologies & Security, 1, 2016
33
learning; learning materials and activities; communication; assessment; feedback;
design.
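The summarised reports described later aggregate these answers as a percentage share per answer value. A minimal sketch of that aggregation (the function name is illustrative, not part of the developed module):

```python
from collections import Counter

# Answer scale of the survey: 1 - bad ... 5 - excellent.
SCALE = (1, 2, 3, 4, 5)

def answer_distribution(values):
    """Percentage share of each answer value, rounded to 2 decimal places,
    mirroring what the report query computes in SQL."""
    counts = Counter(values)
    total = len(values)
    return {v: round(100 * counts.get(v, 0) / total, 2) for v in SCALE}

print(answer_distribution([5, 4, 4, 5, 3]))
# {1: 0.0, 2: 0.0, 3: 20.0, 4: 40.0, 5: 40.0}
```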
The Moodle module Feedback is used for carrying out the survey in electronic form. It allows teachers to create their own surveys to obtain feedback from course participants, using various types of questions, including multiple choice, yes/no and text answers. In the experiment the Feedback module is used as a tool for the creation of a template with questions that can be reused multiple times. The created survey template is included as part of the learning activities in each electronic course, so that participating students can fill it out after completion of the training (see Figure 1).
Once the surveys are completed, the data is stored in the system's database and can be used for analysis of the results through the Feedback module of LMS Moodle.
Figure 1. Electronic survey
2.2. Automated tools for analyzing the results
The built-in module Feedback offers the generation of summarised and statistical reports about the survey results only within a specific course and its participants.
The open-source LMS Moodle is developed on a modular principle that allows its functionality to be extended according to specific needs. The main steps needed to expand the functionality through new modules are presented in the Moodle Documentation [8]. After creating the necessary modules, the system administrator must install them in the standard way for adding new modules in Moodle (Site administration/Notifications).
A supplemental module is developed within the scope of the experiment, on the basis of the modular principle stated above, to automatically analyse the survey results. It allows synthesis and analysis of the results at a more general level, e.g. for all e-courses in a professional field or for all e-courses in a scientific field. The new module is developed in 4 (four) steps using the system UBIS-Jaspersoft and the capabilities of its basic software, Jaspersoft BI Suite, for creating reports and analyses by retrieving data from different sources, for storing and organizing reports in a repository and for presenting them in the form preferred by the user.
In Step 1 UBIS-Jaspersoft is integrated with the Moodle database, which is set as a data source for retrieving the data from students' surveys and creating reports that reflect student satisfaction.
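In terms of the JasperReports Server REST v2 API (on which Jaspersoft BI Suite builds), registering such a data source amounts to creating a jdbcDataSource resource. A hedged sketch of the descriptor follows; the host, database and credential values are hypothetical, and the field names should be verified against the server version in use:

```python
import json

def moodle_datasource_descriptor(host, db, user, password):
    """Descriptor for a JDBC data source pointing at the Moodle database,
    in the shape of the jdbcDataSource resource type of the JasperReports
    Server REST v2 API (verify field names against the server version)."""
    return {
        "label": "Moodle DB",
        "driverClass": "com.mysql.jdbc.Driver",  # Moodle commonly runs on MySQL/MariaDB
        "connectionUrl": f"jdbc:mysql://{host}:3306/{db}",
        "username": user,
        "password": password,
    }

descriptor = moodle_datasource_descriptor("db.uni-plovdiv.bg", "moodle",
                                          "report_user", "secret")
print(json.dumps(descriptor, indent=2))
```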
Figure 2. Relations between tables
Table 1. Database tables

Table                      Stored data
course                     Added courses
course_categories          Added course categories
feedback                   Feedback activities added in LMS Moodle
feedback_completed         Completed feedback submissions
feedback_completedtmp      Temporary data about completed surveys with page breaks
feedback_item              Questions added to feedback activities
feedback_sitecourse_map    Courses related to feedback activities added to the Moodle homepage
feedback_template          Added templates with questions
feedback_tracking          Tracking of completed feedback submissions
feedback_value             Answer values
feedback_valuetmp          Temporary data about answers of anonymously completed feedback
In Step 2, templates of analytical reports are developed, which can later be used to generate the actual reports containing summarised results from the conducted survey related to e-learning quality evaluation. The creation of the analytical reports requires detailed knowledge of all database tables that store data about the feedback activities added to e-learning courses (see Table 1). Figure 2 shows the relationships between the tables.
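The join chain implied by these relations (category → course → feedback → item → value) can be sketched against a reduced schema. Table and column names follow Table 1; the real Moodle tables carry a prefix (usually mdl_) and many more columns, and the sample data is invented for illustration:

```python
import sqlite3

# Reduced schema: only the keys used by the report queries.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE course_categories (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE course            (id INTEGER PRIMARY KEY, category INTEGER, fullname TEXT);
CREATE TABLE feedback          (id INTEGER PRIMARY KEY, course INTEGER, name TEXT);
CREATE TABLE feedback_item     (id INTEGER PRIMARY KEY, feedback INTEGER, name TEXT, label TEXT);
CREATE TABLE feedback_value    (id INTEGER PRIMARY KEY, item INTEGER, value INTEGER);

INSERT INTO course_categories VALUES (1, '5. Technical Sciences');
INSERT INTO course            VALUES (10, 1, 'Databases');
INSERT INTO feedback          VALUES (100, 10, 'Course quality survey');
INSERT INTO feedback_item     VALUES (1000, 100, 'Design', '7');
INSERT INTO feedback_value    VALUES (1, 1000, 5), (2, 1000, 4);
""")

rows = con.execute("""
    SELECT course_categories.name, feedback_item.name, feedback_value.value
    FROM course_categories
    JOIN course         ON course_categories.id = course.category
    JOIN feedback       ON course.id = feedback.course
    JOIN feedback_item  ON feedback.id = feedback_item.feedback
    JOIN feedback_value ON feedback_item.id = feedback_value.item
""").fetchall()
print(rows)  # one row per stored answer
```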
In the present case 7 (seven) templates are developed. They can be used in automated surveys of the quality of e-courses in a concrete professional field or scientific field, in order to obtain summarised information as follows:
o the number of e-learning courses in which conduction of the survey is planned (with added questionnaires), for each professional field offered in the university training;
o a list of e-learning courses by the corresponding professional fields, in which conduction of the survey is planned;
o the number of e-learning courses by professional fields, where the survey is already conducted (with completed questionnaires);
o a list of e-learning courses by professional fields, where the survey is conducted, along with the corresponding number of participants in the survey (number of completed questionnaires);
o summarised results of the survey by professional fields;
o summarised results of the survey by the evaluated characteristics of e-learning courses;
o summarised results of the survey by scientific fields.
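In code, the seven templates can be thought of as a small registry mapping report types to their paths in the Jaspersoft repository; the keys and paths below are purely illustrative, not the ones used in the experiment:

```python
# Hypothetical repository paths for the seven report templates listed above.
REPORT_TEMPLATES = {
    "planned_courses_count":   "/quality/planned_courses_count",
    "planned_courses_list":    "/quality/planned_courses_list",
    "conducted_courses_count": "/quality/conducted_courses_count",
    "conducted_courses_list":  "/quality/conducted_courses_list",
    "by_professional_field":   "/quality/summary_by_professional_field",
    "by_characteristics":      "/quality/summary_by_characteristics",
    "by_scientific_field":     "/quality/summary_by_scientific_field",
}

def template_path(report_type: str) -> str:
    """Resolve a report type to its repository path (raises KeyError if unknown)."""
    return REPORT_TEMPLATES[report_type]
```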
The templates include a presentation of the summarized results both as tables and as charts. Jaspersoft creates source code in XML format for each designed template. A fragment of the code of the template for summarized results of the survey by scientific fields is presented in Figure 3.
<!-- ... -->
<subDataset name="Table_Dataset" uuid="93dcdccf-7dc7-4955-9db6-b219b51662b8">
  <property name="com.jaspersoft.studio.data.defaultdataadapter" value="PDU"/>
  <parameter name="Table_VO" class="java.lang.Integer"/>
  <queryString language="SQL">
    <![CDATA[SELECT name, id,
      ROUND(AVG(r1), 2) AS one, ROUND(AVG(r2), 2) AS two,
      ROUND(AVG(r3), 2) AS three, ROUND(AVG(r4), 2) AS four,
      ROUND(AVG(r5), 2) AS five
    FROM (SELECT feedback_item.name, feedback_item.id,
            COUNT(CASE WHEN feedback_value.value=1 THEN 1 END)*100 /
              COUNT(feedback_value.value) AS r1,
            COUNT(CASE WHEN feedback_value.value=2 THEN 1 END)*100 /
              COUNT(feedback_value.value) AS r2,
            COUNT(CASE WHEN feedback_value.value=3 THEN 1 END)*100 /
              COUNT(feedback_value.value) AS r3,
            COUNT(CASE WHEN feedback_value.value=4 THEN 1 END)*100 /
              COUNT(feedback_value.value) AS r4,
            COUNT(CASE WHEN feedback_value.value=5 THEN 1 END)*100 /
              COUNT(feedback_value.value) AS r5,
            COUNT(feedback_value.value) AS Total
          FROM course_categories
          JOIN course ON course_categories.id = course.category
          JOIN feedback ON course.id = feedback.course
          JOIN feedback_item ON feedback.id = feedback_item.feedback
          JOIN feedback_value ON feedback_item.id = feedback_value.item
          WHERE course_categories.name LIKE "$P!{Table_VO}%"
            AND feedback.name LIKE "%quality%" AND feedback_item.label <> 0
            AND feedback_item.label <> 50 AND feedback_item.label <> 51
          GROUP BY course_categories.name, feedback_item.name) TMP
    GROUP BY name ORDER BY id]]>
<!-- ... -->
Figure 3. XML code of the template
In Step 3 the implemented templates are compiled in a special internal format and stored in the Jaspersoft repository (see Figure 4). They can thereby be used both at the level of the UBIS-Jaspersoft system itself and by an external application for the generation of the relevant reports, which are filled with data from the given data source (the Moodle database). The completed reports can be exported to a specified document format (PDF, XLS, XLSX, XML, HTML, XHTML, CSV, DOC, etc.).
Figure 4. Repository
Step 4 realizes the ultimate goal of the experiment: to integrate LMS Moodle with UBIS-Jaspersoft through the shared UBIS-Jaspersoft web services. A new module is developed as a supplement to LMS Moodle (as a Moodle block), which adds the already implemented reporting functionality (Steps 1-3) to the system, using the corresponding web services. This allows the generation of reports based on the above-mentioned templates at the LMS level.
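As an illustration of how such a web-service call can look: the JasperReports Server REST v2 API runs a stored report with a single GET request. The sketch below only builds the request URL; the server address and template path are hypothetical:

```python
from urllib.parse import urlencode

def run_report_url(server, template_path, fmt="pdf", params=None):
    """URL for the rest_v2 'run report' endpoint of JasperReports Server:
    GET {server}/rest_v2/reports{template_path}.{fmt}?param=value"""
    url = f"{server}/rest_v2/reports{template_path}.{fmt}"
    if params:
        url += "?" + urlencode(params)
    return url

print(run_report_url("http://ubis.example/jasperserver",
                     "/quality/summary_by_scientific_field",
                     params={"Table_VO": 5}))
# http://ubis.example/jasperserver/rest_v2/reports/quality/summary_by_scientific_field.pdf?Table_VO=5
```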
Figure 5. Quality Evaluation Block
The class of the developed block inherits the Moodle base class for block development. It includes methods for managing the block and for its visualization on the screen. In order to save time when generating the block contents on each page where the block is added, the management method first checks the current value of the variable which caches the block content. Before generating the block content, the method checks whether the user is authorized to use the block. After creating a new object and verifying the user's permissions, the content value is set to a link to the quality reports page (see Figure 5), carrying the identification number of the block instance and the user session key. The class also includes an availability method, which prevents the block from being added on pages where its functionality and contents are not suitable. The purpose of the block is to generate reports for quality evaluation in a professional field or scientific field, and therefore the quality evaluation block is available only on the Moodle homepage.
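This content-caching and permission check can be sketched as follows, in Python rather than Moodle's PHP and with illustrative names; the real block implements Moodle's get_content() convention:

```python
class QualityEvaluationBlockSketch:
    """Illustrative stand-in for the Moodle block class described above."""

    def __init__(self, user_can_view, instance_id, sesskey):
        self.user_can_view = user_can_view
        self.instance_id = instance_id
        self.sesskey = sesskey
        self._content = None  # cached content, generated at most once per page

    def get_content(self):
        # Reuse the cached content if it has already been generated.
        if self._content is not None:
            return self._content
        # Verify the user's permission before producing any content.
        if not self.user_can_view:
            self._content = ""
            return self._content
        # Content: a link to the quality reports page, carrying the block
        # instance id and the user session key.
        self._content = (f'<a href="reports.php?instance={self.instance_id}'
                         f'&sesskey={self.sesskey}">Quality evaluation reports</a>')
        return self._content
```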
In accordance with the Moodle Documentation, the user permissions for the block are described in a separate file. Permission to add blocks to the Moodle homepage is given to users with the roles of administrator and manager. Users with other roles are not allowed to add the quality evaluation block and have no access to blocks that have been added. All string values used within the block, including the block title and the permission names, are defined in the block's language files.
The file of the page (see Figure 5) which allows users to select the type of quality evaluation report to generate declares the global objects providing access to the database and to the page output library. When the page is loaded, the user's permissions are checked and the identification number of the block instance is selected from the database. If such a record does not exist in the database, an error message is returned.
For each type of report that can be generated by UBIS-Jaspersoft, a variable is created which stores the link to the appropriate client application, together with the identification number of the block instance and the user session key. The global objects and the names of the report templates from the language files are used to set the links and to access the user session key. Before the page content is printed, the user's permission to access the block is verified once more. The page content is printed through the methods of the global page object, which allow the printing of headers, footers and the page title.
The relevant client application opens when a quality evaluation report type is chosen. The client applications address the UBIS-Jaspersoft REST services, which interact with the server through the HTTP protocol. Each client application includes a file that loads the necessary classes. After verification of the user's permission for block access, the REST service client is created and the constructor of the client application class is called. It sets the values of the class fields representing the link to the UBIS-Jaspersoft repository, the username and the password. The reports are accessed using the method for starting the service, which returns an object of the template class. Reports are generated by the run report method, which runs a report template and returns the report's binary data. In all client files the report template address and the desired output document format are submitted as parameters of this method. Only the file for the generation of a report for evaluation of students' satisfaction in a chosen professional field includes, as an additional parameter, an associative array with the values of the variable that stores data for the selected professional field. All client applications return the quality reports in PDF document format.
A form class is declared to generate a form that allows selection of the scientific field. It inherits the Moodle base form class and overrides the form definition method. A drop-down list is added to the form, from which the scientific field for report generation can be selected. The list is added through the class method for adding an element, which receives as parameters the item type, name, default value, and an associative array whose keys are the numbers of the scientific fields and whose values are their names. The values are loaded from the language files of the block. A button is added to the form; when the user presses it, the selected value is retained and handed to the client file through the data receiving method. When the client application is loaded, after retrieving the block instance data from the database and verifying the user's permissions to use the block, all the data used by the page display library are stored through a global page object.
Figure 6 presents part of the generated report with summarized results of the survey by the evaluated characteristics of e-learning courses in scientific field 5. "Technical Sciences". The full report consists of a complete table with the summarized results of all questions and 10 charts showing the summarized results for each area.
Figure 6. Generated report
CONCLUSION
The developed module allows automatic generation of reports summarizing the results of the evaluation of students' satisfaction with the training. In the future, software tools will be implemented for conducting and automatically analyzing the results of the other surveys that form part of the internal system for evaluation and enhancement of learning quality, together with the design of a complete system for automated learning quality evaluation.
Another direction for furthering the research is to integrate the module for automatic survey analysis with a specially designed mobile application that allows students to complete the surveys for evaluation of e-learning course quality (or other surveys) using their personal mobile devices [9].
ACKNOWLEDGEMENTS
The paper is partly supported within the project MU15-FFIT-001 "A
methodology for creation and dynamic updating of test items and tests with
automated evaluation of their quality" of the Scientific Research Fund at Plovdiv
University.
REFERENCES
[1] Higher Education Law, http://www.neaa.government.bg/assets/cms/-File/normativni%20doc/ZAKON_za_visseto_obrazovanie_Aug_2011.pdf, Last Access: 30.09.2015. (in Bulgarian)
[2] Criteria for evaluation of distance learning, http://www.neaa.government.bg/images/OA-IA-DFO/Kriterii-DFO.pdf, Last Access: 30.09.2015. (in Bulgarian)
[3] Criteria System for Programme accreditation of professional fields and majors from the regulated professions, http://www.neaa.government.bg/images/OA-PA-PN/Kriterialna-sistema-PA-PN-Dec-2011.pdf, Last Access: 30.09.2015. (in Bulgarian)
[4] Ehlers U., C. Helmstedt, M. Bijnens, Shared Evaluation of Quality in Technology-enhanced Learning, White paper, 2011.
[5] Annamdevula S., R. S. Bellamkonda, "Development of HiEdQUAL for measuring service quality in Indian higher education sector", International Journal of Innovation, Management and Technology, 2012.
[6] Schreurs J., A. Al-Huneidi, An eLearning Self-Assessment Model (e-LSA), In: David Guralnick (Ed.), Proceedings of the Fifth International Conference on E-Learning in the Workplace, June 13th-15th, New York, USA, 2012.
[7] Totkov G., R. Doneva, S. Gaftandzhieva, An e-learning Methodology, ISBN 978-954-8852-43-2, Rakursi, Plovdiv, 2014. (in Bulgarian)
[8] Moodle Documentation, http://docs.moodle.org/dev/, Last Access: 30.09.2015.
[9] Kasakliev N., Prospects for Mobile Learning in Bulgaria, Journal for Computer Science and Communications, Vol. 4, No. 1, 2015, ISSN: 1314-7846, pp. 57-65. (in Bulgarian)
Information about the author:
Silvia Gaftandzhieva is a PhD student at the Plovdiv University "Paisii Hilendarski", Faculty of Mathematics and Informatics, Department of Computer Science. Her research areas include e-learning and distance learning, automated evaluation of quality in higher education, and new methods and ICT in distance learning.
Manuscript received on 4 November 2015