Published by
IATED Academy
iated.org
ICERI2016 Proceedings
9th International Conference of Education, Research and Innovation
November 14th-16th, 2016 — Seville, Spain
Edited by
L. Gómez Chova, A. López Martínez, I. Candel Torres
IATED Academy
ISBN: 978-84-617-5895-1
ISSN: 2340-1095
Depósito Legal: V-2569-2016
Book cover designed by
J.L. Bernat
All rights reserved. Copyright © 2016, IATED
The papers published in these proceedings reflect the views only of the authors. The
publisher cannot be held responsible for the validity or use of the information therein
contained.
SUPPORTING SCHOOLS TO GO DIGITAL: FROM A CONCEPTUAL
MODEL TOWARDS THE DESIGN OF A SELF-ASSESSMENT TOOL
FOR DIGITAL-AGE LEARNING
Panagiotis Kampylis1, Jim Devine2, Yves Punie1, Tabetha Newman3
1 European Commission, Joint Research Centre, Directorate B - Growth & Innovation
(SPAIN)
2 Devine: Policy, Projects, Innovation (IRELAND)
3 Timmus Limited (UNITED KINGDOM)
Abstract
This paper summarises the first phase of the development of an evidence-based self-assessment tool
(SAT) for the integration and effective use of digital technologies in schools across Europe, based on
the European Framework for Digitally-Competent Educational Organisations (DigCompOrg,
http://europa.eu/!dV98uF). In particular, the paper presents an analysis of existing school self-
assessment tools and how they correspond to the DigCompOrg conceptual model. A literature review
identified nine tools for further analysis, representing different approaches to the self-assessment of
the use of digital technologies in schools. The findings confirm that there has been no attempt until
now to develop a pan-European tool that is evidence-based and can add value by promoting
transparency, comparability and peer-learning across Europe both for schools and educational
policymakers. The findings reveal a number of general considerations that include a focus on
leadership and governance practices; emphasis on digital infrastructure and resources;
acknowledgment of teachers’ role and the need for capacity building; the need for integration of digital
technologies across the curriculum; and the need for cross-fertilisation and peer-learning for the
development and implementation of self-assessment tools. These considerations will guide the next
phases of the study, i.e. the design of a prototype SAT for digitally-competent schools. Such a tool can
empower policymaking within each country, and enable school communities (i.e., school leaders,
teachers and students) to periodically self-reflect on their school’s current state of development and on
future steps in realising effective digital-age learning.
Keywords: digital-age learning; digitally-competent schools; digital technologies in education; European Framework for Digitally-Competent Educational Organisations (DigCompOrg).
1 INTRODUCTION
Educators and policy makers acknowledge that learning in and for the digital age represents a
formidable challenge for policy makers, schools, teachers, students and society in general [1-4]. There
is ample evidence of the use of digital technologies in exciting and promising ways at all levels of
education. However, to consolidate progress, educational institutions need to regularly review their teaching, learning and organisational practices in order to become progressively digitally-
competent at an organisational level [e.g. 5]. To do so, educational institutions need (a) a clear
understanding of what it means to be digitally competent as an organisation, and (b) tools to assist
them in self-assessing their current state of development and in formulating plans for improvement.
The European Framework for Digitally-Competent Educational Organisations (DigCompOrg) provides
a comprehensive and generic conceptual model for the effective integration of digital technologies by
educational institutions [2]. The DigCompOrg model is the result of a research study designed and
conducted by the Joint Research Centre of the European Commission for the Directorate General for
Education and Culture (DG EAC). The DigCompOrg framework identifies seven core elements and 15
sub-elements that characterise all educational organisations. There is also scope for the addition of
sector-specific elements and sub-elements (e.g., elements and sub-elements particular to higher
education). DigCompOrg further expands the sub-elements into 74 descriptors that add detail to the
elements and sub-elements of the model. Diagrammatically, the elements, sub-elements and
descriptors of DigCompOrg are presented as sectors of a circle, with an emphasis on their
interrelatedness and interdependence (see Fig. 1).
Figure 1. The seven elements and 15 sub-elements of the DigCompOrg Conceptual Model [2].
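As an illustration of how this layered structure might be carried over into the data model of a self-assessment tool, the following minimal Python sketch represents elements, sub-elements and descriptors as nested records. The seven element names are those of the framework; the sub-element and descriptor shown are placeholders only, since the full set of 15 sub-elements and 74 descriptors is defined in the DigCompOrg report [2].

```python
# Minimal sketch (not part of DigCompOrg itself): representing the framework's
# three levels (elements -> sub-elements -> descriptors) in a SAT data model.
# The seven element names come from the framework; the sub-element and the
# descriptor below are placeholders, not actual DigCompOrg content.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Descriptor:
    text: str  # a single statement that can be turned into a questionnaire item


@dataclass
class SubElement:
    name: str
    descriptors: List[Descriptor] = field(default_factory=list)


@dataclass
class Element:
    name: str
    sub_elements: List[SubElement] = field(default_factory=list)


DIGCOMPORG_ELEMENTS = [
    Element("Leadership & governance practices"),
    Element("Teaching & learning practices"),
    Element("Professional development"),
    Element("Assessment practices"),
    Element("Content & curricula"),
    Element("Collaboration & networking"),
    Element("Infrastructure"),
]

# Example of attaching a placeholder sub-element and descriptor to one element.
DIGCOMPORG_ELEMENTS[0].sub_elements.append(
    SubElement("Placeholder sub-element",
               [Descriptor("Placeholder descriptor phrased as a self-assessment statement")])
)

print([element.name for element in DIGCOMPORG_ELEMENTS])
```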
The DigCompOrg model focuses on the teaching, learning, assessment and related learning support
activities undertaken by a given educational organisation. It does not extend to core administrative
tasks that require ICT competences, as for these, schools can draw on the experience of any type of organisation. The ultimate aim of DigCompOrg, however, is to improve student learning through the use
of digital technologies.
The DigCompOrg framework is designed to help both educational institutions and policymakers in
planning the effective use of digital learning technologies at all levels of the educational journey:
primary and secondary schools, Vocational Education and Training (VET) centres as well as higher
education institutions such as universities, technical universities and polytechnics. Several Member
States have translated DigCompOrg into their own languages (for instance Spanish, http://bit.ly/27R4ZWx; Estonian, http://bit.ly/2cAM3WV; and Lithuanian, http://bit.ly/2d8jocI) and have been using it to promote the integration of digital technologies in their education systems. As its uptake expands, DigCompOrg can facilitate a common understanding across the Member States as well as transparency and comparability between related initiatives throughout Europe. However, DigCompOrg remains a conceptual model, as it has not yet been piloted or implemented in real organisational settings.
It is recognised that in order to be useful in practice in particular settings, the DigCompOrg framework
requires adaptation and customisation, and work is currently being undertaken to validate a version of
the framework specifically applicable to schools (primary and secondary) and VET (initial). This paper
traces the first phase of the follow-up study ‘DigCompOrg School Pilots’, currently being conducted by
the JRC and a consortium of experts from the UK, Ireland, Denmark, Spain, Italy and Estonia for DG EAC. The study aims to (a) validate a customised version of the framework and, based on this validation exercise, (b) develop an evidence-based organisational-level self-assessment tool (SAT) for the
innovative and effective use of digital technologies in schools across Europe. Such a SAT can enable
school communities (via the collection of data from school leaders, teachers and students) to
periodically self-reflect on their current state of development and on their plans for future steps in
realising effective digital-age learning.
2 METHODOLOGY
We used a mixed method approach for the first phase of the DigCompOrg School Pilots project, which
involved desk research; analysis of existing tools that promote the integration of digital technologies in schools at national/international level; and expert consultation (see Fig. 2). The next phases of this
project will focus on the following:
• A user consultation survey to validate the adaptation for a ‘schools’ context of the generic
DigCompOrg framework;
• A series of expert and user consultations to underpin the design of a prototype SAT based on
the descriptors of the DigCompOrg conceptual model (as adapted for ‘schools’) and the analysis of
existing tools;
• Pilot implementation of the prototype SAT in a number of schools across Europe;
• Additional (mainly qualitative) research in selected schools as well as expert and stakeholder
consultation based on the analysis of the results from the pilot implementation of the SAT;
• Release of a consolidated version of the DigCompOrg SAT.
Figure 2. The four phases of DigCompOrg School Pilots study.
In this paper, we explain the process for selecting a number of existing tools that are of particular
interest and relevance to the DigCompOrg SAT design initiative. These are tools that schools across
Europe use to support self-assessment of their uptake of digital technologies, or maturity in the use of
these technologies. We also present the findings from the analysis of these tools and the key lessons
learnt. The analysis and lessons learnt will influence the next phases of the DigCompOrg School Pilots
study, in particular the development and piloting of the DigCompOrg SAT in at least four countries and
in five languages (English, Spanish, Danish, Estonian, Italian).
2.1 Desk research
The desk research presented here aims to synthesise existing knowledge about tools that support
schools to self-reflect on the innovative and effective use of digital technologies for learning. The
starting point for locating and analysing such self-assessment tools was the inventory of 15 conceptual
frameworks and self-assessment tools included in the DigCompOrg report [2]. From the 15 entries of
this inventory, nine were considered very relevant to the development of the SAT envisaged in the
context of DigCompOrg School Pilots. These are marked with an asterisk (see Table 1).
A further step in locating relevant tools involved consultation with the members of the Education and
Training 2020 Working Group on Digital Skills and Competences [6], an expert group comprising
representatives from the education ministries of Member States, relevant EU bodies or agencies,
education and training associations and European social partners as well as independent experts. This
consultation identified an additional four tools that were not included in the original DigCompOrg inventory.
Finally, an online search for more tools was conducted, covering a wide range of materials such as
technical, evaluation and policy reports; websites, wikis and blogs; journal and conference papers;
promotional literature (e.g., leaflets); and slideshow presentations.
Through this three-step approach, thirteen self-assessment tools developed and/or used at regional,
national or international level in Europe were identified. These tools are summarised in Table 1 below.
The basic criteria for the initial selection of the tools were the following:
• Verification that each tool is (or can be) used by European primary, secondary or VET schools
for the self-assessment of their practices in integrating and effectively using digital technologies
for teaching and learning;
• Verification that the development and/or implementation of each tool is recent or ongoing.
Table 1. Self-assessment tools developed and/or used in Europe.

Title / short title | Website | Country | Launched | Funding | Charge
Assessing the e-Maturity of your School / Ae-MoYS* | http://e-mature.ea.gr | International | 2011 | Public | No
Autoevaluación del grado de implantación de las TIC / AGITIC | http://bit.ly/2d8Eiso | Castilla-León (Spain) | 2014 | Public | No
DigiPeegel | http://bit.ly/2drTcdS | Estonia | 2015 | Public | No
Digital Schools of Distinction / DSoD | http://bit.ly/2dbls0U | Ireland | 2013 | Public/private | No
eLEMER* | http://bit.ly/2cN1cav | Hungary | 2010 | Public | No
e-Škole - Uspostava sustava razvoja digitalno zrelih škola / e-Škole | http://bit.ly/2d3xqdV | Croatia | 2015 | Public | No
Future Classroom Maturity Model / FCMM* | http://bit.ly/2dj91kc | International | 2010 | Public | No
Ledning, Infrastruktur, Kompetens, Användning / LIKA* | http://lika.skl.se | Sweden | | Public | No
Microsoft Innovative Schools Toolkit & Self-Reflection Tool / Microsoft SRT* | http://bit.ly/2dj8X3L | International | 2009 | Private | No
Opeka* | http://bit.ly/2d3mgpT | Finland | 2012 | Public | No
Planning and implementing e-learning in your school / e-Learning Roadmap* | http://bit.ly/2dlyPib | Ireland | ???? | Public | No
School mentor* | http://bit.ly/2dja6bp | Norway | 2008 | Public | No
Self-review Framework / NAACE SRF* | www.naacesrf.com | United Kingdom | 2005 | Public | Yes

Note: In some cases acronyms have been adopted by the authors of this paper as abbreviations when discussing the tools analysed; these are not necessarily acronyms created by the authors of the tools.
As can be seen in Table 1, the vast majority of the tools (9 out of 13) have a national character, one is
intended for regional use, two have been developed in the context of European projects (i.e. Ae-MoYS
and FCMM) and one has international character (i.e. Microsoft SRT). It is worth noting that a majority
of the tools rely on public funding (regional, national or European) for their development and/or
implementation and only one has been developed by a private company. One tool (DSoD) has been developed through a public/private partnership. All of the tools are available free of charge for use by schools, with the exception of NAACE SRF, which requires an annual subscription of
£50 (plus VAT).
In order to select the tools that will be analysed further, we applied the following criteria:
• Verification that reliable data on the development and/or implementation of the tool are available in English (as a result, four potentially relevant tools available only in other languages, namely LIKA in Swedish, e-Škole in Croatian, DigiPeegel in Estonian and AGITIC in Spanish, are not included in the analysis presented below);
• Verification that each case reflects an approach that can provide insights for the development of
DigCompOrg SAT;
• Verification that the selected tools reflect the broadest possible variety in terms of:
o Implementation phase (pilot, scale, mainstream);
o Type of the tool (e.g., questionnaire, matrix, online, in print);
o Geographical coverage (regional, national, international);
o Users involved in providing the information (school leaders, teachers, students);
o Scope of usage (ranging from its use solely by individuals, to its use at the level of the
organisation or beyond, e.g., aggregated data used at the education system level).
Based on the criteria presented above, nine tools were finally selected for further analysis,
representing different approaches to the self-assessment of the use of digital technologies in schools
(see Table 2). Given the nature of this exercise, the analysis is limited to providing a narrative
overview of the tools analysed and does not in its own right present an empirical synthesis of their
effectiveness and impact.
3 ANALYSIS AND FINDINGS
The aim of the analysis presented here is to (a) identify commonalities, points of divergence and gaps
across the selected tools, (b) gain insights into their design, focus, methodology and implementation strategies, and (c) synthesise best practices and lessons learnt.
Table 2 below provides a brief summary of the focus of the tools analysed, including whether they are
used for benchmarking as well as their key characteristics and policy relevance, if any.
Table 2. Focus and key characteristics of the nine tools analysed.

Tool | Focus | Benchmarking | Key characteristics and policy relevance
Ae-MoYS | Strengths and weaknesses in the use of ICT for teaching and learning | No | Results are used to create the school's action plan. An indicative percentage of ICT integration is provided for many items.
DSoD | Aims to promote, recognise and encourage excellence in the use of technology in primary schools | No | Schools that successfully complete the 3-step programme (step 2 is self-assessment) receive a nationally recognised Digital School of Distinction Award, accredited by the Dept. of Education.
eLEMER | ICT integration from a whole-school perspective | Comparison with the national average | Users are asked to provide possible evidence (e.g. lesson plans, school regulations, e-portfolios etc.) to support their responses. Provides a country snapshot. 66% of teachers and 50% of students are to be surveyed. Aggregated data are used for policy making.
FCMM | Enables teachers and schools to assess the level of innovation with technology | Comparability with national & international average | Part of the Future Classroom Toolkit. Diagnostic report to plan for the next level of maturity. Under Creative Commons licence.
Microsoft SRT | Change management tool for ICT integration | No | Focus on creating a vision for the use of ICT. Support to manage the change process.
Opeka | Evaluation of teachers' and schools' digital competences and culture | Comparison with other teachers from the same school or the same town, teachers who teach the same subject, or all teachers | Qualitative research is conducted to validate tool results. The questionnaire also includes questions about the quality of the tool itself. Information from the tool is used to modify Finnish ICT policy in education.
e-Learning Roadmap | Where schools are currently positioned in e-Learning and where they would like to go | No | Printed planning tool, part of a handbook for planning and implementing e-learning. Whole-school planning and self-evaluation are enabled.
School mentor | Reflect on facilitation and execution of pedagogical use of ICT | No; only the school has access to the results and can decide whether or not to give access to the school's managerial agency | Intended for school heads, but for use in collaboration with other staff.
NAACE SRF | Structured route for reviewing and improving schools' use of technology | No | Originally developed by Becta. The school reaches a certain level (with supporting evidence) and applies for a national quality accreditation, the ICT Mark (http://bit.ly/2d3o0iU).
Table 2 demonstrates that the selected tools follow diverse approaches for the self-assessment of the
use of digital technologies by schools. In terms of focus, the tools show a quite convergent approach:
most of them aim to guide schools to self-assess and self-reflect on their current state of development
and to support them in following a structured programme for change and improvement. Several tools
are used to create the school's vision (e.g., Microsoft SRT) and action plan (e.g., Ae-MoYS) for a more effective uptake of digital technologies. The eLEMER tool from Hungary follows an interesting approach, asking users to identify available evidence (such as lesson plans, school regulations, e-portfolios etc.) that supports the assertions made in their self-assessments.
Regarding the ownership of the data generated by the tools and its use for benchmarking, the
approaches are divergent. Some of the tools offer the opportunity for comparisons at local, national or
even international level. On the other hand, several tools intentionally do not offer this functionality and
the results are available only to the school itself. In the case of School Mentor, the school can decide
whether or not to give the school's managerial agency, which is the local municipality, access to the data.
Some of the tools (e.g., Opeka, eLEMER) use aggregated and anonymised data to inform policy makers at local or national level and to influence the related policies for the integration and
effective use of digital technologies by schools.
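As an illustration only (not a description of how Opeka or eLEMER are implemented), the sketch below shows one way anonymised school-level scores could be aggregated into the kind of national benchmark against which a school compares its own profile; the schools, scores and 1-5 scale are hypothetical.

```python
# Illustrative sketch: anonymised aggregation of self-assessment scores into a
# national benchmark, in the spirit of the comparisons offered by tools such as
# eLEMER and Opeka. The schools, scores and 1-5 scale are hypothetical.
from statistics import mean

# school -> DigCompOrg element -> individual responses (1-5, hypothetical)
responses = {
    "school_A": {"Infrastructure": [4, 5, 3], "Professional development": [2, 3, 3]},
    "school_B": {"Infrastructure": [2, 2, 3], "Professional development": [4, 4, 5]},
}


def school_averages(data: dict) -> dict:
    """Average the individual responses per element within each school."""
    return {school: {element: mean(scores) for element, scores in elements.items()}
            for school, elements in data.items()}


def national_average(per_school: dict) -> dict:
    """Aggregate school-level averages into an anonymised benchmark per element."""
    elements = {element for profile in per_school.values() for element in profile}
    return {element: mean(profile[element]
                          for profile in per_school.values() if element in profile)
            for element in elements}


per_school = school_averages(responses)
benchmark = national_average(per_school)

# Each school sees only its own profile next to the anonymised benchmark.
for element, own_score in per_school["school_A"].items():
    print(f"{element}: school {own_score:.2f} vs national average {benchmark[element]:.2f}")
```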
Table 3 below presents an overview of the tools analysed in terms of their type, extensiveness
(number of items included) and the scales or maturity models they use.
Table 3. Type and length of the tools analysed.

Tool | Type | Who provides the data | Total # of items | Scale/maturity model
Ae-MoYS | Online questionnaire | A teacher-coordinator | 30 | 4-point scales (from strongly disagree to strongly agree, or from 0-25% to 76-100%)
DSoD | Online questionnaire | School head | 45 | Information not available
eLEMER | Online questionnaire | Teachers and students | 100 | N/A, Not true, Partly true, Nearly true, Completely solved
FCMM | Online matrix | School head | 5x5 | 5-level maturity model: Exchange, Enrich, Enhance, Extend, Empower
Microsoft SRT | Online matrix | School head | 16x6 | 5-level maturity model (plus N/A): Aware, Developing, Defined, Integrated, Ubiquitous
Opeka | Online questionnaire | Teachers | 145 | Many options: from strongly disagree to strongly agree; from never to daily; also multiple-choice items
e-Learning Roadmap | Printed matrix | School head | 27x4 | Maturity model: Initial, e-Enabled, e-Confident, e-Mature
School mentor | Online matrix | School head | 30x5 | 5 levels (1-5)
NAACE SRF | Online matrix | Multiple options | 54x4 | 4-level maturity matrix (1-4)
As can be seen in Table 3, the tools can be divided into two groups. The first uses the format of a questionnaire, providing schools with a number of statements/questions for self-assessing their use of digital technologies for learning. The second group uses matrices with a 4- or 5-level maturity model and a number of descriptors articulated against this model. Opeka from Finland also includes statements related to the quality of the tool itself and some 10 questions to elicit background information. Several other tools also ask users for background information, with the exception of Microsoft SRT, which does not.
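By way of illustration only (this is not the scoring logic of any of the tools above), a matrix-style tool of this kind can be thought of as a set of dimensions, each described at several maturity levels, with the respondent selecting the level that best matches the school. The sketch below uses the e-Learning Roadmap level labels reported in Table 3 and its key elements from Table 4 as dimensions; the example selections are hypothetical.

```python
# Illustrative sketch of a matrix-style maturity model: each dimension is
# described at several maturity levels and the respondent selects the best
# match. Level labels follow those reported for the e-Learning Roadmap in
# Table 3; the example selections are hypothetical.
LEVELS = ["Initial", "e-Enabled", "e-Confident", "e-Mature"]

# dimension -> selected maturity level (hypothetical example answers)
selections = {
    "Leadership & planning": "e-Enabled",
    "ICT in the curriculum": "Initial",
    "Professional Development": "e-Confident",
    "ICT infrastructure": "e-Enabled",
}


def level_index(label: str) -> int:
    """Map a maturity label to a numeric score (1-based)."""
    return LEVELS.index(label) + 1


def maturity_profile(answers: dict) -> dict:
    """Return a numeric score per dimension plus a simple overall average."""
    scores = {dimension: level_index(level) for dimension, level in answers.items()}
    scores["overall"] = sum(scores.values()) / len(scores)
    return scores


print(maturity_profile(selections))
```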
In most of the tools analysed, school heads provide the data. In some cases, all teachers in a given
school are involved. Only the eLEMER tool includes students. NAACE SRF offers a variety of options
for providing the data: one person (e.g., school head); the whole senior management/leadership team;
staff working in teams and providing data for each area/key element; all staff working together to
review all statements.
The key elements of the selected tools were mapped against the DigCompOrg conceptual model (see
Table 4) in order to identify commonalities in the way the different tools cluster the questionnaire
statements or the matrix items they use. It should be noted that the nine tools analysed are intended to
represent the diversity of tools in this field and not to represent those which map most
comprehensively to the framework. The key observations from Table 4 are synthesised and discussed
in the next section.
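To make the mapping exercise concrete, the sketch below shows one way such a correspondence can be recorded so that framework elements with no counterpart in a tool surface automatically as gaps. Only one row of Table 4 (NAACE SRF) is reproduced, and the helper function is ours, not part of any of the tools.

```python
# Sketch of the mapping exercise behind Table 4: each tool's key elements are
# recorded against the seven DigCompOrg elements, and elements with no
# counterpart are reported as gaps. Only the NAACE SRF row is reproduced here.
DIGCOMPORG = [
    "Leadership & governance practices", "Teaching & learning practices",
    "Professional development", "Assessment practices", "Content & curricula",
    "Collaboration & networking", "Infrastructure",
]

# tool -> {DigCompOrg element: corresponding key element of the tool}
mapping = {
    "NAACE SRF": {
        "Leadership & governance practices": "Leadership & management",
        "Teaching & learning practices": "Teaching and learning",
        "Professional development": "Professional development",
        "Assessment practices": "Assessment of digital capability",
        "Content & curricula": "Use of ICT in the curriculum",
        "Infrastructure": "Resources",
    },
}


def gaps(tool: str) -> list:
    """Return the DigCompOrg elements not covered by the given tool."""
    covered = mapping.get(tool, {})
    return [element for element in DIGCOMPORG if element not in covered]


print(gaps("NAACE SRF"))  # -> ['Collaboration & networking']
```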
4 DISCUSSION
The desk research reveals the use of a variety of tools and approaches for the self-assessment of the
use of digital learning technologies by schools in several European countries. The tools analysed map
very well to the DigCompOrg conceptual model, which is comprehensive and holistic in nature. The
findings from the analysis of the key characteristics of existing tools have led to a number of general
observations, presented briefly below, which will inform the design process for the DigCompOrg SAT.
4.1 Focus on leadership and governance practices
All the tools include key elements and related questionnaire statements/matrix items relating to leadership and governance practices as well as to digital learning culture and vision. This is not unexpected, as the tools aim to capture the use of digital technologies at school level and most of them target school heads or leadership teams as their primary users. In any case, the role of school leaders
is crucial for the innovative and effective use of digital technologies, as they have to provide the
enabling conditions, such as vision and strategic planning.
4.2 Emphasis on infrastructure and resources
The tools vary considerably in their approaches to what they regard as the important aspects of the
innovative and effective use of digital technologies by schools. However, what they all have in
common (see Table 4) is an emphasis on the infrastructure and resources needed at school level. This
is perhaps not surprising, since many schools still cope with inadequate technological infrastructure, e.g. a broadband connection that is too limited or unreliable to leverage the benefits of digital technologies. It is also the case that, for self-assessment purposes, it can be easier to capture quantitative measures related to infrastructure than other aspects relating to the specific use of digital technologies for learning.
Table 4. Mapping the key elements of the tools against the DigCompOrg conceptual model (blank cells indicate that no key element of the tool was mapped to that DigCompOrg element).

Tool | Leadership & governance practices | Teaching & learning practices | Professional development | Assessment practices | Content & curricula | Collaboration & networking | Infrastructure
FCMM | Organisational eMaturity; Management of Teaching, Learning & Assessment | Educational Processes (Pedagogy; Learner Role) | Capacity building | Management of Teaching, Learning & Assessment | Educational Resources (Underpinning Technology); Educational Outcomes (Learning Objectives) | Teacher-student collaboration | Tools and resources
Microsoft SRT | Leadership & a Culture of Innovation | Teaching, Learning & Assessment | Capacity Building | Teaching, Learning & Assessment | Leadership & a Culture of Innovation | Learning Environment | Learning Environment
eLEMER | Management | Learners & learning; Teachers & teaching | Teachers & teaching | | | Management | Infrastructure
Opeka | Digital learning culture | ICT-skills | ICT-skills | | | Digital learning culture | Devices and software
School mentor | Organisation; Administration & framework conditions; Mapping & planning | Pedagogical practice | Digital competence | | | In Organisation: Communication & External Communication | School resources
Ae-MoYS | Leadership & Vision; School ICT culture | School ICT culture | Professional Development | | ICT in the Curriculum | School ICT culture | Resources & Infrastructure
e-Learning Roadmap | Leadership & planning; e-Learning Culture | e-Learning Culture | Professional Development | | ICT in the curriculum | e-Learning Culture | ICT infrastructure
DSoD | Leadership and Vision; School ICT culture | School ICT culture | Continuing Professional Development | | ICT integration in the curriculum | School ICT culture | Resources & infrastructure
NAACE SRF | Leadership & management | Teaching and learning | Professional development | Assessment of digital capability | Use of ICT in the curriculum | | Resources
4.3 Acknowledgment of teachers' role and the need for capacity building
Most of the tools put emphasis on the role of teachers and the need for their continuing professional development, so that teachers become confident and competent users of digital technologies for learning. Some
tools, such as School Mentor, place specific emphasis on the digital competence of students and
teachers that is required to underpin the use of digital technologies in an innovative and effective way
across the curriculum.
4.4 Communication, collaboration and assessment
Digital technologies have a great potential to enable innovative ways of communication, collaboration
and assessment. Although not all the tools place explicit emphasis on these aspects, a majority have
related questionnaire statements or matrix items that try to capture these very important aspects of the
learning process.
4.5 Use of digital technologies across the curriculum
The integration of digital technologies across the curriculum is an explicit focus of five of the tools
analysed. This integration refers to the use of digital technologies in all school subjects and for a
variety of learning activities. Appropriate digital resources and assistive technologies are needed for
such an innovative and effective use across school subjects.
4.6 Cross-fertilisation and peer-learning for the development of the tools
The e-Learning Roadmap, Ae-MoYS and DSoD follow more or less the same conceptual model. The conceptual basis for all of them is the e-Learning Roadmap, which is available only in printed form. The e-Learning Roadmap was adopted by DSoD for the development of its online self-assessment tool. Ae-MoYS, which was developed within the context of the EU-funded iTEC project, has in turn adopted and adapted the online self-assessment tool of DSoD. These three tools thus offer a very interesting example of how self-assessment tools of this type can evolve over time and across countries, building on previous experiences and lessons learnt.
4.7 Who to involve in the school self-assessment process
Most of the tools presented here collect data about the integration and innovative use of digital
technologies from school heads. A few of them also involve teachers, and only the eLEMER tool also involves students. However, in order to have a complete picture of the situation in a given school, it is important to cross-compare the information by collecting data not only from school heads but also from teachers and students, especially for the questionnaire statements related to teaching, learning, collaboration and assessment.
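As a simple illustration of the kind of cross-comparison meant here (hypothetical data, not the output of any existing tool), responses to the same statement can be averaged per respondent group and large divergences flagged for discussion within the school:

```python
# Illustrative sketch of cross-comparing responses from different respondent
# groups (school heads, teachers, students) on the same statement. The data,
# scale and divergence threshold are hypothetical.
from statistics import mean

# statement -> group -> individual responses on a 1-5 agreement scale
responses = {
    "Digital technologies are used for collaborative learning": {
        "school heads": [5],
        "teachers": [3, 4, 2, 3],
        "students": [2, 3, 2, 2, 3],
    },
}

THRESHOLD = 1.0  # flag statements where group means differ by more than this

for statement, groups in responses.items():
    means = {group: mean(scores) for group, scores in groups.items()}
    spread = max(means.values()) - min(means.values())
    verdict = "divergent views" if spread > THRESHOLD else "broad agreement"
    print(statement)
    for group, value in means.items():
        print(f"  {group}: {value:.2f}")
    print(f"  -> {verdict} (spread {spread:.2f})")
```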
The considerations presented above, based on the analysis of the nine self-assessment tools, will
guide Phase 2 of the DigCompOrg School Pilots study, i.e., the design and pilot implementation of a
prototype SAT for digitally-competent schools.
5 CONCLUDING THOUGHTS AND FUTURE RESEARCH
The digital/pedagogical capacity of schools can be enhanced using self-assessment tools such as the ones presented above, which enable schools to reflect on their own state of development and competence in the use of digital technologies for learning and to plan further improvements. The
analysis presented in this paper shows that several tools are currently used by schools in Europe.
However, fragmentation and uneven development across the Member States are evident and a
common conceptual approach capable of supporting the development of schools’ digital capacity at a
European level has yet to be formulated. Therefore, a European tool that builds on previous
experiences, expert and stakeholder consultation and on a robust conceptual model can add value
(both to policy makers and individual schools) by allowing for transparency, comparability and peer-
learning. The DigCompOrg SAT aims to be such a reference tool, one that can influence and
complement rather than supersede other tools already in use. Schools, intermediaries (e.g.,
management bodies) or project developers can flexibly use the DigCompOrg SAT as a reference tool
and adapt it to their needs and specific contexts.
The analysis of existing tools presented here is a first and important step in the process of developing
the DigCompOrg SAT, a European tool for the innovative and effective use of digital technologies in
schools. The next step is an intensive consultation process about the development of the tool, based
on the DigCompOrg conceptual model, involving:
• More than 15 experts from the UK, Ireland, Denmark, Spain, Italy and Estonia;
• School leaders, teachers and students from more than 70 schools from Denmark, Spain, Italy
and Estonia;
• Educational stakeholders and policy makers at local, regional, national and European level.
Both the analysis of existing tools and the wide consultation process aim to provide input and insights
leading to the development of a prototype SAT, to be piloted in schools from four EU education
systems (Spain, Italy, Denmark and Estonia) during 2017. The pilot implementation of the tool will involve not only school leaders and teachers but also students, so as to have the most holistic view of the
innovative and effective use of digital technologies for learning in the participating schools. The
quantitative analysis of the data from the pilot implementation of the DigCompOrg SAT will be
complemented by qualitative research (e.g., focus groups and case studies) and outcomes will be
discussed with education experts, stakeholders and policy makers in order to develop and make
available the consolidated version of DigCompOrg SAT.
The primary aim of the consolidated DigCompOrg SAT, which is expected to be released by the end of
2017, is twofold. On the one hand it aims to encourage self-reflection and self-assessment within
individual schools as they progressively deepen their engagement with digital learning and pedagogy.
On the other hand, it aims to support policy makers in designing, implementing and evaluating policy
interventions for the integration and effective use of digital learning technologies. Overall, the
DigCompOrg SAT aims to stimulate evidence-based dialogue and sharing of experiences, thus
contributing to European and Member State policy priorities to modernise schools in Europe and to
promote effective digital-age learning.
DISCLAIMER
The views expressed in this article are purely those of the authors and should not be regarded as the
official position of the European Commission.
REFERENCES
[1] OECD, "Students, Computers and Learning: Making the Connection," Paris: OECD Publishing, 2015.
[2] P. Kampylis, Y. Punie, and J. Devine, "Promoting Effective Digital-Age Learning: A European Framework for Digitally-Competent Educational Organisations," EUR 27599 EN, 2015. doi:10.2791/54070. [Online]. Available: https://ec.europa.eu/jrc/en/digcomporg
[3] European Commission, "Draft 2015 Joint Report of the Council and the Commission on the implementation of the Strategic framework for European cooperation in education and training (ET2020) - New priorities for European cooperation in education and training," SWD(2015) 161 final. [Online]. Available: https://ec.europa.eu/transparency/regdoc/rep/1/2015/EN/1-2015-408-EN-F1-1.PDF
[4] European Network of Education Councils (EUNEC), "Learning in the Digital Age: Report of the seminar of the European Network of Education Councils, Athens, 5-6 May 2014, with the support of the European Commission DG Education and Culture," Brussels: EUNEC Secretariat, 2014.
[5] European Commission, "Opening up Education: Innovative teaching and learning for all through new Technologies and Open Educational Resources," COM(2013) 654 final. [Online]. Available: http://ec.europa.eu/education/news/doc/openingcom_en.pdf
[6] European Commission, "ET 2020 Working Groups." [Online]. Available: http://europa.eu/!Xg99VX