
DigComp at Work Implementation Guide

Authors: Clara Centeno

Abstract

This Implementation Guide accompanies the “DigComp at Work” report published separately. It aims at supporting labour market intermediaries in their digital skilling actions in employability or employment contexts. It offers specific guidelines, examples, tips and useful resources for the use of DigComp for defining specific job’s digital competence needs, for assessing digital competences and for cataloguing, developing and delivering training on digital competences.
DigComp at Work Implementation Guide
EUR 30204 EN
This publication is a Technical report by the Joint Research Centre (JRC), the European Commission's science and knowledge service. It aims to provide evidence-based scientific support to the European policymaking process. The scientific output expressed does not imply a policy position of the European Commission. Neither the European Commission nor any person acting on behalf of the Commission is responsible for the use that might be made of this publication. For information on the methodology and quality underlying the data used in this publication for which the source is neither Eurostat nor other Commission services, users should contact the referenced source. The designations employed and the presentation of material on the maps do not imply the expression of any opinion whatsoever on the part of the European Union concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries.
Contact information
Name: Clara Centeno
Email: clara.centeno@ec.europa.eu
EU Science Hub
https://ec.europa.eu/jrc
JRC120645
EUR 30204 EN
PDF: ISBN 978-92-76-18581-9, ISSN 1831-9424, doi:10.2760/936769
Luxembourg: Publications Office of the European Union, 2020
© European Union, 2020
The reuse policy of the European Commission is implemented by the Commission Decision 2011/833/EU of 12 December 2011 on the reuse of Commission documents (OJ L 330, 14.12.2011, p. 39). Except otherwise noted, the reuse of this document is authorised under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence (https://creativecommons.org/licenses/by/4.0/). This means that reuse is allowed provided appropriate credit is given and any changes are indicated. For any use or reproduction of photos or other material that is not owned by the EU, permission must be sought directly from the copyright holders.
All contents © European Union 2020, except: p. 16, images courtesy of the contributors.
How to cite this report: Centeno, C., DigComp at Work Implementation Guide, O'Keeffe, W. (editor), EUR 30204 EN, Publications Office of the European Union, Luxembourg, 2020, ISBN 978-92-76-18581-9, doi:10.2760/936769, JRC120645.
Design and layout: Valentina Barsotti / Takk studio
Clara Centeno
DigComp at Work Implementation Guide
Editor: William O’Keeffe
TABLE OF CONTENTS
5 Welcome message
6 1. INTRODUCTION
6 Background
6 Purpose
6 Who is this Guide addressed to?
6 Use of DigComp for which skilling functions?
9 About DigComp
10 How to use this Guide
11 2. GENERAL GUIDELINES FOR DIGCOMP-BASED IMPLEMENTATION
12 G1. Getting started with your DigComp-based project
13 G2. During your DigComp-based project
13 G3. After your DigComp-based project finishes
14 G4. Additional strategic considerations
16 3. SPECIFIC GUIDELINES FOR DIGCOMP IMPLEMENTATION
17 S1. Defining competence needs
20 S2. Assessing competences
28 S3. Training
34 Acknowledgements
WELCOME MESSAGE
These two reports, DigComp at Work: The EU's digital competence framework in action on the labour market, and its Implementation Guide with practical guidance for labour market intermediaries on the use of DigComp, are a new chapter in the success story of DigComp.
DigComp was first published in 2013 and has since been used in national and international policy-making and in the design and delivery of digital skills development across the EU.
The story of DigComp is a story of commitment by stakeholders. DigComp stakeholders have translated, adapted, interpreted and applied DigComp in a variety of inspiring ways. DigComp stakeholders have also become ambassadors for co-operation on digital skills in Europe through working together in innovative projects and communities of practice.
These reports highlight the important use of DigComp
by labour market stakeholders. It is a given now that
digital skills are essential for life and work and are the
foundation for employability and accessing information
and support throughout our careers.
Support for managing the digital transition is at the heart of the European Skills Agenda adopted by the European Commission on 1 July 2020. DigComp has played, and will continue to play, a role in supporting the work of countries, companies and social partners on the development of digital competences. The case studies showcase practical examples of the development of digital competences, and the Implementation Guide offers specific guidelines, examples and useful resources for the use of DigComp.
Our hope is that they serve as a call to action for greater uptake of DigComp and for delivering on the goals of the European Skills Agenda.
I wish to give special thanks to those organisations: the Associazione Emiliano Romagnola di Centri Autonomi di Formazione Professionale and Ervet (Italy), Anpal Servizi (Italy), the Ikanos project, Basque Government and Ibermática (Spain), ECCC Foundation (Poland), Expertise France (France) and Lai-momo (Italy), Hellenic Open University – DAISSy research group (Greece), Smartive (Italy), Tecnalia (Spain), Adecco, Mylia and Advancing Humanity srl (Italy), that gave such rich and inspiring information on their case studies. Thanks also to the team in the Commission's Joint Research Centre for their work in creating this report as well as their ongoing work on the implementation and development of DigComp.
Alison Crabb
Head of Unit, Skills and Qualifications
DG Employment, Social Affairs and Inclusion
European Commission
1. INTRODUCTION
Background
This guide has been developed to support actors that provide digital skilling functions in employability or employment contexts1 in responding to the digital transformation in the labour market.
Labour market intermediaries (LMIs) provide a wide set of skilling-related services which aim to develop the digital competences of employees or people looking for a job, including: collection and dissemination of labour market information; outreach and individual support; career guidance; support for adult learning; fostering transparency of skills and qualifications; job experience and placement; job search assistance and matching; and monitoring and after-care.
In providing these functions, LMIs are supported by DigComp, the European Digital Competence Framework, thanks to its distinctive features. Its consensus-based origin and EU endorsement provide it with credibility and reliability. DigComp brings a new view of what digital competence is, and its clear and solid structure, completeness, flexibility and neutrality have been key factors enabling successful implementation.
This Guide offers specific guidelines and tips on the use of DigComp for the implementation of digital skilling services.
The Guide has been developed based on the knowledge gathered through the case
analysis described in the DigComp at Work report, and complemented with previous
knowledge compiled and described in the Guidelines on the adoption of DigComp
(Kluzer, 2015) and the DigComp into Action guide (Kluzer S. and Pujol Priego L., 2018).
1 See "Developing digital competence for employability: Engaging and supporting stakeholders with the use of DigComp: stakeholders' consultation workshop, Bilbao, June 19-20, 2019", http://dx.doi.org/10.2760/625745
Purpose
The aim of this Implementation Guide is to complement the knowledge gathered in the DigComp at Work report, which contains a thorough analysis of 9 cases of DigComp use in employment or employability contexts. It offers recommendations on practical steps, key actions, tips and on-line resources for the implementation of DigComp in these contexts. The Implementation Guide is to be read together with the DigComp at Work report.
As a first edition, it is necessarily incomplete. Stakeholders may wish to keep this Guide alive with contributions based on their own experiences. If such an enriching process proves to be of value for stakeholders, we will explore how it could be managed.
Who is this Guide addressed to?
This Implementation Guide is addressed to a wide range of labour market intermediary (LMI) actors who are involved in digital skilling functions for job-seekers or workers that need re-skilling or up-skilling of their digital competences to be able to respond to changes in the labour market.
T.1 classifies LMI types and subtypes according to the people they serve: students in initial education, non-employed and employed people, and those LMIs working across all types of individuals.2
Use of DigComp for which skilling functions?
The research3 has shown that stakeholders or LMIs make use of DigComp for a large
set of skilling-related functions, as illustrated in T.2 (source: DigComp at Work report
T.5).
2 As shown by the research carried out by Visionary Analytics under contract number 936043-2018 A08-LT (Lot 1) "Mapping DigComp and EntreComp Use Lot 1: Analysis of Labour Market Intermediaries active in digital and entrepreneurial skilling services".
3 Carried out by The Woman Organisation, Bantani Education and Stefano Kluzer, under contract number 936054-2018 A08-GB "Mapping DigComp and EntreComp Use Lot 2: Cases Analysis of DigComp and EntreComp Use".
T.1 DESCRIPTION OF LMI TYPES AND SUBTYPES
Type 1. LMIs working primarily with students in initial education
1.1 Formal educational institutions that deliver formal (to some point compulsory) schooling. They include: primary and secondary schools (corresponding to ISCED levels 1-3), vocational education and training (VET) schools (ISCED levels 3-5 with vocational orientation) and higher education institutions (HEIs; ISCED levels 5-8).
1.2 Non-formal education providers that offer extracurricular activities that complement the programmes of educational institutions (e.g. an after-school language course).
1.3 Informal education providers that facilitate students' self-learning in the areas of their interest, with no imposed course structure, external requirements or assessment (e.g. a school chess club).

Type 2. LMIs primarily working with the non-employed
2.1 Public Employment Services, i.e. public bodies, either part of a ministry of labour or, less often, separate executive agencies, that provide comprehensive support to the unemployed and have legal obligations towards them.
2.2 LMIs addressing critical barriers to employment, typically NGOs or social businesses that target specific vulnerable groups that need more intensive or specific professional support.

Type 3. LMIs primarily working with the employed
3.1 Trade unions, i.e. collective associations of employees.
3.2 Employers and employer associations, i.e. collective associations of employers.

Type 4. LMIs working across all the target groups
4.2 Upskilling providers that provide adult training. These include formal, non-formal and informal professional and adult education providers: face-to-face and online training providers, as well as MOOC (Massive Open Online Course) platforms that aggregate distance learning courses offered by different HEIs.
4.3 Job experience providers that enable the placement of individuals in a real work environment. They include providers of work-based learning (WBL, including internships, traineeships, apprenticeships, job-shadowing, etc.), volunteer opportunities, public works programmes, and social enterprises and cooperatives.
4.4 Job brokers that facilitate the matching of job seekers with vacancies. They include job websites (platforms used to exchange information about vacancies and jobseekers' profiles), short-term employment facilitators (organisations that assist individuals/employers in finding temporary work/workers), and private recruitment agencies (that help employers fill their medium- and high-level vacancies).
T.2 USE OF DIGCOMP PER LMI FUNCTION
[Table mapping each LMI function and DigComp use to the cases that implement it (A PEI B, DCDS, PRODIGEO, IKANOS, ECCC, COMPASS, MU.SA, SMARTIVE, A BAIT B, P4E and ADECCO), covering LMIs working with students in initial education, the employed, the non-employed and all target groups. The functions and uses listed are:]
Labour market skills analysis: analysis of digital competence requirements in various occupations; design of professional digital profiles; benchmarking services in business sectors aiming at identifying skill requirements at organisational level and at comparing an organisation's skill level with its competitors'.
Career advice: DigComp used for career advice/guidance; linking self-assessment and training offers to specific careers.
Personal development plan: DigComp used for further training and/or career advice/guidance; linking self-assessment and training offers to specific occupational profiles.
Design and delivery of training: DigComp used for training offers.
Workforce development: DigComp-based skills assessments.
Assessment of skills: design of DigComp-based (self-)assessment tools.
Certification of competence: certification of DigComp competence or certification of course completion.
F.1 DIGCOMP COMPETENCE AREAS AND COMPETENCES
[Figure: the DigComp wheel, showing the five competence areas and their competences:
Information and data literacy: browsing, searching and filtering; evaluating; managing.
Communication and collaboration: interacting; sharing; engaging in citizenship; collaborating; netiquette; managing digital identity.
Digital content creation: developing; integrating and re-elaborating; copyright and licences; programming.
Safety: devices; personal data and privacy; health and well-being; the environment.
Problem solving: technical problems; identifying needs and responses; creatively using digital technologies; identifying digital competence gaps.]
About DigComp
DigComp was first published in 2013 as a reference framework to support the development of the digital competence of individuals in Europe. It describes which competences are needed today to use digital technologies in a confident, critical, collaborative and creative way to achieve goals related to work, learning, leisure, inclusion and participation in our digital society.4
The DigComp Framework has 5 dimensions:
1. Competence areas (5) identified to be part of digital competence (see F.1);
2. Competence descriptors and titles (21) that are pertinent to each area (see F.1);
3. Proficiency levels for each competence;
4. Knowledge, skills and attitudes applicable to each competence; and
5. Examples of use, on the applicability of the competence to different purposes.
4 Based on the definition of digital competence described in the Recommendation of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning (2006/962/EC).
How to use this Guide

A. If you are unfamiliar with DigComp, please first read Chapter 1: Introducing DigComp of the DigComp into Action guide for a detailed presentation of the framework, its related dimensions and the reference documentation.

B. Get acquainted with the content of the DigComp at Work report, in particular its Chapter 2: Mapping DigComp use in the labour market.

C. You are now ready to use this Guide, which is structured in two key parts:

General Guidelines for the use of DigComp, which need to be read by all LMIs planning to provide any of the above-referred skilling functions. These General Guidelines are organised as follows:
G1. Getting started with your DigComp-based project (p. 12): 1. Using DigComp in your language; 2. Internalise DigComp's view of digital competence; 3. Learn from others; 4. Re-use existing resources.
G2. During your DigComp-based project (p. 13): 1. Invite all relevant stakeholders; 2. Identify the suitable stakeholders' strategic cooperation model; 3. Adopt the DigComp naming convention; 4. Adapt DigComp to your needs and context; 5. Consider implementing monitoring tools.
G3. After your DigComp-based project finishes (p. 13): 1. Share your own resources and lessons learnt; 2. Continue learning from each other; 3. Revisit your outputs.
G4. Additional strategic considerations (pp. 14-15): 1. DigComp needs clear communication; 2. Communicate "compliance" with DigComp of learning resources and programmes; 3. Adopting DigComp's holistic view of digital competence in learning and teaching methods; 4. Developing new learning resources and methods in line with DigComp and training teachers; 5. Certification.

Specific Guidelines for the use of DigComp for specific but interrelated skilling functions. Readers can choose the section of their interest:
S1. Defining competence needs: defining the digital competences needed for a specific professional profile or sector.
S2. Assessing competences: assessing and/or certifying the digital competences of a job seeker or employee, or of an organisation, or of a part of it.
S3. Training: cataloguing, designing, developing and delivering training on digital competences.

The Specific Guidelines are organised as follows: What is to be produced; Purpose; Examples; Key actions; Tips; Resources.
2. GENERAL GUIDELINES FOR DIGCOMP-BASED IMPLEMENTATION
G1. Getting started with your DigComp-based project
1. Using DigComp in your language is an important starting point.
Look for a translated version of DigComp in your language(s).
If no translation is available, it is best to prepare your own translation (involve both a linguist and a specialist in digital competence).
In line with the open copyright licence, which allows re-use of DigComp material and its translation following certain guidelines, you can consult the following resources:
Recommendations and Template for translating DigComp 2.1
General rules to translate the DigComp reports
Linked Open Data interface for DigComp 2.0
Language versions of DigComp.
2. Internalise DigComp´s view of digital competence. The distinguishing
feature of DigComp’s view of digital competence is to look beyond the techni-
cal ability needed to use specific digital tools and services to include a broader
view as well as critical and reflexive capabilities. This is crucial to grasp fully
the opportunities and risks of today’s digital world. For instance, knowing how
to use a search engine is important. However, from a DigComp perspective, it
is even more important to know why search results are listed in a certain way
and how they might reflect users' preferences as a possible result of profiling practices by search service suppliers.
3. Learn from others. Identify if there exist any projects with similar aims as
yours, in your own country or abroad, and contact the project leader / owner to
learn from their experience. For that purpose:
Join the DigComp Community of Practice (CoP) hosted by All Digital
Consult the DigComp into Action guide
Consult the DigComp at Work report.
4. Re-use existing resources. Identify if there are on-line accessible resources
similar to the ones you want to produce, so as to share knowledge and resources, systems, etc. Consider translating and re-using existing material, software, etc. For that:
Ask for help in the DigComp Community of Practice (CoP) hosted by All Digital
Consult the DigComp into Action guide
Consult the DigComp at Work report.
G2. During your DigComp-based project
1. Ensure all relevant stakeholders are invited to your project. Consider
inclusion of employers, trade unions, employment services, training providers
and other actors. Use the LMI types (Table T.1) to scan for possible actors.
2. Identify the suitable stakeholders’ strategic cooperation model for the
project. Any activity to develop digital skills and to implement DigComp can
require the participation of a wide range of diverse stakeholders. Consider options such as project- or institution-based collaboration, continuous dialogue,
or community of practice, as possible co-operation models. In any case, ensure
shared ownership, trust and flexibility for a successful partnership-based ap-
proach.
3. Adopt the DigComp naming convention to facilitate communication: Once
you have your version of DigComp, share it among the different stakeholders
that will participate in your project. Some training sessions could be necessary
to become familiar with DigComp, its 5 dimensions and 21 competences. Espe-
cially with different stakeholders involved, some learning is required to under-
stand how to fully and effectively use the framework.
4. Adapt DigComp to your needs and context. One of the key features of
DigComp is that it can be tailored to your context. You can use DigComp to iden-
tify relevant competences and levels of proficiency, and to design meaningful
learning outcomes for your purpose and context.
5. Consider implementing monitoring tools on the outputs of your activities, such as the number of training courses provided, the number of people that followed and completed the courses, and the number of assessment tests performed or certificates issued (see the sketch below). In addition, follow-up processes to measure the impact of the skilling actions on users' employability might also be relevant, for accountability, managerial, social or communication purposes.
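As an illustration, the output indicators mentioned above can be tracked with a very simple data structure. The sketch below is a minimal, hypothetical example in Python; all names and figures are assumptions, not part of DigComp or of any of the cases analysed.

```python
# Illustrative sketch only: a minimal structure for the output indicators
# named above. All field names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class SkillingOutputs:
    courses_provided: int = 0
    participants_enrolled: int = 0
    participants_completed: int = 0
    assessment_tests_performed: int = 0
    certificates_issued: int = 0

    def completion_rate(self) -> float:
        """Share of enrolled participants who completed a course."""
        if self.participants_enrolled == 0:
            return 0.0
        return self.participants_completed / self.participants_enrolled

# Example: quarterly figures collected for accountability purposes.
q1 = SkillingOutputs(courses_provided=12, participants_enrolled=240,
                     participants_completed=198, assessment_tests_performed=310,
                     certificates_issued=85)
print(f"Completion rate: {q1.completion_rate():.0%}")
```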
G3. After your DigComp-based project finishes
1. Share your own resources and lessons learnt from your experiences with the DigComp Community of Practice (CoP) hosted by All Digital.
2. Continue learning from each other.
3. In fact, your DigComp-based project never finishes! Revisit your outputs (professional digital profiles, tests and training courses) periodically to incorporate the constant evolution of digital technologies and any social and labour changes following technology adoption that might impact the competences needed.
G4. Additional strategic considerations5

1. DigComp needs clear communication
DigComp describes the knowledge, skills and attitudes that are needed today to
use digital technologies in a confident, critical, collaborative and creative way to
achieve goals related to work, learning, leisure, inclusion and participation in our
digital society. It also states that digital competence today entails more than
the ability to use given tools and must develop beyond operational functions
offered by the tools. These notions may look at first sight obvious to everyone,
but they are not.
This holistic approach to digital competences is rarely reflected in ICT skills training offers or considered by employers when recruiting new staff.
It is important to promote and explain DigComp’s vision to teachers, trainers
and e-facilitators, but also to employers, who play a key role in driving the de-
mand for digital competences.
DigComp should also be promoted among policy-makers and other stakeholders who govern digital competence development, and of course among the very learners involved in it. For this, it is recommended to produce, illustrate and
disseminate effective communication materials about DigComp, starting from
simple ones with clear examples and explanations about its novelty and why its
view of digital competence is important in today’s world.
For that purpose, DigComp infographics and flyers are available as support material.
5 The whole content of this section and some sections in the Training section are drawn from the Guidelines on the adoption of DigComp (Kluzer, 2015), available here.
2. Communicate “compliance” of learning resources
and programmes with DigComp
DigComp was born out of the study of many ongoing initiatives for the develop-
ment of digital skills in Europe to encourage their evolution. At the same time,
DigComp aims to help citizens understand what digital competence can mean
today for them, by providing an articulated and well-structured framework. It is therefore important that, when learning resources are developed, used or adapted using DigComp, the alignment to DigComp is clearly highlighted. Courses, individual lessons, learning materials, self-assessment tests and so on should be clearly "tagged" according to the DigComp framework and to its specification in each context (see the sketch below).
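As an illustration of such "tagging", the sketch below shows one possible metadata record linking a learning resource to DigComp competence areas, competences and proficiency levels. The schema is an assumption for illustration only; DigComp does not prescribe a data format, and the provider and course named are fictional.

```python
# Hypothetical example of tagging a learning resource against DigComp.
# The field names are illustrative assumptions, not an official schema.
course_metadata = {
    "title": "Staying safe online: an introductory course",   # fictional course
    "provider": "Example Training Centre",                    # fictional provider
    "digcomp_version": "2.1",
    "alignment": [
        {
            "area": "4. Safety",
            "competence": "4.2 Protecting personal data and privacy",
            "proficiency_level": "Foundation (levels 1-2)",
        },
        {
            "area": "4. Safety",
            "competence": "4.1 Protecting devices",
            "proficiency_level": "Foundation (levels 1-2)",
        },
    ],
}

def covers(course: dict, competence_prefix: str) -> bool:
    """Check whether a tagged course develops a given DigComp competence."""
    return any(entry["competence"].startswith(competence_prefix)
               for entry in course["alignment"])

print(covers(course_metadata, "4.2"))  # True
```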
3. Adopting DigComp’s holistic view of digital competence
in learning and teaching methods
Learning to use specific tools, devices or applications is usually an inevitable part of the development of digital competence, and teachers and learners tend to focus on this. However, DigComp competences are generic competences
that need to be learned regardless of the specific ICT application. For example, one can learn the competence "searching for information" using Google search, but it is also important to understand that there are other ways of finding information, e.g. via Facebook, Twitter, newspapers, alternative search engines, etc.
DigComp competences are usually transversal to and/or independent of specific
technology and they need to be fully acknowledged and dealt with in appro-
priate ways. This concerns both learning content and teaching methods. For
instance, practical guidance (e.g. through step-by-step instructions) should be
enriched whenever possible with contextual and critical reflections and with ex-
ercises producing supportive and clarifying evidence for these reflections. Dig-
Comp advocates that the learning approach itself should promote critical think-
ing, creativity, autonomy, confidence and safety of the learners. This promotion
can be done across any delivery mode (face-to-face, distance learning, etc.).
4. Developing new learning resources and methods
in line with DigComp and Training Teachers
Convincing educators to adopt the DigComp perspective is not enough. While
there are abundant learning materials and teaching methods available for the
traditional approach to ICT skills development, fewer learning materials and
methods are readily available to help educators and learners address the
critical and reflexive components of DigComp. Any use of DigComp must en-
visage and allocate resources for the development of effective and sustainable
learning materials and teaching methods.
Refreshment initiatives and introductory training, continuous support and su-
pervision efforts, and the encouragement of peer collaboration are all likely to
be necessary. Measures are also needed to help educators face this new challenge/opportunity and to overcome the resistance to change that might
occur.
5. Certification
It is important to consider certification, whether adapting existing systems or establishing new ones, with DigComp in mind. Still uncommon, this formal
recognition could contribute to the qualification of digital competence training,
thus helping to create much needed bridges between the world of education
and the business sector. Certification would also give a clear signal to both
learners and educators alike that digital competence in DigComp’s perspective
can also be accurately assessed and is an important achievement for fuller
participation in our society.
3. SPECIFIC GUIDELINES FOR DIGCOMP IMPLEMENTATION
The Specific Guidelines are structured along 3 major activities:

S1. Defining competence needs: defining the digital competences needed for a specific professional profile or sector (p. 17).

S2. Assessing competences: assessing and/or certifying the digital competences of a job seeker or employee, of an organisation, or of a part of it (p. 20).

S3. Training: cataloguing, designing, developing and delivering training on digital competences (p. 28).

Although these three activities can be carried out in isolation, they are often interlinked. T.3 shows a possible logical connection between them, inspired by the Ikanos project (Case C3 in the DigComp at Work report).
T.3 EXAMPLE OF ACTIVITIES PROVIDED BY STAKEHOLDERS (AUTHOR'S ADAPTATION OF IKANOS ACTIVITIES)
[Table linking each phase and activity to the relevant guideline section:]
STRATEGY
Discover: learning about DigComp (G4)
Audit: self-assessment test (S2); personal digital profile report (S2)
Analyze: digital organisational profile (S2); Professional Digital Profile (S1); definition of training objectives (S3); assessment results analysis tool (S2, S3); organisational diagnosis report (S2)
Guide: training orientation guide (S3); resource cataloguing (S3)
ACTION
Learn: PLE configuration; e-portfolio construction; evidence: digital skills certification (S2)
S1. Defining competence needs: defining the digital competences needed for a specific occupation category or sector

What is to be produced
A definition of the digital competences needed for a specific occupation category or sector. This definition should specify the digital knowledge, skills and attitudes that a professional must possess to adequately perform the tasks that require the use of digital tools and applications in a given job or occupation category.
DigComp provides a reference tool to guide stakeholders (HR departments, managers, etc.) through all aspects of digital competence (across the 5 competence areas and 21 competences) in order to identify and describe the current and possible future digital competences needed for a specific job. It is to be noted that DigComp suggests a set of 5 areas and 21 competences to be acquired to become fully digitally competent; however, a subset of these may be sufficient for a specific job profile.
DigComp can be used to develop a Professional Digital competence Profile (PDP) for specific jobs. The PDP will list the set of digital competences and soft skills needed, and the proficiency levels required, for the job. The PDP can be accompanied by a set of relevant learning outcomes, which are useful for designing related training.

Purpose
PDPs can be developed for a variety of jobs and job categories, including:
broadly defined existing occupations (e.g. administrative worker in the public administration, general office clerk, primary school and early childhood teacher, etc.)
generic business functions (operations and industrial services, marketing & sales, etc.)
generic work conditions (entrepreneur, virtual office worker, consultant for the third sector, employment services staff)
new IT-intensive jobs in different economic sectors, distinct from IT specialist job profiles (Industry 4.0 jobs in manufacturing, new digital jobs in museums).
PDPs can:
be the reference for the design and administration of (self-)assessment or certification tests (see Section S2. Assessing competences for more details)
be used to assess the potential and suitability of an individual for a job
constitute the basis for the design, development and delivery of a specific training course focused on the specific job profile
be used to evaluate work performance.
Examples
The DigComp into Action guide and the DigComp at Work report include a number of cases that have developed PDPs for some occupations (some of which are available online).
In the DigComp into Action guide:
C2: for the Basque 4.0 Industry
C15: for digital facilitators (public library staff and volunteers)
C20: for youth e-facilitators
C30.
In the DigComp at Work report:
C3: for administrative staff in public organisations, industrial machine operator, sales representative, entrepreneur, mechatronics/robotics technician, industrial machinery operator and CNC programmer, advanced manufacturing maintenance technician, 3D designer for additive manufacturing, additive manufacturing machinery operator, SME digital transformation manager, consultant on services/programmes for the third sector, economist – business manager, economist – consultant, and economist – specialist in digital marketing.
C5: for vocational education teachers; primary school teachers; finance professionals; sales, marketing and PR professions; general office clerks; secretaries; authors, journalists and linguists; and creative and performing artists.
C6: Emerging Job Roles for Museum Professionals includes the four new job roles (Digital Strategy Manager, Digital Collections Curator, Digital Interactive Experience Developer, Online Community Manager) and the digital and transversal competences that characterise each role.
C8: for the Virtual Office Worker and the (self-)Entrepreneur.
Key Actions
Check if an existing PDP has already been designed in another country or region (see the Examples and Resources sections).
Identify knowledgeable actors from employer organisations and other entities that will contribute to the definition of the (digital) job profile. Recommended stakeholders include business experts, human resource managers and other managers, and a digital competence expert.
Use DigComp to develop a common understanding and language of what digital competence means.
Identify which of the 21 DigComp competences are required for the specific job, at which level, and which learning outcomes are relevant (a minimal sketch of such a profile follows this list).
Identify if complementary digital competences (not specified in DigComp) are needed for the job. For specialised IT competences, we recommend the use of the e-Competence Framework (e-CF).
Identify which soft skills (communication, collaboration, team working, creativity, etc.) are needed for the job, and how the development of digital competences can support the development of these.
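To make the structure of a PDP concrete, the sketch below shows one possible representation: a job profile mapping selected DigComp competences to required proficiency levels and learning outcomes, plus the complementary soft skills. The DigComp competence names are taken from the framework; the job profile, levels and outcomes shown are illustrative assumptions only.

```python
# Hypothetical sketch of a Professional Digital competence Profile (PDP).
# DigComp does not prescribe a data format; this is one possible encoding.
from dataclasses import dataclass

@dataclass
class CompetenceRequirement:
    competence: str              # one of the 21 DigComp competences
    level: int                   # required proficiency level (1-8 in DigComp 2.1)
    learning_outcomes: list[str]

@dataclass
class ProfessionalDigitalProfile:
    job_title: str
    requirements: list[CompetenceRequirement]
    soft_skills: list[str]       # complementary non-digital skills

# Example: a fictional profile for a general office clerk.
office_clerk = ProfessionalDigitalProfile(
    job_title="General office clerk",
    requirements=[
        CompetenceRequirement(
            competence="1.3 Managing data, information and digital content",
            level=4,
            learning_outcomes=["Organise shared folders following office conventions"],
        ),
        CompetenceRequirement(
            competence="2.1 Interacting through digital technologies",
            level=3,
            learning_outcomes=["Choose the appropriate channel for routine requests"],
        ),
    ],
    soft_skills=["communication", "team working"],
)
```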
Tips
To define a PDP, the following approach can be followed:
Step 1. Experts first describe all the main activities performed in the selected
job, reflecting different levels of experience and proficiency and identifying crit-
ical tasks from the point of view of work output.
Step 2. Experts identify those activities that can be performed using digital
tools (at a later stage hardware and soware options are also specified). These
digital activities are then mapped to DigComp competences and proficiency lev-
els. Not all DigComp competences are necessarily used in every profile, because
the given job may not require those specific competences.
Step 3. Aer that, field experts (professionals, human resources managers, vo-
cational training specialists, etc.) can be consulted in order to identify the digital
aspects of job tasks in detail and the competence descriptors necessary for the
technical solutions used in the selected job. DigComp descriptors are by their
nature written in general terms, to be applied in different contexts, but they can
form a useful basis to articulate job-related specifications. As a result, the same
DigComp competence may have, in practice, different detailed descriptions, de-
pending on the prevailing tasks of different occupations.
Method: Complementary methods to define the digital competence needs in
specific occupations can be: focus groups, direct interviews and online surveys,
with DigComp always utilised as a reference vocabulary and guide.
Resources
DigComp into Action guide
contains a description of the cases provided in the Examples Section.
DigComp at Work report
Section 2.3 - How DigComp is helping stakeholders, sub-section The use of DigComp for the analysis of competence requirements and the definition of professional digital profiles (pp. 20-24), details how the different cases analysed have made use of DigComp to define a PDP. In particular:
T7 lists the aims of the PDPs defined by the different cases and the sets of (digital, advanced digital and soft) skills considered
T8 details the complete list of PDPs defined by the different cases
T9 provides an analysis of the DigComp digital competences and levels used across PDPs.
The Annex provides a detailed description of each case.
In addition, the following resources are available:
C3: Professional digital profiles
C3: Guide for professional digital profiling
C3: Tool 4: Ikanos Professional Digital Competence Profiles
C6: report: Emerging Job Profiles for Museum Professionals
C6: Tool 5: Linking eCF, EQF and DigComp competences of the 4 new Mu.SA PDPs
C6: Tool 6: Transferable competences of the 4 new Mu.SA digital profiles.
S2. Assessing competences: assessing and/or certifying the digital competences of a job seeker or employee, of an organisation, or of a part of it

What is to be produced
Assessment tools support awareness development, evaluation or certification of the level of digital competence of an individual or organisation.
Assessment and self-assessment may take place in different contexts and have different purposes. An important distinction is to be made: while 'assessment' refers to a process where the user's digital competence is evaluated by a second party (and therefore leads to an objective rating and possibly to a certificate), 'self-assessment' refers to the user evaluating his or her own digital competence, i.e. a subjective dimension is added.
Different approaches to testing and types of questions may be used:
Self-perception or self-reflection questions ask respondents how confi-
dent they feel with respect to a topic or activity, what or how much they know
and/or are able to do, or what the actual behaviour is. They play a very impor-
tant role in helping people to understand their digital competence.
Knowledge-based questions check whether the respondent has a given piece of knowledge, knows the right action to achieve a result, or knows the right behaviour in a given circumstance, by picking the right answer among a set
of options. They check factual knowledge and/or procedural knowledge. They
provide a more accurate picture of a user’s digital competence compared to
self-perception or self-reflection questions.
Performance-based questions require users to perform some tasks in order
to give the requested answer or complete an assignment. This approach gener-
ates the most accurate picture of one’s digital competence.
DigComp components (competence descriptors, learning outcomes at different
proficiency levels, examples of skills, knowledge and attitudes) can be used to
prepare self-perception, self-reflection and knowledge-based questions, or as a reference for more detailed and contextualised questions. They can also inspire the definition of authentic tasks and challenges for evaluation from both knowledge-based and performance-based perspectives.
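For illustration, the three approaches can be represented as items in a question bank, each tagged with the DigComp competence it probes. The sketch below is hypothetical: the wording, structure and scoring notes are assumptions, not taken from any of the cited tools.

```python
# Hypothetical question-bank items illustrating the three question types,
# each tagged with the DigComp competence it probes.
question_bank = [
    {
        "type": "self-perception",
        "competence": "1.1 Browsing, searching and filtering data",
        "prompt": "How confident are you in finding reliable information online?",
        "scale": ["not at all", "somewhat", "confident", "very confident"],
    },
    {
        "type": "knowledge-based",
        "competence": "4.2 Protecting personal data and privacy",
        "prompt": "Which of these is the safest way to share a password?",
        "options": ["by e-mail", "via a password manager's sharing feature",
                    "on a sticky note", "by text message"],
        "correct_option": 1,
    },
    {
        "type": "performance-based",
        "competence": "1.3 Managing data, information and digital content",
        "task": "In the simulated desktop, move all invoices into a new "
                "folder named '2020' and sort them by date.",
        "scoring": "automated verification of the resulting folder state",
    },
]
```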
Some criteria for choosing the type of questions to be used:
Self-perception and self-reflection questions used in self-assessment tests will
help users become aware of the wide scope of digital competence that they
have not previously considered or been aware of. These tests can also be used
to identify how respondents feel (rather than measuring their initial compe-
tence), before recommending them to take an initial digital competence course.
Knowledge-based and self-perception and self-reflection questions are the
most manageable, can be administered through online procedures and can de-
liver more immediate feedback.
Performance-based questions either require sophisticated simulations or the administration of real-life challenges. Both approaches require more complex
technical solutions, including for automated verification, direct observation and/
or intervention of an evaluator. Hence, tests that make use of these questions
can take longer to be executed.
Each approach to testing, and hence, the complexity, quality and reliability of
the measurement tool (and related costs) need to be adapted to the purpose.
For example, a certification tool would require a more complex and reliable
measurement than a self-assessment test or a pre-training assessment test. In
particular, building a valid and reliable assessment tool is a task of significant
complexity that requires a multi-disciplinary team with know-how on measurement tool design, psychometric analysis and digital competences, as well
as significant investment in development, operational and maintenance costs.
Taking into account the above, reusing or sharing certification platforms, systems, questions or even services is probably an effective alternative to starting from scratch.
An important output of the assessment and self-assessment processes is the
feedback or results provided to the user. This will depend on the context and
purpose of the assessment. Different types of certificates or credentials can
be provided to the user as a result of an assessment test. So far, the cases ana-
lysed issue two types of digital credentials for learning achievements: certificates
and badges. The choice of format reflects what is commonly seen today among
credential issuers:
Digital certificates are used when an achievement takes a long time to com-
plete (e.g. a course that takes more than 40 hours); assessment of the achieve-
ment is formal (summative with proctored and assessed examination); employ-
ers are likely to view the achievement (e.g. professional certification of a skill).
Digital/Open badges are used when an achievement does not take too long to
complete (e.g. an online course that takes 2 hours); assessment of the achieve-
ment is informal (formative, as with an unmarked quiz); recipients complete
many achievements of a similar type (e.g. a set of modules within a longer
course, or a university degree).
(this summary is drawn from www.accredible.com/credentials/)
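The distinction above can be read as a simple decision rule, sketched below for illustration. The thresholds and labels are assumptions drawn from the summary, not a formal standard (for interoperable badge metadata, see e.g. the Open Badges specification).

```python
# Illustrative decision rule for choosing a credential format, following
# the summary above. The 40-hour threshold is an indicative assumption.
def credential_format(hours: float, formal_assessment: bool) -> str:
    """Suggest a credential format for a learning achievement."""
    if hours > 40 and formal_assessment:
        return "digital certificate"   # long, formally assessed achievement
    return "open badge"                # shorter or informally assessed one

print(credential_format(hours=60, formal_assessment=True))   # digital certificate
print(credential_format(hours=2, formal_assessment=False))   # open badge
```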
Purpose
Digital competence assessment tests may be taken by different entities for different purposes in different contexts:
By individuals:
to develop their self-awareness of their own level of digital competences in
each of the DigComp (5) areas and (21) individual competences; to benchmark
one’s digital competence profile with others in the labour market; or to decide
on a learning path
to obtain a digital competence or PDP badge or certification for employability
purposes or other reasons (e.g. following a regulatory requirement)
to identify the gaps between the individual's current competence levels and those needed for a particular job (defined through a PDP), identify the available training offers that address the specific competences and related levels needed, and develop a personal development plan.
By training providers:
to assess participants’ skills before starting a course to provide advice on
personalised learning paths
for summative assessment purpose during a course
for issuing a final course badge or certificate (depending on how the course
and the assessment are organised, these credentials may refer to specific Dig-
Comp areas or competences, to a specific course completion or to a specific
PDP profile).
By employers (companies or other organisations):
to support candidate selection processes for specific vacancies
to map employees’ profiles and identify their training needs and paths
to support digital transformation processes, by identifying talents and po-
tential digital champions and monitoring improvements
for organisation-level analysis and benchmarking with sectoral data.
By employers’ (sectoral) associations:
for sector-level analysis, by aggregating a large number of test results, in
order to plan training and learning activities to overcome sectoral competence
gaps.
By (public and private) employment services:
in support of their job matching and career advice functions.
Examples
The DigComp into Action guide and the DigComp at Work report include a number of cases
that have developed different types of tests, some of which are available on-line.
In the DigComp into Action guide:
C2, T2: Self-assessment tool for any citizen
C12: web platform for digital skills evaluation and certification for any citizen
C16, T11, T12: Self-assessment tool for students and young workers
C17: for adult citizens, in Danish and in English; for teachers, students and employees of both private and public organisations, available in Danish
C21: (GINOP-6.2.1) Self-assessment tool for the basic two levels, for the low-skilled working-age population
C27: self-diagnostic pre-training test for civil servants
C29: for Enterprise Trainers and Educators.
In the DigComp at Work report:
C3: self-assessment test (standard version), for all citizens, available in Euskera, Spanish and English
C5: self-assessment test for low-skilled unemployed youth, to guide the user towards a learning path
C7: self-assessment test (standard version), available in Italian and in English, for organisations
C8: self-assessment test for the Virtual Office Worker and Entrepreneur profiles.
Key Actions
Define your specific requirements:
Clarify the purpose and the nature of the test (self-assessment or assess-
ment) and the target audience.
Decide the type of measurement: per competence area, competence or other
grouping criteria.
Decide the type of questions to be asked, based on the test’s purpose.
Decide the contextual conditions of the test: financial resources, maximum
time to carry out the test, number of questions (per competence, area or group),
the size of the pool of questions.
Decide about test management aspects: the need for an advertising cam-
paign to target audiences, test registration procedures, and management pro-
cedure for claims to the test results.
Decide the test operational conditions: if the instrument will be self-adminis-
tered or assisted; the location where the test will be administered, i.e. on-line or at a specific physical location; whether the user will have (limited) access to the internet or not; and the compatibility of these choices with knowledge-based questions.
Decide the type of feedback and output that will be provided to the user.
Consider if the test results are to be integrated in a personal learning en-
vironment (e.g. an e-Portfolio, internal HR system), to support competence
development in a lifelong learning context.
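One way to keep these decisions explicit and shareable is to record them as a small test specification before development starts. The sketch below is a hypothetical example; the field names and values are illustrative assumptions only.

```python
# Hypothetical test specification recording the requirement decisions above.
test_spec = {
    "purpose": "self-assessment",             # vs. "assessment"/"certification"
    "target_audience": "job seekers needing up-skilling",
    "measurement_unit": "competence",         # per area, competence or grouping
    "question_types": ["self-perception", "knowledge-based"],
    "max_duration_minutes": 30,
    "questions_per_competence": 3,
    "question_pool_size": 200,
    "administration": {
        "mode": "self-administered",          # vs. "assisted"
        "location": "on-line",                # vs. a specific physical location
        "internet_access_during_test": True,  # check compatibility with
                                              # knowledge-based questions
    },
    "feedback": "per-competence proficiency report",
    "integration": ["e-Portfolio"],           # personal learning environment
}
```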
Before starting the development of the assessment tool:
Explore if synergies can be developed with actors already administering tests
in your local environment or in another region or country to reduce costs and
time required.
Analyse if any components can be (re-) used, complemented or shared such as
questions, platforms, services, or even the [type of] digital credentials issued
(presuming they can be recognised in another context).
A number of technical design and quality tasks in relation to the implementation of
the measurement instrument need to be considered. For example:
the different measurement approaches needed to test knowledge, skills and
attitudes, to comply with the definition of competence
to ensure that the instrument is psychometrically sound (good validity and reli-
ability are two main psychometric characteristics of a sound instrument)
to identify who will write and review the questions
to ensure the set of questions is moderate in size, non-redundant and sufficient, taking into account a good distribution of questions per competence and proficiency level
to design how responses will be linked to competence proficiency levels (see the sketch after this list)
to tackle common problems like the tendency of individuals to over- or under-rate themselves in self-perception questions (the introduction of "fake" control questions can be considered)
to pilot the instrument with an acceptable sample of test-takers
and to acknowledge the limitations of the instrument, to name a few.
See the Resources Section for some useful sources on the above matters.
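As an illustration of linking responses to proficiency levels, the sketch below maps a per-competence score to a proficiency band using fixed thresholds. This is a deliberately naive example under assumed cut-offs; a real instrument would derive such cut-off scores from formal standard-setting and psychometric analysis (see the references in the Resources section).

```python
# Naive illustration: mapping the share of correct answers on one competence
# to a proficiency band. The thresholds are assumptions only; real cut-off
# scores require formal standard-setting procedures.
def proficiency_band(correct: int, total: int) -> str:
    if total == 0:
        raise ValueError("no questions answered for this competence")
    share = correct / total
    if share < 0.35:
        return "Foundation"
    if share < 0.70:
        return "Intermediate"
    return "Advanced"

print(proficiency_band(correct=8, total=12))  # Intermediate
```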
Tips
Paying due attention to end users' profiles and needs is crucial.
DigComp was intentionally designed for all citizens, not for a specific category
of people. Citizens, however, have diverse educational levels and backgrounds,
mastery of the local language, and experience with the digital world and so on.
Although DigComp is technology neutral in assessment and training contexts,
it is necessary to use some concrete tools. Combining commercial and open
source tools can be a suitable choice.
For competence assessment, it is important that DigComp’s descriptors and
related examples are “translated” into concrete questions that use plain
language and refer to everyday circumstances or popular examples. The
questions’ wording should be adapted to the target groups.
Performance-based questions might be more difficult to implement to measure some competences. Implementation options include simulation environments vs. real-life challenges and automated vs. manual verifications. Although the desired option would be automated verification of real-life challenges, this is not always feasible.
Measuring users' attitudes with sufficient reliability (for certification purposes, for example) remains a complex issue. Items that combine knowledge and
attitudes may be a practical option in some cases. Providing separate evalua-
tion results for attitudes may be another option. In self-assessment tests, this
issue is however less relevant.
The fast evolution of digital technologies needs to be carefully considered in the design and updating of the measurement instrument, as it can have an impact on maintenance costs.
Considerations on the questionnaire's length:
A questionnaire addressing all 21 DigComp competences may be long. The questionnaire should thus be carefully designed to avoid respondents abandoning it before the end. An approach focusing on a subset of competences may be suitable to address this challenge.
Questionnaires could present some initial questions that immediately filter low-competence respondents and allow them to finish the questionnaire quickly.
Questionnaires can also be self-scalable, adapting the number of questions to the level of the respondent on each competence (see the sketch below).
Longer and deeper assessment processes could be used for those with a higher level of digital competence and incorporate more entertaining and engaging elements, while being a source of useful information and learning.
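The "self-scalable" idea in the last points can be illustrated with a simple routing rule: a few screening questions per competence end the test quickly for low-competence respondents, while only those who pass the screen receive the longer, deeper question block. The sketch below is an assumed, simplified routing logic, not an implementation of any of the cited tools.

```python
# Simplified illustration of a self-scaling questionnaire: the number of
# questions served depends on performance on a short screening block.
def route_questions(screen_correct: int, screen_total: int = 3) -> str:
    """Decide which question block to serve next for one competence."""
    if screen_correct == 0:
        return "stop"            # low-competence respondent: end quickly
    if screen_correct < screen_total:
        return "basic_block"     # serve the short basic question set
    return "deep_block"          # longer, deeper (and more engaging) block

print(route_questions(screen_correct=3))  # deep_block
```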
Resources
The DigComp into Action guide provides a set of examples (see Section Step 2: Competence Assessment, pp. 34-35, and the Annex).
In addition, the following resources and tools are available:
T13: Digital competence self-diagnosis tool of the Andalusia regional government, for all citizens
T14: Self-assessment test and digital learning opportunities, for people of all ages, especially children, youth and their parents
T16: Online self-assessment tool on digital skills for the job market, for youth
T17: The Digital Competence Certification system in Castilla y León, Spain, for all citizens
T17: open-source software platform and database of over 1,400 questions for the certification of citizens' digital competences, available for free to any other public organisation within the EU, and already shared with the regional government of Andalusia (Spain) to work jointly on the certification of digital competences
T17: demo of a certification test
T18: Digital self-assessment tool for employees and managers of the DACH region.
The DigComp at Work report includes several additional resources:
Table T.4, under Fostering transparency of skills and qualifications, lists the cases that have developed assessment and certification of skills.
Section 2.3 - How DigComp is helping stakeholders, sub-section The use of DigComp for assessment tests, recognition and certification (pp. 24-26), details how each case has made use of DigComp. In particular, T10 details which types of tests, uses and outputs are implemented in each of the cases.
The Annex includes a detailed description of the example cases provided and related resources.
In addition, the following related resources are available:
C1: Tool 1: Regione Emilia-Romagna's entry questionnaire on digital competence for 3i digital literacy courses
C1B: Contents of the self-assessment tool
C3: Standard Ikanos Self-assessment Test (SAT)
C5: Self-assessment test
C7: Free SmartiveMap self-assessment test
C7: Corporate SmartiveMap self-assessment test
C8A: Self-assessment test
C8A: Self-assessment test
C9: Company Digital Maturity Test
C9: PHYD Platform, which aims to support people to evaluate, maintain and increase their employability, in a life-long learning perspective.
References on the development of measurement instruments:6

6 Extracted from Pokropek, A. (2020). Methodological guide on constructing and validating reflection, self-assessment and measurement instruments (JRC internal document).

Bing, M. N., Kluemper, D., Davison, H. K., Taylor, S., & Novicevic, M. (2011). Overclaiming as a measure of faking. Organizational Behavior and Human Decision Processes, 116(1), 148-162.
Cizek, G. J. (Ed.). (2001). Setting performance standards: Concepts, methods, and perspectives (Vol. 510). Mahwah, NJ: Lawrence Erlbaum Associates.
Cizek, G. J. (2012). The forms and functions of evaluations in the standard setting process. In Setting performance standards: Foundations, methods, and innovations, 165-178.
Cizek, G. J., & Bunch, M. B. (2007). Standard setting: A guide to establishing and evaluating performance standards on tests. SAGE Publications.
Cronbach, L. J. (1988). Five perspectives on validity argument. In Test validity, 3-17.
DeMars, C. (2010). Item response theory. Oxford University Press.
Gibbons, R. D., & Hedeker, D. R. (1992). Full-information item bi-factor analysis. Psychometrika, 57(3), 423-436.
Gibbons, R. D., Bock, R. D., Hedeker, D., Weiss, D. J., Segawa, E., Bhaumik, D. K., ... & Stover, A. (2007). Full-information item bifactor analysis of graded response data. Applied Psychological Measurement, 31(1), 4-19.
Hambleton, R. K., & Pitoniak, M. (2006). Setting performance standards. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 433-470). Westport, CT: Praeger.
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Sage.
Holden, R. R., & Passey, J. (2010). Socially desirable responding in personality assessment: Not necessarily faking and not necessarily substance. Personality and Individual Differences, 49(5), 446-450.
Jodoin, M. G., & Gierl, M. J. (2001). Evaluating type I error and power rates using an effect size measure with the logistic regression procedure for DIF detection. Applied Measurement in Education, 14(4), 329-349.
John, O. P., & Benet-Martinez, V. (2000). Measurement: Reliability, construct validation, and scale construction. In H. T. Reis & C. M. Judd (Eds.), Handbook of Research Methods in Social and Personality Psychology (pp. 339-369). New York: Cambridge University Press.
Koretz, D. M. (2008). Measuring up. Harvard University Press.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.
Lindquist, E. F. (1951). Preliminary considerations in objective test construction. In Educational measurement, 119-158.
Messick, S. (1998). Test validity: A matter of consequence. Social Indicators Research, 45(1-3), 35-44.
Mitzel, H. C., Lewis, D. M., Patz, R. J., & Green, D. R. (2013). The bookmark procedure: Psychological perspectives. In Setting performance standards (pp. 263-296). Routledge.
Muthén, L. K., & Muthén, B. O. (2002). How to use a Monte Carlo study to decide on sample size and determine power. Structural Equation Modeling, 9(4), 599-620.
Paulhus, D. L. (1991). Measurement and control of response bias. In J. P. Robinson, P. R. Shaver, & L. S. Wrightsman (Eds.), Measures of personality and social psychological attitudes (pp. 17-59). San Diego, CA: Academic Press.
Paulhus, D. L., Harms, P. D., Bruce, M. N., & Lysy, D. C. (2003). The over-claiming technique: Measuring self-enhancement independent of ability. Journal of Personality and Social Psychology, 84(4), 890.
Phillips, D. L., & Clancy, K. J. (1972). Some effects of "social desirability" in survey studies. American Journal of Sociology, 77(5), 921-940.
Reise, S. P. (2012). The rediscovery of bifactor measurement models. Multivariate Behavioral Research, 47(5), 667-696.
Reise, S. P., Moore, T. M., & Haviland, M. G. (2010). Bifactor models and rotations: Exploring the extent to which multidimensional data yield univocal scale scores. Journal of Personality Assessment, 92(6), 544-559.
Rost, J. (1991). A logistic mixture distribution model for polychotomous item responses. British Journal of Mathematical and Statistical Psychology, 44, 75-92.
Snow, E., & Katz, I. (2009). Using cognitive interviews to validate an interpretive argument for the ETS iSkills™ assessment. Communications in Information Literacy, 3(2), doi:10.15760/comminfolit.2010.3.2.75.
Van der Linden, W. J. (1998). Optimal assembly of psychological and educational tests. Applied Psychological Measurement, 22(3), 195-211.
Wang, N. (2003). Use of the Rasch IRT model in standard setting: An item-mapping method. Journal of Educational Measurement, 40(3), 231-253.
Wilson, M. (2004). Constructing measures: An item response modeling approach. Routledge.
Wright, B., & Stone, M. (1979). Best test design. Chicago, IL: MESA Press.
S3. Training: cataloguing, designing, developing and delivering training on digital competences
What is to be produced
The identification and/or design of training content for digital competences can follow different, complementary processes:
Mapping and cataloguing existing digital competence courses to DigComp is an important process:
to make the most of existing training resources
to enable the sharing of existing resources among organisations, regions, countries, etc.
to facilitate the identification of existing training gaps and, in turn, the preparation of training development plans, and
to enable a greater diffusion of the existing DigComp-compliant training offer.
The analysis of online training offers can be carried out either manually (which can be laborious) or in an automated way; a minimal sketch of the automated route is given below.
The more similar the cataloguing approaches are among stakeholders, the easier the sharing of training resources will be.
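As an illustration of the automated route, the minimal Python sketch below tags course descriptions with DigComp competence areas by simple keyword matching. The keyword lists and catalogue entries are invented for illustration; a real exercise would derive the vocabulary from the official DigComp descriptors and use richer text matching.

```python
# Minimal sketch of automated cataloguing: tag course descriptions with
# DigComp competence areas via keyword matching. The keyword lists and
# catalogue entries are illustrative only, not official DigComp vocabulary.

KEYWORDS = {
    "1. Information and data literacy": ["search", "evaluate sources", "manage data"],
    "2. Communication and collaboration": ["email", "videoconference", "collaborate"],
    "3. Digital content creation": ["word processing", "spreadsheet", "copyright"],
    "4. Safety": ["anti-virus", "password", "privacy"],
    "5. Problem solving": ["troubleshoot", "identify needs"],
}

def tag_course(description: str) -> list[str]:
    """Return the DigComp areas whose keywords appear in the description."""
    text = description.lower()
    return [area for area, words in KEYWORDS.items()
            if any(word in text for word in words)]

catalogue = {
    "Safe online working": "Set strong passwords, install anti-virus tools, protect your privacy.",
    "Office basics": "Word processing and spreadsheet skills for administrative staff.",
}

for title, description in catalogue.items():
    print(title, "->", tag_course(description) or ["unmapped - review manually"])
```

Courses left unmapped by such a first pass would be reviewed manually; a shared keyword vocabulary among stakeholders would also make the resulting catalogues easier to exchange.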
Comparing existing training offers with the DigComp competences to be developed can highlight competence and level gaps to be filled.
Training design, which involves the specification of learning outcomes and training content based on user needs and guided by DigComp components.
Training development, based on in-depth user needs analyses which identify training priorities and content in view of user profiles, the professional digital profiles (PDPs) or development goals defined in terms of DigComp (and possibly other) competences.
Some relevant course design and development aspects:
DigComp's value lies in giving structure and direction to digital competence development initiatives while maintaining openness and flexibility. This openness and flexibility offer the opportunity to address the needs of specific target groups and their context, rather than imposing pre-defined solutions.
Delivery mode:
Training can be delivered fully online, face-to-face in the classroom, or in a blended form, i.e. as a combination of the two.
If fully online training is to be provided, it is recommended that it also include an introductory basic course, so that those with no experience at all are able to follow the course.
It has to be considered that face-to-face teaching and peer interaction are more suitable for learners with a low educational background and low digital skills, especially at the beginning of the learning process.
Course design elements (a data-structure sketch follows this list):
Definition of the structure of the training modules. This can be aligned to the 21 DigComp competences and 5 competence areas, or adapted to the course objectives and context.
Definition of the objectives, composition and duration of each training module.
Assignment of learning outcomes to each competence proficiency level.
Design of the desired learning paths towards an overall objective, as a series of modules to be followed.
The adoption of a microlearning approach could be envisaged to facilitate the training of busy employees (see, as an example, Anglia Ruskin University's Five Days of Digital Literacy (5DODL) initiative, DigComp into Action guide T1, p. 120).
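To make the elements above concrete, the following sketch models modules and a learning path as plain Python data structures keyed to DigComp competences and proficiency levels. Module titles, durations and outcomes are invented; only the competence names follow DigComp.

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    title: str
    competence: str   # targeted DigComp competence, e.g. "4.1 Protecting devices"
    level: int        # targeted proficiency level (DigComp 2.1 distinguishes 8)
    hours: float
    learning_outcomes: list[str] = field(default_factory=list)

@dataclass
class LearningPath:
    goal: str
    modules: list[Module]

    def total_hours(self) -> float:
        return sum(m.hours for m in self.modules)

# Illustrative path; titles, durations and outcomes are invented.
path = LearningPath(
    goal="Basic digital safety for office workers",
    modules=[
        Module("Protecting your devices", "4.1 Protecting devices", 2, 3.0,
               ["Can install and update anti-virus software"]),
        Module("Protecting personal data", "4.2 Protecting personal data and privacy", 2, 2.5,
               ["Can adjust the privacy settings of online services"]),
    ],
)
print(f"{path.goal}: {len(path.modules)} modules, {path.total_hours()} hours")
```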
Evaluation design (a sketch of a possible badge-to-certificate logic follows this list):
Definition of the "pass" criteria and tests for each learning module.
Design of the criteria for attributing learning badges to modules, and of how module learning badges combine into a certificate for a specific course.
Decision on whether and how the result of the training evaluation will be included in a personal ePortfolio, as a storage component of a Personal Learning Environment, to support continuous digital competence development in a life-long learning perspective.
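One possible reading of the badge-to-certificate logic, sketched under assumed pass criteria (the 60% threshold and the module names are invented): each module awards a badge when its pass criterion is met, and the course certificate is issued only when every module badge has been earned.

```python
# Illustrative badge/certificate logic; the threshold is an assumption,
# since the actual "pass" criteria are a design decision of each course.
PASS_MARK = 0.6

def earned_badges(scores: dict[str, float]) -> set[str]:
    """A badge per module whose test score meets the pass criterion."""
    return {module for module, score in scores.items() if score >= PASS_MARK}

def certificate_awarded(course_modules: set[str], scores: dict[str, float]) -> bool:
    """The course certificate requires all module badges to be earned."""
    return course_modules <= earned_badges(scores)

scores = {"Protecting devices": 0.8, "Protecting personal data": 0.55}
print(earned_badges(scores))                     # {'Protecting devices'}
print(certificate_awarded(set(scores), scores))  # False: one badge still missing
```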
Content validity:
An important criterion in training content selection and production is to minimise the risk of rapid obsolescence. This can be addressed by avoiding a focus on specific software tools and focusing instead on the competences to be developed and on services and their access. DigComp's technology neutrality and general treatment of many topics have proved particularly useful in coping with this requirement.
Openness:
The question of commercial vs open educational resources (OER) needs to be considered (whether open resources will be used, and how).
Separation of training design from delivery: when the design and development of a training offer is done by a different entity from the one delivering the training, tasks may be separated as follows:
1. Development of methodological documentation and training guides for educators, such as syllabi, learning outcomes and a guide for exam participants
2. Training development, training delivery and evaluation.
Purpose
Training courses can be developed with several purposes and addressed to different audiences. Some of these are outlined here:
Develop / provide a full training offer for all DigComp areas and competences at all levels (although this is unlikely in the world of work, where normally some content priorities and target users would be specified, given actual user proficiency levels). See the cases of ECCC (4) and DCDS (1) in the DigComp at Work report.
Develop a specific training offer adapted to a professional sector and its jobs (possibly defined making use of PDPs), to be used by employers or by training, employment or career advice organisations to increase the employability of candidates aiming to apply for a particular job position.
Develop a training offer targeting the needs of specific groups of people (introductory and lower-level courses for those with no digital competence and experience, courses addressed to older people, courses increasing the employability of young people, etc.).
Examples
The DigComp into Action guide and the DigComp at Work report describe a set of cases, some of which offer online resources, in particular:
The DigComp into Action guide includes 36 examples relevant to the training-related activities (p. 33 and pp. 39-40), of which a selection is provided here:
C16  has compiled 200 MOOCs and OERs for students and young workers, and a guide on how to use them
C20  Curriculum to train e-facilitators working with youth; curriculum for e-facilitators to train young people at risk of social or economic exclusion; course materials for e-facilitators working with youth
C21  (GINOP-6.1.2) training package for the basic two levels of DigComp, with study materials for the low-skilled working-age population
C27  (Instituto Nacional de Administración Pública) Training model for public administration civil servants in Spain
C29  Digital Innovations for Growth Academy: training for enterprise trainers and educators, and a Learning Programme Guidance on how to use the materials and resources, along with further information for programme facilitators. Available in English, Lithuanian, Slovenian, Bulgarian and Spanish.
DigComp at Work report:
C1    designed and developed basic literacy courses and the 3i (informatica, inglese, industria, i.e. informatics, English, industry) courses for unemployed people, run within public employment services
C1  designed and developed a blended learning system to develop all 21 DigComp competences at levels 1-2, aimed at adults aged 25+
C2  designed and developed Digital Competence courses for (public and private) employment services staff
C3  Training Orientation Guide, Guide to catalogue the existing training offer, and Personal Learning Environment
C4  is an example of an organisation defining the methodological and training requirements, but neither developing nor delivering the training
C5  Training course for unemployed low-skilled youth
C6  developed a modular training path and related tools (including a MOOC) to develop the PDP-specific digital competences for museum professionals
C8B  designed and developed a training offer for current and future civil servants.
Key Actions (for training design)
Step 1: Training Definition
Define the purpose of the training, its target beneficiary group and their needs (both digital competences and related soft skills).
This definition task should involve digital competence experts, education professionals, employers' Human Resource managers and line managers, employment professionals, and the employees corresponding to the Professional Digital Profiles.
Refer also to the S1. Defining competence needs Section of this guide and the development of PDPs.
The analysis methods used can include focus groups and interviews. In all cases, DigComp should be thoroughly explained to all intervening stakeholders and used to define the user needs.
It is relevant to think about what level of digital competence proficiency can realistically be expected/aimed at in general terms, given the target users' characteristics (e.g. educational background, age, employment conditions and perspectives, etc.) and the broader context and goals of the initiative (e.g. retraining people who lost their job and want to re-enter the labour market; overcoming a condition of total digital exclusion; and so on).
The outcome of this step would be the list of DigComp digital competence areas, individual competences and proficiency levels that need to be taught.
(Optional: in case user needs have been defined without the use of DigComp)
Step 1b: Mapping users' targeted competences onto DigComp
The list of targeted digital competences (and levels) should then be compared with and/or mapped onto the DigComp framework's descriptions and examples. A good way of doing this is to cluster the results from the needs analysis with the DigComp competences; mapping them to a competence area might also be sufficient. Then, proceed to check how far the two sets of descriptions match one another. Once this is done by area or competence, the levels can also be mapped.
The comparison may reveal that some competences (and levels) in DigComp have been overlooked in the Step 1 analysis and should therefore be added to the list of targeted digital competences, in order to aim at a fuller digital competence development. The comparison may also reveal that some desired/targeted competences or components are not included in DigComp, and therefore need to be addressed through complementary tests for these competences.
This part of the exercise may turn out to be somewhat complex to perform. In some cases, the precise meaning of DigComp's descriptors and examples is not straightforward; in other cases, the level of abstraction and detail of the descriptions being compared will likely differ. This is the step where a process of interpretation/specification of DigComp's content is necessary; a rough first-pass match can nevertheless be computed automatically, as sketched below.
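Where the volume of descriptions makes a purely manual comparison laborious, a rough first-pass match can be computed and then reviewed by hand. The sketch below assumes scikit-learn is installed and ranks illustrative, paraphrased descriptor snippets (not the official DigComp wording) against one needs-analysis statement using TF-IDF cosine similarity; its output is only a starting point for the interpretive work described above.

```python
# First-pass matching of a needs statement to DigComp-style descriptors
# using TF-IDF cosine similarity; requires scikit-learn. The descriptor
# texts are illustrative paraphrases, not the official DigComp wording.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

descriptors = {
    "1.1 Browsing, searching and filtering data": "search for data, information and content in digital environments",
    "2.2 Sharing through digital technologies": "share data, information and digital content with others",
    "4.1 Protecting devices": "protect devices and digital content from risks in digital environments",
}

need = "staff must search online for reliable information and data for client reports"

texts = list(descriptors.values()) + [need]
tfidf = TfidfVectorizer().fit_transform(texts)
similarity = cosine_similarity(tfidf[-1], tfidf[:-1]).ravel()

# Rank competences by similarity to the needs statement, best first.
for (competence, _), score in sorted(zip(descriptors.items(), similarity),
                                     key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {competence}")
```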
Step 2: Define goals, descriptors and learning outcomes
DigComp is a framework that suggests a way of looking at and developing the digital competence of citizens, without providing a standardised, detailed and compulsory solution for it.
To apply DigComp to different purposes or contexts, you must "translate" the framework's descriptions and examples to fit the specific target groups' needs. Next, choices must be made about the digital applications, services and devices, the pieces of knowledge, the examples of attitudes, etc. that can best illustrate the selected DigComp competences and levels for the implementation with the end users.
Step 3: Design and develop training courses
Before developing the training, perform an analysis to identify whether existing courses can address the training needs, and the gaps that remain to be addressed.
Delivery mode: decide which training delivery mode will be used, according to the project context and target users.
Decide whether MOOCs are a good option for the training delivery.
Openness: decide whether and how the training material developed will be made openly accessible.
Course design (see the considerations in the What is to be produced Section above).
For the courses to be developed, define and develop training content and activities consistent with the learning needs and target levels.
Consider who will deliver the training: trainers may need to be trained as well, as teaching "digital competences" is quite different from training on the use of digital devices, applications and services.
Step 4: Evaluation and assessment
Decide how the evaluation will be performed and credentials provided (see the S2 Assessing Competence Section).
Consider whether a self-assessment tool is needed to identify learners' previous knowledge, so as to allow their personal learning path to be customised based on the competence objectives to be acquired, possibly defined by the PDP (a sketch of such a gap analysis follows).
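As an illustration of how self-assessment results could feed a personalised learning path, the sketch below compares a learner's self-assessed proficiency with the target levels of a hypothetical PDP and keeps only the competences with a shortfall; all names and levels are invented.

```python
# Illustrative gap analysis between self-assessed proficiency levels and
# the target levels of a (hypothetical) Professional Digital Profile.
target_levels = {"1.1 Browsing, searching and filtering data": 4,
                 "4.1 Protecting devices": 3}
self_assessed = {"1.1 Browsing, searching and filtering data": 4,
                 "4.1 Protecting devices": 1}

# Keep only the competences where the learner is below target; these
# drive the selection of training modules for the personal path.
gaps = {comp: (self_assessed.get(comp, 0), target)
        for comp, target in target_levels.items()
        if self_assessed.get(comp, 0) < target}

for comp, (current, target) in gaps.items():
    print(f"{comp}: level {current} -> target {target}; assign matching modules")
```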
Tips
For Step 2:
Taking as an example DigComp competence 4.1 Protecting devices, the example "Is able to install an anti-virus" may be substantiated in various ways. It could be turned simply into the suggestion of a specific software tool and step-by-step guidance for the installation of its free version. Alternatively, it may include learning how to purchase online the licence for the professional version of that software, possibly together with knowing about the availability of several anti-virus software products on the market, and learning which criteria to follow in order to compare them and make a choice.
DigComp focuses attention on device protection and recommends that users should be able to install anti-virus software: all the other aspects just mentioned have to be addressed and decided in the implementation process.
For Step 3:
Detailed choices have to be made about learning activities, their duration and the materials to use, in view of each targeted competence and proficiency level. As there is no established system to guarantee ex ante that given choices will produce the expected learning outcomes, sharing these design choices with a larger group of experts and teachers can be a good option to improve the effectiveness of the design and to achieve wider consensus among those who will have to implement them. Feedback and adaptation mechanisms must also be envisaged, to incorporate lessons learnt from practice.
Liaise with relevant stakeholders that are knowledgeable about training design, the content of the course (possibly with some teaching experience) and the needs of the target group, so as to develop meaningful course content. These stakeholders can be VET organisations, employment services organisations, employers or employer associations, and unions.
Resources
DigComp into Action guide:
Provides a set of indications on Adaptation and specification (pp. 31-33), with a list of examples.
Provides a set of indications on Training Trainers and end-user learning (pp. 37-40), with a list of examples.
The Annex includes a detailed description of the cases provided in the Examples Section.
Related to the above cases, the following resources are also available:
C24 Digital Competence Self-Diagnosis Tool of the Andalusian Regional Government, Spain: Guide for cataloguing training resources based on DigComp, with a view to incorporating them into an information system (email contact).
T17 , a MOOC on DigComp, provides an introduction to DigComp and training at basic-intermediate level.
DigComp at Work report includes several resources:
Table T.4, under Support for adult learning, lists the cases that have designed, developed and delivered training offers and/or carried out workforce development.
Section 2.3 How DigComp is helping stakeholders, sub-section The use of DigComp to design training offers (pp. 26-27), details how each case has made use of DigComp.
The Annex includes a detailed description of the cases provided in the Examples Section, and related resources.
In addition, the following resources are available:
C1    learning resources (in Italian)
C1  Digital Competence Development Methodology
C2 Anpal's YouTube playlist contains all Prodigeo's videos
C2  TOOL 3. Digital Competence course videos on YouTube (in Italian) for a set of DigComp competences
C3  Guide for training providers to catalogue their offer (available by end 2020) according to DigComp categories, so as to obtain a DigComp label and to facilitate the matching of self-assessment test results (showing gaps in given competences) with available training offers
C3  Orientation Guide for intermediaries (available by end 2020), which explains how to use the self-assessment results to help customers choose/design effective learning paths towards selected job profiles
C3  has developed an ePortfolio as the storage component of the Ikanos Personal Learning Environment (iPLE). An example ePortfolio will be made available online to follow
C4  Syllabi in Polish and in English
C4  Learning outcomes in Polish
C4  DigComp 1.0 report in Polish
C4  DigComp Handbook for trainers and trainees (in Polish; partly in English and Ukrainian)
C4  DigComp handbook for trainers and trainees, available in two versions: volume 1 with 3 DigComp areas and volume 2 with the other 2 areas, or one book with all 5 areas. A table of contents in English is available
C5  Training Course
C6  MOOC on Essential digital skills for museum professionals
The author would like to thank Alessandro Brolpito (European Training Foundation),
Stefano Kluzer (Consultant), Roberto Lejarzegi (Ibermática, Spain), José Antonio
González Martínez and Graciela Parrilla Ramírez (TuCertiCyl, Regional Government of
Castilla y León, Spain) and Gabriel Ángel de la Cuesta Padilla (Regional Government of
Andalucía, Spain) for their review of earlier versions of this Guide, providing practical
and inspirational recommendations for its improvement.
The author would also like to acknowledge the valuable expert support provided by
her European Commission JRC colleagues Yves Punie and Marcelino Cabrera.
ACKNOWLEDGEMENTS
GETTING IN TOUCH WITH THE EU
In person
All over the European Union there are hundreds of Europe Direct information centres. You can
find the address of the centre nearest you at: https://europa.eu/european-union/index_en
On the phone or by email
Europe Direct is a service that answers your questions about the European Union. You can
contact this service:
by freephone: 00 800 6 7 8 9 10 11 (certain operators may charge for these calls),
at the following standard number: +32 22999696, or
by electronic mail via: https://europa.eu/european-union/index_en
FINDING INFORMATION ABOUT THE EU
Online
Information about the European Union in all the official languages of the EU is available on the
Europa website at: https://europa.eu/european-union/index_en
EU publications
You can download or order free and priced EU publications from EU Bookshop at: https://publications.europa.eu/en/publications. Multiple copies of free publications may be obtained by contacting Europe Direct or your local information centre (see https://europa.eu/european-union/index_en).
doi:10.2760/936769
ISBN 978-92-76-18581-9
KJ-NA-30204-EN-N
In this note I comment briefly on Keith Markus's illuminating article on Science, measurement, and validity: Is completion of Samuel Messick's synthesis possible? Markus's analysis bears directly on the controversial status of the consequential basis of test validity in relation to the more traditional evidential basis. After addressing some key points in his argument, I then comment more generally on sources of the controversy over the claim that empirical consequences of test interpretation and use constitute validity evidence.