
PERFORMANCE DATA GOVERNANCE AND MANAGEMENT IN HIGHER EDUCATION LEARNING AND TEACHING: A GUIDELINE

See also https://www.evalag.de/sqelt/ and https://ec.europa.eu/programmes/erasmus-plus/projects/eplus-project-de-tails/#project/b8a93e06-2000-4a82-9fac-90b3bcacadec
SQELT – ERASMUS+ Project 2017-20 – Intellectual Output O11: PDGM in HE L&T: A Guideline
SQELT PROJECT
SUSTAINABLE QUALITY ENHANCEMENT IN HIGHER EDUCATION LEARNING AND
TEACHING. Integrative Core Dataset and Performance Data Analytics
Key Action: Cooperation for innovation and the exchange of good practices
Action Type: Strategic Partnerships for higher education
Partners: EVALUATION AGENCY BADEN-WUERTTEMBERG, UNIVERSIDADE DE AVEIRO, BIRMINGHAM CITY
UNIVERSITY, UNIVERSITEIT GENT, UNIWERSYTET JAGIELLONSKI, UNIVERSITÄT FÜR WEITERBILDUNG KREMS,
UNIVERSITEIT LEIDEN, UNIVERSITÀ DEGLI STUDI DI MILANO, UNIVERSITETET I OSLO, CENTRO DE INVESTIGAÇÃO
DE POLÍTICAS DO ENSINO SUPERIOR
https://www.evalag.de/sqelt/
https://ec.europa.eu/programmes/erasmus-plus/projects/eplus-project-details/#project/b8a93e06-2000-4a82-9fac-90b3bcacadec
Intellectual Output O11:
PERFORMANCE DATA GOVERNANCE AND MANAGEMENT IN
HIGHER EDUCATION LEARNING AND TEACHING: A GUIDELINE1
Project coordinator/Contact person:
Prof. Dr. Dr. Theodor Leiber
Sect. 3: Science Support, evalag (Evaluation Agency Baden-Wuerttemberg)
PO Box 120522, D-68056 Mannheim, Germany
Tel 0621-12854525, leiber@evalag.de
http://www.evalag.de/
Release 30 November 2020
The creation of this resource has been (partially) funded by the ERASMUS+ grant programme of the European Union under grant
no. 2017-1-DE01-KA203-003527. Neither the European Commission nor the project's national funding agency DAAD can be
held responsible for the content or liable for any losses or damage resulting from the use of this resource.
1 This title replaces the title used in the SQELT Application Form, "Comprehensive Manual 'SQELT Core Dataset in L&T'", because of
a different and improved understanding of the intended scheme that emerged during the work in the SQELT project. The term "guideline" is used
in the understanding of "information (including data, knowledge, experience) intended to advise people on how something should be done or what
something should be".
IMPRINT
The SQELT Consortium Partners, November 2020
Evaluationsagentur Baden-Württemberg (evalag; Mannheim, Germany)
Universidade de Aveiro (UA, Aveiro, Portugal)
Birmingham City University (BCU, Birmingham, United Kingdom)
Universiteit Gent (UGhent, Ghent, Belgium)
Uniwersytet Jagiellonski (JU, Kraków, Poland)
Universität für Weiterbildung Krems (DUK, Krems, Austria)
Università degli Studi di Milano (UNIMI, Milan, Italy)
Centro de Investigação de Políticas do Ensino Superior (CIPES, Matosinhos, Portugal)
Universiteit Leiden (U Leiden, Leiden, The Netherlands)
Universitetet i Oslo (UiO; Oslo, Norway)
See also https://www.evalag.de/sqelt/ and https://ec.europa.eu/programmes/erasmus-plus/projects/eplus-project-details/#project/b8a93e06-2000-4a82-9fac-90b3bcacadec
Contact persons:
evalag: Prof. Dr. Dr. Theodor Leiber
University of Aveiro: Assoc. Prof. Dr. Maria J. Rosa
Birmingham City University: Dr. James Williams
Ghent University: Joke Claeys
Jagiellonian University in Kraków: Dr. habil. Justyna Bugaj
Danube University Krems: PD Dr. David Campbell
University of Milan: Assoc. Prof. Dr. Roberto Cerbino
CIPES: Prof. Dr. Cláudia Sarrico
Leiden University: Prof. Dr. Maarja Beerkens
University of Oslo: Prof. Dr. Bjørn Stensaker
Editorial: Theodor Leiber (evalag; Mannheim, Germany)
PERFORMANCE DATA GOVERNANCE AND MANAGEMENT IN
HIGHER EDUCATION LEARNING AND TEACHING: A GUIDELINE
Table of Contents
Prologue............................................................................................................................... 5
Used Abbreviations ............................................................................................................ 5
Executive Summary ............................................................................................................ 6
Introduction: The SQELT Strategic Partnership as a Case Study and Its Goals and
Methodology ........................................................................................................................ 7
The Benchlearning Model Applied ..................................................................................... 8
Areas of Benchlearning in Performance Data Governance and Management (PDGM)
and Their Strategic SWOT Analyses .................................................................................. 9
PDGM Policy ...................................................................................................................... 9
(Digital) Performance Data Management (PDM) System...................................................12
Performance Indicator (PI) Set ..........................................................................................13
Ethics of PDGM .................................................................................................................15
Conclusions ........................................................................................................................16
Open Questions and Limitations of the SQELT Case Study ...........................................17
References and Literature Overview .................................................................................18
Acknowledgment ................................................................................................................29
Appendix 1: SQELT Template of PDGM Policy ................................................................30
Policy Statement ...............................................................................................................30
Policy Provisions ...............................................................................................................31
Policy Review ....................................................................................................................36
Further Assistance ............................................................................................................36
Addendum 1: Data Quality Lifecycle ..................................................................................40
Addendum 2: (Performance) Data Governance Roles and Responsibilities ......................41
Appendix 2: SQELT Performance Data Management (PDM) Systems Model ................43
Appendix 3: SQELT Comprehensive PI Set for L&T in Higher Education ......................46
Pragmatic Understanding of Performance Indicators and Working Definitions ..................46
General Comments on the SQELT PI Set .........................................................................50
SQELT PIs for Teaching Competences and Processes ....................................................52
SQELT PIs for Learning Competences and Processes .....................................................55
SQELT PIs for Learning Outcomes and Learning Gain and Their Assessment .................56
Higher Education for Sustainable Development (HESD) Learning Goals .......................62
SQELT PIs for L&T Environment .......................................................................................71
Appendix 4: Learning Analytics ........................................................................................78
Learning Analytics and Performance Indicators .................................................................78
Twelve Principles of Learning Analytics .............................................................................79
Learning Analytics in the Six SQELT Case Study Universities ..........................................81
Focus Group Discussions and Interviews about Learning Analytics ...............................81
Abstracted SWOT Analysis of Learning Analytics ..........................................................82
Appendix 5: SQELT Ethical Code of Practice for (Performance) Data Management .....84
Executive Summary ..........................................................................................................84
Introduction and Overview .................................................................................................84
Conditions of Application of this Ethical Code ....................................................................86
Ethical Principles of Data Management .............................................................................87
Moral and Lawful Bases for Data Management .................................................................89
Individual Rights Related to Data Management .................................................................90
Insights and Development Options from the SQELT Case Studies ...................................92
Project Workbook Framework on Data Ethics ...................................................................94
Appendix 6: Exemplary Survey Questionnaire about PIs in Higher Education L&T .....95
Prologue
This Guideline for Performance Data Governance and Management (PDGM) in Higher Education Learning
and Teaching (L&T) is meant to be an inspirational guide: it addresses fundamental methodological issues and
clarifications alongside the implementation of that methodology and its practical application (including results of the
Erasmus+ Strategic Partnership SQELT), and it also warns of possible failures and threats when establishing
PDGM, including performance indicators (PIs), in higher education L&T. Thus, the guideline also
contributes to bridging the theory/practitioner gap to some extent. The guideline is based on the work of the
SQELT partner consortium carried out during the three years of project duration from December 2017 to
November 2020, including the evaluation of the developed PI set, public presentations at the SQELT
meetings and results published (and to be published) by SQELT partners and invited experts.
The guideline presents and discusses the main theoretical perspectives and general methodological ele-
ments of a comprehensive PI set for L&T in higher education institutions (HEIs) as well as relevant issues
concerning PDGM policies, Performance Data Management (PDM) Systems, Learning Analytics and Ethi-
cal Codes of Practice for PDM. In this sense, the guideline analyses basic elements and desiderata of
PDGM in the light of and informed by the European case study SQELT.
This guideline should be useful for any HEI interested in quality assurance and enhancement in L&T, because
the results of the SQELT project can be used by any HEI for the further development of performance
monitoring, evaluation and governance, in short: PDGM in L&T.
Used Abbreviations
DPDM – Digital performance data management
ECTS – European Credit Transfer System
EIOD – Evidence-informed organisational development
ESD – Education for Sustainable Development
FTE – Full-time equivalent
GDPR – General Data Protection Regulation (of the European Union)
HEI(s) – Higher education institution(s)
HESD – Higher Education for Sustainable Development
ICT – Information and communication technology
LMS – Learning management system
L&T – Learning and teaching
MOOC – Massive Open Online Course
PDGM – Performance Data Governance and Management
PDM – Performance Data Management
PDRLA – Personalised data required for Learning Analytics; such data are, as a rule, under specific protection by national data
and privacy law and particularly by the GDPR (European Union General Data Protection Regulation)
PI(s) – Performance indicator(s)
QM – Quality management
SAS – Student admission system
SDG(s) – Sustainable Development Goal(s)
SDL – Self-Directed Learning
SIS – Student information system
SQELT – Sustainable Quality Enhancement in Learning and Teaching
SUSTEX – (Satisfaction) Surveys of students, surveys of teaching staff and assessment reports by experts/peers (other than
students and teaching staff) [abbreviating acronym for three basic appropriate ways of performance data assessment]
SWOT analysis – Strengths, weaknesses, opportunities and threats analysis
Executive Summary
This Guideline for Performance Data Governance and Management (PDGM) in Higher Education Learning
and Teaching (L&T) investigates PDGM of L&T in higher education institutions (HEIs). The analysis is
based on the Erasmus+ Strategic Partnership SQELT (Grant no. 2017-1-DE01-KA203-003527), spanning
nine European countries and including six European case study sample universities: University of
Aveiro (Portugal); Birmingham City University (United Kingdom); Ghent University (Belgium); Jagiellonian
University in Kraków (Poland); Danube University Krems (Austria); University of Milan (Italy). An intensive
case study was carried out during the SQELT project including generic results about basic elements and
desiderata of PDGM. Core to this case study was a 14-step benchlearning process that focused on the four
basic benchlearning areas of PDGM: PDGM Policy; (Digital) Performance Data Management ((D)PDM)
System; Performance Indicator (PI) Set; and Ethics of PDGM. Among other things, this case study was
based on document analysis, focus group interviews, strategic SWOT analyses and an online survey with
European (and a few international) stakeholders.
In the Introduction, the Guideline reports on the SQELT Erasmus+ Strategic Partnership as a case study,
including its goals and methodology. Then, the applied benchlearning model is described, which helps to
differentiate and specify the benchlearning process and opens up application possibilities. Subsequently, the
above-mentioned four basic benchlearning areas of PDGM are introduced and their strategic SWOT
analyses are carried out, leading to recommendations for taking action to use strengths to overcome
weaknesses, to exploit opportunities and to avoid threats, or to recognise the requirement to introduce additional
interventional measures for fixing the deficits identified in the SWOT analysis.
This Guideline for PDGM in Higher Education L&T is complemented and completed by six Appendices giv-
ing a SQELT Template of PDGM Policy (Appendix 1), a SQELT PDM Systems Model (Appendix 2), a
SQELT Comprehensive PI Set for L&T in Higher Education (Appendix 3), an overview of Learning Analytics
(Appendix 4), a SQELT Ethical Code of Practice for (Performance) Data Management (Appendix 5) and an
Exemplary Survey Questionnaire about PIs in Higher Education L&T (Appendix 6).
Based on the data and information about the SQELT benchlearning process, the PDGM SWOT analyses
and the detailed Appendices, this Guideline for PDGM in Higher Education L&T should support activities of
any HEI to develop, establish and further improve an adequate PDGM in L&T.
Keywords: benchlearning; ethics of performance data governance and management (PDGM); Learning
Analytics; PDGM policy; performance indicators; strategic SWOT analysis
Introduction: The SQELT Strategic Partnership as a Case Study and
Its Goals and Methodology
The SQELT Strategic Partnership (SQELT 2020) is an in-depth case study about basic elements and desid-
erata of PDGM2 in L&T in HEIs. As such it relies, as one source of empirical data and information, on a se-
lected sample of six European universities: Danube University Krems, Austria; Ghent University, Belgium;
University of Milan, Italy; Jagiellonian University in Kraków, Poland; University of Aveiro, Portugal; and Bir-
mingham City University, United Kingdom.3 These universities, together with four further institutional part-
ners from Germany, the Netherlands, Norway and Portugal, formed the strategic partnership of the SQELT
project (SQELT 2020) which was co-funded by the European Commission in the context of the Erasmus+
programme. In terms of size, the partnership universities covered a range from 9,000 to 63,000 students.
The four other SQELT partners were the German evaluation agency Evaluationsagentur Baden-Württemberg,
which coordinated the project, and three quasi-external experts who were not permanently involved in the development
of every project output and were in charge of giving critical feedback on the project outcomes at certain
developmental steps.
The SQELT project's core steps and goals are shown in Figure 1, which depicts the scheme of the project's
workflow and the mainly involved stakeholders and participants. The basic idea of the project was to develop a
core set of elements required for PDGM in higher education L&T. In the SQELT partners' understanding,
the two basic PDGM elements are a PDGM Policy and a PI set. In the SQELT project, pursuing the goals
mentioned above was supported by benchlearning at the partner HEIs about PDGM policy, PDM system
models, a comprehensive PI set for L&T, the role of Learning Analytics, and an ethical code of practice for
PDM.
Figure 1: Workflow of the SQELT Erasmus+ Strategic Partnership, adopted from (Leiber 2020, Fig. 1)
In accordance with the characterisation of case study research (e.g. Harrison et al., 2017; Ridder, 2017),
this Guideline is based on an in-depth case study that was realised by the SQELT project as a whole.
This in-depth case study focused on contextualised PDGM systems at six European HEIs (representing
the bounded system case) and used multiple sources of evidence for a descriptive, exploratory and
2 Throughout this guideline, if "data" and "information" are not explicitly distinguished, the notion of "data" is meant to comprise
quantitative data as well as qualitative information.
3 This text closely follows (Leiber 2020).
evaluative case study design (Harrison et al. 2017, Section 4) which should tend to produce generic results.
The sources of evidence comprised focus group interviews with several stakeholder groups (teachers, students,
quality management (QM) staff, leadership); an online survey with the same stakeholder groups, approached
at national and international levels; expert feedback on selected project outputs; a strategic
SWOT analysis (strengths, weaknesses, opportunities, threats); a comprehensive review of the research
literature; and discussion groups at several multiplier events.
The above-mentioned goals of the SQELT project also comprised the development of knowledge and tools
to support the optimisation of documentation and monitoring processes of L&T in HEIs (e.g., data integration,
standardisation, reporting efficiency) in the service of different purposes such as reporting and
evidence-informed decision-making. The project was also meant to contribute to the "research on indicators of
teaching [and learning] quality", as recommended not so long ago to the European Parliament
(Wächter et al. 2015, p. 78).
The SQELT project's development and progress was structured into eleven Intellectual Outputs (see Table
1), the most important of which are available on the SQELT website (SQELT-MIO 2020).
Table 1: Intellectual Outputs of the SQELT Erasmus+ Strategic Partnership
O2 – Impact analysis questionnaire (initial PI set)
O1 – 6 Benchlearning Reports
O3 – 6 Baseline Reports
O4 – Comprehensive PI set I
O5 – Comprehensive PI set II
O6 – Comprehensive PI set III
O7 – Evaluation Report
O8 – PDGM Policy & Ethical Code of Practice for (Performance) Data Management
O9 – Comprehensive PI set IV
O10 – Report on Various Stakeholders' Assessment of the SQELT PI Set (including a Survey about Selected PIs)
O11 – SQELT Guideline
O12 – Peer-reviewed publications
The Benchlearning Model Applied
According to the SQELT partners, systematic and substantial benchlearning is fundamental to any
development and implementation process of PDGM (as it was to the SQELT project). In the background of
such an organisational development and learning process is therefore an elaborate benchlearning model, which
comprises an analysis phase that may be best realised by a strategic SWOT (strengths, weaknesses,
opportunities, threats) analysis as described below. Figure 2 shows that such a benchlearning process can be
subdivided into 14 interrelated steps which can be grouped under the following five main steps: Planning
and preparation; Analysis; Integration; Action; and Maturity (cf. Camp 1994, p. 21; Freytag & Hollensen
2001). This concept of benchlearning relies on the understanding that benchlearning is a way of monitoring and
assessing the strategies and performance of an institution or organisation against comparable, good-practice4
competitors, and that it includes an ongoing performance improvement strategy and change management
process.5
For reasons of limited space and because of its importance for the present Guideline, the focus here is
mainly on the second main step of the benchlearning process: this Analysis step comprises the benchlearning
sub-steps 6 and 7 (see Figure 2) that should
• Identify current performance gaps, for example by means of a SWOT analysis;
• Identify future performance potential, i.e. options for organisational learning, for example by elaboration
of a strategic SWOT analysis (the core of benchlearning) including insights, assessments and
recommendations (see below).
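The five-phase structure of the benchlearning process and its Analysis sub-steps can be sketched as a small lookup structure. This is a hypothetical illustration: only the five phase names and sub-steps 6 and 7 come from the text, so the allocation of the remaining steps to phases is an assumption for the sketch.

```python
# Hypothetical sketch of the 14-step / five-phase benchlearning structure.
# Only the five phase names and Analysis sub-steps 6 and 7 are from the
# text; the remaining step-to-phase allocation is an assumed illustration.

BENCHLEARNING_PHASES = {
    "Planning and preparation": [1, 2, 3, 4, 5],   # assumed allocation
    "Analysis": [6, 7],                            # per the text
    "Integration": [8, 9, 10],                     # assumed allocation
    "Action": [11, 12, 13],                        # assumed allocation
    "Maturity": [14],                              # assumed allocation
}

ANALYSIS_SUBSTEPS = {
    6: "Identify current performance gaps (e.g. via a SWOT analysis)",
    7: "Identify future performance potential (strategic SWOT analysis)",
}

def phase_of(step: int) -> str:
    """Return the main benchlearning phase a step number belongs to."""
    for phase, steps in BENCHLEARNING_PHASES.items():
        if step in steps:
            return phase
    raise ValueError(f"unknown benchlearning step: {step}")
```

Encoding the grouping this way makes it easy to check, for instance, that a given sub-step falls within the Analysis phase before applying the SWOT procedure to it.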
As shown in Figure 2, the other main steps of the benchlearning process capture the Planning and Preparation
phase (main step 1), the Integration phase (main step 3), the Action phase (main step 4) and the
Maturity phase (main step 5). The time schedules for the different steps of the benchlearning process can be
4 Relying on 'good practice', while 'best practice is a myth' (Fernie & Thorpe).
5 For an embedding of such organisational development processes into a Seven-Step Action Research Process Model
(SSARPM) comprising the Deming (Plan-Do-Check-Act/PDCA) cycle, see (Leiber 2019b, pp. 324ff.).
very different depending on the type and complexity of the benchlearning entities. In the present case
of PDGM of L&T in HEIs, a completed benchlearning process up to Maturity will usually take years if the
start is from 'zero'.6 This is in accordance with the experience of the SQELT Strategic Partnership, which had
a formal lifetime of 36 months under the conditions of Erasmus+ co-funding. Therefore, it was not to be
expected that the full range of the 14 benchlearning steps could be completed within the project time. Instead,
the Planning and Preparation phase and the Analysis phase could essentially be completed during the
project period, while most of the Integration phase and the phases of Action and Maturity are left to the
SQELT partner universities for further work in the near and mid-term future.
Figure 2: The 14 steps of benchlearning processes, adopted from (Camp 1994, p. 21) and further devel-
oped
Areas of Benchlearning in Performance Data Governance and Man-
agement (PDGM) and Their Strategic SWOT Analyses
Based on the SQELT project, the following entities or areas for benchlearning around PDGM in HEIs were
identified in the context of this Guideline: 1) PDGM Policy; 2) (Digital) PDM system; 3) Performance indicator
(PI) set; 4) Ethics of PDGM; 5) Resources, for example IT resources and software solutions, and human and
financial resources.7 Of course, this is a pragmatic choice that could be diversified or extended.
PDGM Policy
According to contemporary understanding of HEIs as autonomous, multiple-hybrid organisations (cf. Leiber
2019b), an actionable PDGM Policy is indispensable for them to fulfil their basic organisational needs: It
regulates issues of strategy, governance and management as well as ethics and responsibility, including
sustainability, quality, accessibility and usability of information and data about HEI performance as well as
investments of human and financial resources. Therefore, the core purposes of a generic PDGM policy in-
clude
• Defining roles and responsibilities for different data creation and usage types, cases or situations,
and establishing clear lines of accountability.
6 This is a qualitative estimate based on experience with organisational development of QM systems in HEIs.
7 This benchlearning area was not further pursued in the SQELT Erasmus+ Strategic Partnership since comparative data could
not be provided (because of limited project resources).
• Developing good quality practices for effective management and protection of (performance) data.
• Protecting the HEI's data against internal and external threats, particularly assuring protection of
privacy, academic freedom, intellectual property, information security and compliance.
• Ensuring that the HEI handles (performance) data in accordance with applicable laws, regulations
and standards.
• Ensuring that the HEI effectively documents a (performance) data trail within the processes associated
with accessing, retrieving, exchanging, reporting, managing and storing data.
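As a minimal illustration (not part of the SQELT outputs), the five core purposes above can be treated as a coverage checklist when drafting an institutional PDGM policy. The purpose keys and the example draft below are invented for this sketch; the keys merely paraphrase the bullet points.

```python
# Hypothetical coverage checklist for the five core purposes of a generic
# PDGM policy; the keys paraphrase the bullet list above.

CORE_PURPOSES = [
    "roles_and_responsibilities",  # roles and clear lines of accountability
    "data_quality_practices",      # effective management and protection of data
    "data_protection",             # privacy, academic freedom, IP, security
    "legal_compliance",            # applicable laws, regulations, standards
    "data_trail_documentation",    # accessing, retrieving, reporting, storing
]

def coverage_gaps(draft_sections):
    """Return the core purposes a policy draft does not yet address, in order."""
    return [p for p in CORE_PURPOSES if p not in draft_sections]

# Example: an (invented) draft addressing only three of the five purposes
draft = {"roles_and_responsibilities", "data_protection", "legal_compliance"}
```

Running `coverage_gaps(draft)` would then point a policy drafting group to the purposes still missing from the document, before it goes into review.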
Against this backdrop, a SWOT analysis of PDGM, including iterative steps of reflective refinement, is an
appropriate method and tool to realise the second main step of benchlearning, Analysis, which comprises
identifying current performance gaps and future performance potential (of PDGM) (see Figure 2). Such a
SWOT analysis must be based on relevant and reliable data and information about PDGM of the respective
study case (cf. steps 4 and 5 of the Planning and Preparation phase of the benchlearning process, see
Figure 2).
In the SQELT case study, it was mainly the Benchlearning Reports and Baseline Reports (SQELT-MIO
2020), including focus group interviews with engaged university stakeholders, that served for collecting the
required benchlearning data and information. These reports contained descriptions of (aspects of) the
partner HEIs' PDGM approaches and culminated in six university-specific SWOT analyses. Based on these, the
most widespread and important strengths and weaknesses of PDGM were combined, while no notable
opportunities and threats could be named (see Table 2). These SWOTs may support and guide similar
analyses by any interested HEIs on their way to establishing and improving a PDGM Policy.
Table 2: Abstracted SWOT analysis of PDGM of the six SQELT case study universities (information source:
focus group interviews and Benchlearning Reports), adopted with revisions from (Leiber 2020, Table 2)
Strengths | Weaknesses
1) | 1) No (well-)developed PDGM at the institutional and/or faculty/department levels
2) | 2) No or poor representation of PDGM in mission statements on various organisational levels
3) | 3) Performance data and information is mainly, if not exclusively, used for reporting (accountability towards higher education politics and the public), less for the enhancement of performance
4) | 4) Lack of leadership commitment to PDGM
5) | 5) A failing coordination between the goals of the HEI's management and the goals of the faculties with respect to PDGM
Opportunities | Threats
– | – [No notable threat can be named]
In the SQELT project the abstracted SWOTs of PDGM were used to implement a strategic SWOT analysis
(see Tables 3 and 4). For that purpose, the SWOT analysis had to be supplemented by and integrated into
a strategy matrix, the general template form of which is given in Table 3. Such a strategy matrix supports the
analysis of how strengths can be used to overcome weaknesses, to exploit opportunities and to avoid
threats (Leiber et al. 2018, p. 355). Accordingly, the central steps of the strategic SWOT analysis are that
the empty cells in template Table 3 are filled with recommendations for taking action to utilise strength
number i) [i) = 1), 2), ...] to overcome weakness number j) [j) = 1), 2), ...], to exploit opportunity number k) [k) = 1),
2), ...] and to avoid threat number l) [l) = 1), 2), ...].8 In general, it may also happen that some weaknesses,
opportunities and threats cannot be adequately treated by any of the identified strengths. In such
8 In (informal) logical terms, the described central steps are material inferences (Brigandt 2010).
cases, other measures must be looked for (see Table 3, lower lines), thereby introducing an additional
creative-strategic element into the process of a strategic SWOT analysis.
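The cell-filling procedure just described can be sketched as a small data structure. This is an illustration under invented names (not SQELT tooling): strength-based recommendations fill cells indexed by (strength i, W/O/T item), and items no identified strength can treat fall through to the "other measures" row.

```python
# Hypothetical sketch of the strategy matrix described above.
from collections import defaultdict

class StrategyMatrix:
    def __init__(self, strengths, weaknesses, opportunities, threats):
        self.strengths = strengths
        self.items = {"W": weaknesses, "O": opportunities, "T": threats}
        # (strength index, kind, item index) -> recommendation text
        self.cells = {}
        # (kind, item index) -> other measures (the creative-strategic element)
        self.other_measures = defaultdict(list)

    def recommend(self, s_idx, kind, item_idx, text):
        """Fill a cell: use strength s_idx against W/O/T item item_idx."""
        self.cells[(s_idx, kind, item_idx)] = text

    def add_other_measure(self, kind, item_idx, text):
        """Record a measure for an item no identified strength can treat."""
        self.other_measures[(kind, item_idx)].append(text)

    def untreated(self):
        """List W/O/T items with neither a strength-based strategy nor another measure."""
        out = []
        for kind, items in self.items.items():
            for j in range(len(items)):
                covered = any((i, kind, j) in self.cells
                              for i in range(len(self.strengths)))
                if not covered and (kind, j) not in self.other_measures:
                    out.append((kind, j))
        return out
```

The `untreated()` check mirrors the remark above: once the strength-based cells are filled, whatever remains uncovered signals where additional creative-strategic measures must still be sought.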
Table 3: Template of strategy matrix for SWOTs of a selected area of analysis or dimension of a bench-
learning object, adopted from (Leiber et al. 2018, p. 355, Table 3)
The filled-in strategy matrix of PDGM SWOTs that was developed in the SQELT intensive case study is
given in Table 4, thus presenting recommendations for action for evidence-informed organisational development
(EIOD) relating to PDGM. As the entries in Table 4 show, it is often a desideratum to have a widely
shared understanding of PDGM at leadership level and other relevant institutional levels. More specifically,
there is often a need to clarify and understand the wide variety of purposes of PDGM, namely: evaluation;
controlling; budgeting; motivating people; promoting issues; celebrating successes; learning from insights;
and improving on performance-related issues. This must include linking these goals to the various
stakeholder groups of teachers, students, researchers, department heads/programme directors, deans, policy
makers, donors, and administrators, to mention a few. Another recommendation for EIOD is that a
PDGM Policy should be made explicit and incorporated into the strategic documents of HEIs. At SQELT
sample HEIs where PDGM was restricted to reporting and controlling, it could be recommended to develop
PDGM dimensions focusing on performance enhancement. Finally, working communication and
coordination channels between the central management and faculties of HEIs with respect to PDGM-related
issues are often not well developed, and these deficits should be overcome.9 These recommendations for
organisational development in the area of PDGM are implemented in the SQELT project through the proposal
to establish a PDGM policy (Appendix 1): it should enable the identified weaknesses to be overcome.
Table 4: Strategy matrix of PDGM SWOTs and its recommendations for EIOD, adopted from (Leiber 2020,
Table 4) (n/a = not applicable)
Strengths-based strategies to overcome weaknesses (S/W):
W1) Establish shared understanding of the various purposes (evaluate; control; budget; motivate; promote; celebrate; learn; improve) of PDGM at institutional leadership level and across the largely autonomous institutional (sub-)units
W2) Introduce PDGM policy in HEI's strategy documents (e.g., mission statements, structure and development plans) on various organisational levels
W3) Develop PDGM focus on performance enhancement (to supplement reporting and controlling) (e.g., establish improvement-oriented QM)
W4) Improve on leadership commitment to PDGM (e.g., define relevant leadership roles in PDGM)
W5) Establish working communication and coordination channels between HEI management and the faculties with respect to PDGM-related issues (e.g., define the roles of leadership, management and academics)
Strengths-based strategies to take advantage of opportunities (S/O): n/a
Strengths-based strategies to avoid threats (S/T): n/a
Other measures (M/W, M/O, M/T): n/a
9 A broadly applicable template of a PDGM Policy, which should help to implement these insights, is presented in Appendix 1.
General scheme of a strategy matrix (legend):
Columns: Weaknesses (W) 1), 2), …; Opportunities (O) 1), 2), …; Threats (T) 1), 2), …
Row "Strengths (S) 1), 2), …": strengths-based strategies to overcome weaknesses (S/W); strengths-based
strategies to take advantage of opportunities (S/O); strengths-based strategies to avoid threats (S/T)
Row "Other measures 1), 2), …": other measures to overcome weaknesses (M/W); other measures to take
advantage of opportunities (M/O); other measures to avoid threats (M/T)
SQELT ± ERASMUS+ Project 2017-20 ± Intellectual Output O11: PDGM in HE L&T: A Guideline 12/95
It is worth noting that for any SWOT analysis, particularly a strategic SWOT analysis, to become useful,
the strengths, weaknesses, opportunities, and threats must be clearly defined, characterised and
prioritised. Only if these requirements are fulfilled can a SWOT analysis provide reliable information
and contribute to reasonable strategy development in the form of recommendations for action. Fulfilling
these requirements is not trivial, in turn, because restrictions with regard to clarity of definitions
and priorities abound and stem, for example, from ambiguities and difficulties in differentiating between
strengths and opportunities on the one hand and weaknesses and threats on the other hand.10 It is also
apparent from Tables 2 and 4 that a (strategic) SWOT analysis does not solve the problem at hand per se, or
overcome the identified performance gap; rather, it merely suggests recommendations for EIOD
and very first steps to proceed. As noted above, these recommendations then need to be elaborated into
broader organisational development strategies and measures.
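The scaffolding of a strategy matrix from prioritised SWOT lists, as described above, can be sketched in code. This is an illustrative sketch only (the class and function names are assumptions, not part of the SQELT toolbox): cells start as None, i.e. "n/a", and are filled with recommendations for action during the analysis.

```python
from dataclasses import dataclass

@dataclass
class SwotItem:
    """One clearly defined, characterised and prioritised SWOT entry."""
    kind: str   # "S", "W", "O" or "T"
    index: int  # priority rank within its kind
    text: str

def strategy_matrix(strengths, weaknesses, opportunities, threats):
    """Build the empty strategy matrix: one row per strength plus an
    'other measures' row (M); one column per weakness, opportunity
    and threat. None stands for 'n/a' until a strategy is entered."""
    rows = [f"S{s.index}" for s in strengths] + ["M"]
    cols = ([f"W{w.index}" for w in weaknesses]
            + [f"O{o.index}" for o in opportunities]
            + [f"T{t.index}" for t in threats])
    return {row: {col: None for col in cols} for row in rows}

matrix = strategy_matrix(
    strengths=[SwotItem("S", 1, "improvement-oriented PI conceptualisation")],
    weaknesses=[SwotItem("W", 1, "reliability of PI data questionable")],
    opportunities=[SwotItem("O", 1, "introduce additional PIs")],
    threats=[SwotItem("T", 1, "reduction to a few simple quantitative PIs")],
)
matrix["M"]["W1"] = "implement quality assurance of data acquisition"
```

Such a structure makes the completeness requirement checkable: any cell left as None is an explicit "n/a" rather than an overlooked combination.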
In addition to the above-discussed SWOTs of PDGM Policy, the following weaknesses and threats were
identified at partner HEIs of the SQELT case study (while no notable strengths and opportunities can be
named11): 1) Bottlenecks in communication exist, as performance data and information are accessible only
to a few responsible people (weakness). 2) There is a lack of integrated digital performance data
management (DPDM) systems (e.g., an elaborate data warehouse) covering all PIs, while parallel island
solutions prevail, i.e., numerous performance data and information are stored locally and in unstructured
forms, leading to difficulties in operationalising them systematically (weakness). 3) Performance data
reporting sometimes depends on the commitment of individual persons such as programmes' directors
(weakness). 4) At certain HEIs, different subject areas are under different ministerial authorities (e.g.,
medicine and other faculties of a university), leading to different and sometimes conflicting PDGM
policies (threat). 5) At certain HEIs, an imbalance towards policy-driven PIs exists (e.g., focus on
economy PIs; available performance data is partly not analysed, or analyses are not published "because of
political decisions", which overlay or turn off evidence-informed performance management) (threat). 6) A
widespread issue is the complicatedness of decision-making processes because of an institutionalised
understanding of open-ended, knowledge-based deliberative decision-making and acting in the collegial
university of academics (weakness); this weakness counteracts the broad and effective acceptance of a
PDGM Policy and the idea of the university as an organisation as such. 7) At certain HEIs there is low
involvement of users in the design and validation processes of the PDM-suggested improvements to be
implemented (weakness). 8) Often, relevant PI data and information are not available to every relevant
stakeholder (weakness). 9) It happens that ministry-driven PI sets are put in place although they do not
entirely fit the HEI's profile and needs (threat). 10) Ministry-driven changes in PDM of higher education
could restrict the autonomy of HEIs and faculties, e.g., in the context of PDM relating to debates about
student fees, value for money etc. (threat).12
(Digital) Performance Data Management (PDM) System
To make performance data and information operational and coherent, a digital PDM (DPDM) system is
required at any contemporary HEI. Such a DPDM system operationalises stakeholders' usage of valid and
reliable performance data and information by regulating the collection, processing, categorisation and
aggregation of performance data and information. Another practical task often to be solved by a PDM
system is to match different performance data (management) systems and databases.
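The matching task just described can be illustrated with a minimal sketch. The two CSV exports and their column names below are hypothetical examples of "island" systems keyed by a shared programme identifier; a real DPDM system would of course add validation and provenance tracking.

```python
import csv
from collections import defaultdict
from io import StringIO

def merge_pi_sources(sources, key="programme_id"):
    """Merge PI records exported from separate ('island') systems into
    one record per programme, so that indicators stored in different
    databases can be reported coherently. `sources` is an iterable of
    CSV texts; columns other than the key are treated as PI names."""
    merged = defaultdict(dict)
    for text in sources:
        for row in csv.DictReader(StringIO(text)):
            prog = row.pop(key)
            merged[prog].update(row)  # later sources add further PIs
    return dict(merged)

# Two hypothetical exports: enrolment data and completion data
a = "programme_id,enrolled\nBA-01,120\nBA-02,80\n"
b = "programme_id,completion_rate\nBA-01,0.71\nBA-02,0.64\n"
print(merge_pi_sources([a, b])["BA-01"])
# → {'enrolled': '120', 'completion_rate': '0.71'}
```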
In the SQELT case study, the following weaknesses and threats of PDM systems were identified at certain
sample HEIs; they may help universities guide their own attempts to establish and improve a (D)PDM
system: 1) Resources allocated for the implementation and permanent functioning of the DPDM model are
sometimes/often not adequate (weakness), particularly against the backdrop of structural underfinancing
in L&T (threat). 2) Various uncoordinated and/or incompatible software solutions in DPDM are used in
certain HEIs (weakness), which complicates the integration of different PI data. 3) While Learning
Analytics can be understood as an integral part of a DPDM system, it is still in its infancy at most
sample HEIs and at many HEIs in general (weakness) (also cf. HEC 2016).
10 Experience with SWOT analysis shows that such semantic ambiguities occur more often than one might expect. For further
limitations of SWOT analyses cf. (Leiber et al. 2018, pp. 353-354).
11 Not all SQELT partners may agree with this statement because there is not always accordance in how SWOTs are identified
and assigned.
12 For reasons of limited space, these weaknesses and threats are not further analysed here as part of a strategic SWOT matrix.
Within the SQELT case study, the benchlearning area of (D)PDM systems could not be fully exploited
because of a lack of available resources. However, some insights useful for benchlearning around PDM
systems could be developed and analysed, see Appendix 2. Accordingly, the identified weaknesses and
threats just mentioned can be approached through the SQELT project's proposal to establish a (D)PDM
system: it should make it possible to overcome the weaknesses and counteract the threats.
Performance Indicator (PI) Set
It can be considered virtually indisputable that quality assurance and quality development in higher
education cannot be achieved without a powerful set of quantitative and qualitative PIs (e.g. cf. Leiber
2019a).13 In other words, a suitable set of PIs to monitor, measure, process, store and report information
and data related to certain performance areas including L&T is at the core of any PDGM in higher education.
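Since the text stresses that a PI set must accommodate both quantitative and qualitative indicators, a minimal sketch of such a mixed PI record may help; the field names here are illustrative assumptions, not SQELT definitions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerformanceIndicator:
    name: str
    performance_area: str          # e.g. "L&T"
    quantitative: bool
    unit: Optional[str] = None     # only meaningful for quantitative PIs
    rubric: Optional[str] = None   # descriptive scale for qualitative PIs

    def validate(self) -> bool:
        """A quantitative PI needs a unit; a qualitative PI needs a rubric."""
        if self.quantitative:
            return self.unit is not None
        return self.rubric is not None

pi_set = [
    PerformanceIndicator("graduation rate", "L&T", True, unit="%"),
    PerformanceIndicator("quality of feedback to students", "L&T",
                         False, rubric="4-level descriptive scale"),
]
assert all(pi.validate() for pi in pi_set)
```

The point of the `validate` check is merely to make explicit that qualitative PIs are first-class members of the set rather than an afterthought.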
As done for PDGM Policy and the DPDM system, a strategic SWOT analysis was also carried out for the third
benchlearning area of performance indicators in the SQELT case study, again comprising a SWOT table
and a related strategy matrix with its recommendations for EIOD to overcome weaknesses, exploit
opportunities and encounter threats (see Tables 5 and 6). As before, the identified SWOTs occur at
certain sample HEIs but not necessarily at the majority or all of them.
In general, the situation with PIs at the different sample HEIs of the SQELT case study was rather diverse
and complicated as well, see also the SQELT Benchlearning Reports and Baseline Reports (SQELT-MIO 2020).
This is also reflected by the SWOT analysis shown in Table 5. For example, the stakeholders' interviews at
the sample universities showed that the view is widespread that PIs are (purely) quantitative magnitudes,
and that the interviewees often found it difficult to accept qualitative PIs as PIs and to integrate them
into their PI set.14
Table 5: SWOT analysis of PIs, adopted with revisions from (Leiber 2020, Table 5) (n/a = not applicable)

Strengths:
1) Availability of improvement-oriented conceptualisation of existing (quantitative) PIs of L&T
2) High comparability degree of (quantitative) PIs in national higher education system because of
Ministry-driven standardisation
3) Availability of close-to-complete HEI-specific set of quantitative PIs

Weaknesses:
1) Not all (quantitative) PIs that could be relevant for L&T quality improvement at the HEI are defined
and/or collected and/or used
2) Existing small PI collection fails to adequately address current needs of the HEI (e.g., because PIs
are driven by higher education politics)
3) Reliability of PI data and information is often questionable

Opportunities:
1) Introducing additional (quantitative) PIs in L&T and completion towards close-to-complete,
HEI-specific set (e.g., filling gaps; completing profile such as continuing education and Lifelong
Learning; Learning Analytics; Education for Sustainable Development)
2) Gaining more transparency with respect to organisational performance through use of internal
(quantitative) PIs
3) Enhancing the availability of data and information on social impact of HEI performance after
integration in national students' survey

Threats:
1) Expectation of the environment that HEIs can or will be characterised and qualified by a few simple
(quantitative) PIs (e.g., based on rankings)
2) Danger of reducing PDM to only quantitative (under-complex) PIs15
As Table 5 shows, some of the SQELT case study universities report the strength of having improvement-
oriented PI sets in place; one university assesses the high comparability degree of PIs in the national
higher education system as a strength (even though it is Ministry-driven), while a further university
seems to have a close-to-complete HEI-specific set of (quantitative) PIs. Four typical weaknesses of PI
sets that were identified in the SQELT sample HEIs are also depicted in Table 5: 1) It occurs that not
all PIs that could be relevant for L&T quality improvement at the HEI are defined and/or collected and/or
used. Exemplary deficits are that teachers' points of view are missing or misrepresented in the PI sets,
and that the broad topic of student assessments of various issues is not looked at.16 2) At some sample
universities, existing small PI collections failed to adequately address the current needs of the
institution, for example because the PIs were introduced by higher education politics instead of being
designed by HEI-internal stakeholders. 3) It is well known, at least from anecdotal narratives circulating
in higher education systems, that the reliability of PI data and information is often questionable; this
widespread problem is also present in the SQELT case study universities. Ubiquitous reasons are that data
are collected through faculty or third-party data services while they are processed by university staff,
or that various mechanisms exist for collecting data and information which are not integrated and
sometimes hard or even impossible to integrate into a single PDM system. 4) At some HEIs it is criticised
that (quantitative) PIs are developed that do not adequately grasp a certain HEI's performance.
13 This does not mean overlooking that there are stakeholders of higher education who think that PIs, in particular quantitative
PIs, are dispensable or, even more than this, that the use of PIs is corrupting academics' quality approaches. Such thoughts were
expressed, for example, in full-text answers to the SQELT online survey reported in Intellectual Output O10 (see SQELT-MIO
2020).
14 Such an opinion was also given in full-text answers to the SQELT online survey reported in Intellectual Output O10 (see
SQELT-MIO 2020).
15 This point was also reported in full-text answers to the SQELT online survey (see Intellectual Output O10, SQELT-MIO 2020).
The strategic SWOT analysis for the benchlearning area of PIs of the SQELT university sample is completed
by asking how the identified strengths of PI models can be used to overcome the weaknesses, exploit the
opportunities and encounter the threats described in Table 5. In the present case, however, it turns out
that the strengths mentioned in Table 5 cannot be used. In consequence, other intervention measures must
be applied. These follow immediately from the idea that the deficiencies and desiderata incorporated in
the weaknesses, opportunities and threats mentioned in Table 5 should be fixed. As a first step,
respective general recommendations for action are depicted in the last line of Table 6. At this point it
is worthwhile to remember that the adoption and development of an HEI-specific PI set selected from the
SQELT comprehensive PI set (Appendix 3) would contribute to dealing constructively with most of the
weaknesses, opportunities and threats listed in Table 5. In other words, the SQELT comprehensive PI set
provides the means for realising the recommendations for action given in the last line of Table 6.
Table 6: Strategy matrix of PI SWOTs and its recommendations for EIOD, adopted from (Leiber 2020, Table 6)
(n/a = not applicable)
Finally, it is worth noting that the above SWOT analysis of PIs of L&T in higher education can be
complemented by some of the results of the SQELT online survey about various stakeholders' assessments of
a selection of PIs (SQELT Intellectual Output O10) from the SQELT comprehensive PI set (SQELT Intellectual
Output O9) (SQELT-MIO 2020). In particular, some of the full-text answers of the survey respondents
recognise weaknesses 1) and 3) and threats 1) and 2) depicted in Table 5.
16 Such assessments were also given in full-text answers to the SQELT online survey reported in Intellectual Output O10 (see
SQELT-MIO 2020).
Strategy matrix of Table 6:
Strengths-based strategies (S/W, S/O, S/T): n/a for all strengths 1)-3).
Other measures to overcome weaknesses (M/W): 1) complete the collected and used, HEI-specific PI set;
2) evaluate the performance monitoring needs of the HEI and revise the existing (small) PI set
accordingly; 3) implement quality assurance of data acquisition and stratify the methodology of PI
collection and processing.
Other measures to take advantage of opportunities (M/O): 1) complete the PI set towards a
close-to-complete, HEI-specific set; 2) introduce internal organisational PIs; 3) foster the development
of a national student survey.
Other measures to avoid threats (M/T): 1) evaluate the (existing) PI set for adequate
representation/grasp of HEI performance; 2) complement the set of quantitative PIs with a set of
qualitative (complex) PIs.
Ethics of PDGM
The fourth benchlearning area of the SQELT case study is the ethics of PDGM, the core theme of which is
the protection of performance data collected about individuals and institutions against unauthorised use
and misuse by third parties. Such data include, for example, data from higher education analytics such as
Learning Analytics, academic analytics, student analytics etc.
In the European Union (EU), and thus in the European Higher Education Area (EHEA), the EU General Data
Protection Regulation (GDPR; EC 2016) on the protection of natural persons with regard to the processing
of personal data and on the free movement of such data was passed a few years ago and has been in force
in all EU member states since 25 May 2018. By further national regulations, however, the EU member states
reconcile the right to the protection of personal data under the GDPR with the right to freedom of
expression and information (Art. 85 and Art. 86 of the GDPR). Otherwise, the member states are generally
not allowed to weaken or strengthen the GDPR regulations by stipulated national regulations. However, the
GDPR contains various opening clauses that enable the individual member states to regulate certain
aspects of data protection at national level.17
According to the GDPR, data collection must be justified under one of the lawful bases for data
processing: the collected data meet a legal obligation (e.g., date of birth); they are in the university's
legitimate interests (e.g., prior qualifications); or they are required to fulfil contractual obligations
with the student (e.g., modules studied, grades and use of IT facilities). In consequence, students' and
other stakeholders' consent is not needed for much of the data associated with Learning Analytics because
the data are already regularly and justifiably collected. Consent by stakeholders or students must be
obtained, however, for using special category data (e.g., ethnic origin; time spent on studying; physical
and virtual activities at the HEI), and when interventions based on their data analytics are to be
undertaken with individual stakeholders or students.18
If obtaining consent is required, the following GDPR rules must be met (EC 2016; cf. Appendix 5): consent
requests shall not be intermingled with other terms and conditions; stakeholders must be informed clearly
and in detail about what they are consenting to; information must be given to stakeholders about any
third-party data controllers who might rely on their consent; stakeholders need to be made aware of the
consequences of either providing or withholding their consent; and data collection must require clear,
positive action from stakeholders (for example, the use of pre-ticked boxes is not acceptable).
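The consent rules just listed can be expressed as machine-checkable conditions. The following sketch (the field names are illustrative assumptions, not taken from the GDPR text) flags a consent request that violates any of them:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ConsentRequest:
    standalone: bool                 # not intermingled with other T&Cs
    purpose_explained: bool          # clear, detailed information given
    third_party_controllers_named: bool
    consequences_explained: bool     # of providing or withholding consent
    pre_ticked: bool                 # pre-ticked boxes are not acceptable

def violations(req: ConsentRequest) -> List[str]:
    """Return the consent rules (as listed above) that the request breaks."""
    problems = []
    if not req.standalone:
        problems.append("intermingled with other terms and conditions")
    if not req.purpose_explained:
        problems.append("purpose not clearly explained")
    if not req.third_party_controllers_named:
        problems.append("third-party data controllers not named")
    if not req.consequences_explained:
        problems.append("consequences of (not) consenting not explained")
    if req.pre_ticked:
        problems.append("pre-ticked box instead of clear positive action")
    return problems

bad = ConsentRequest(True, True, True, True, pre_ticked=True)
print(violations(bad))  # → ['pre-ticked box instead of clear positive action']
```

An empty list means the request satisfies all of the listed conditions; any such check is of course no substitute for legal review of the actual consent workflow.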
In the SQELT Erasmus+ Strategic Partnership, the following weaknesses and threats of the ethics of PDGM
were identified at certain partner universities: 1) Data protection and privacy concerns related to PDGM
were not or not adequately recognised; for example, stakeholders reported that, currently, there was "low
or no sensibility for ethical issues" around (performance) data management (weakness). 2) At other places,
however, some stakeholders saw data protection and privacy concerns (e.g., regarding teacher evaluations,
students' satisfaction, students' study success) as undesirable limitations to the accessibility of
performance data and information (threat). In a nutshell, these two perspectives present the ubiquitous
tension between the possible benefits of (performance) data analytics on the one hand and its possible
threats to (individual) self-determination on the other hand.
A general recommendation that addresses such problems is to develop and install a reliable ethical
framework including the systematic and ongoing reflection of ethical issues of PDGM. This is essential to
establish a vivid PDGM culture based on a sufficiently widespread understanding of (performance) data
ownership and related interpretation capabilities and evidence-informed decision-making. The SQELT
Ethical Code of Practice for (Performance) Data Management (see Appendix 5) can be understood as an
exemplary proposal following this recommendation.19
17 This is the reason why the GDPR is also referred to as a hybrid between directive and regulation.
18 For further details on these issues see also Appendix 5 (and Appendix 4).
19 A more detailed abstract strategic SWOT analysis is not given for the benchlearning area of Ethics of PDGM for several
reasons: the possible strengths and opportunities are more or less pre-determined by the GDPR and subordinate ethical and
legal regulations, while the typical weaknesses and threats identified in the SQELT case study are the few mentioned above.
Nevertheless, in this context it would be interesting to analyse how different European countries and national higher education
systems interpret the GDPR.
Conclusions
First of all, it is worth noticing that the ubiquitous and, at the same time, notorious success factors of
QM and evidence-informed organisational development (EIOD) in HEIs are non-trivially (also) relevant for a
successful implementation and further development of PDGM in HEIs (and in any other complex social
organisation as well). These success factors are corroborated by the above-discussed strategic SWOT
analyses and comprise, but are not limited to, the following issues (also cf. Leiber 2019b, 332ff.): to
foster and disseminate personal characteristics for ethical behaviour of participating stakeholders,
including self-competences and social competences (cf. Leiber 2016); to commit the leadership of the
organisation; to assure data and reporting quality, including appropriate design, tested validity,
reliability and communicated purposes of data collection and use; to involve relevant stakeholders in all
PDGM development and application phases (participatory approach); to close the quality (Deming) cycles,
thereby ensuring adequate quality enhancement and EIOD of PDGM; and to invest sufficient resources (time,
finances, competences, human workforce).
In the light of these success factors, benchlearning and strategic SWOT analyses related to the SQELT
university project partners' actual PDGM systems or approaches exhibited the need for several EIOD
initiatives to further develop, improve and refine their PDGM models. In particular, the sample PDGM
models showed the following organisational transformation needs:
• Procedures of data processing and communication, software platforms and responsible bodies for
collecting and interpreting PIs must be (further) developed to improve the quality as well as the
usability and accessibility of data and information. In particular, there was a need for better organised
PDGM systems that avoid multiple island solutions and unnecessary resource consumption.
• The empirical performance monitoring needs of HEIs must be balanced with various opposing policy
demands originating from traditional disciplinary attitudes (e.g. rejection of PIs, particularly
quantitative PIs) as well as from ministerial education politics (e.g. focus on quantitative and economic
indicators and reduction to very small PI sets that are easily manageable though less informative).20
• Processes, organisational bodies and human resources for fostering participative responsibility for
PDGM must be established, including more efficient decision-making of collegial bodies.
• Educational strategies (mission, values, vision) must be established, including the prospects and
ambiguities of PDGM and data analytics (e.g. Learning Analytics).
Furthermore, based on the stocktaking and benchlearning insights of the SQELT project partners (SQELT-MIO
2020), including stakeholder focus group surveys and discussions, the following critical success factors
of PDGM could be identified, which may provide guidance for other HEIs that engage in developing their
PDGM:
• Provide justifiable belief in the success promises of PDGM – surveyed stakeholders are often unsure
about the possibility to fulfil all promises of PDGM, particularly Learning Analytics.
• Leadership engagement is a core driver of PDGM development and implementation – surveyed stakeholders
sometimes diagnose insufficient engagement of leaders in PDGM.
• Reflected understanding and practice of PD(G)M based on adequate/sufficient and self-determined,
HEI-adequate PI sets is also of basic importance – surveyed stakeholders see various deficits in their
HEIs' PI sets.
• Reflected and applied PDGM ethics is indispensable – this is seen as a very important issue by most
surveyed stakeholders (while the willingness to practise this theoretical insight does not always seem to
keep pace with the claimed importance).
• Finally, an adequate financial climate is necessary – underfinanced L&T is often experienced as one of
the obstacles to implementing appealing PDGM solutions.
20 Analysing differences in higher education politics between SQELT partner countries is an interesting subject that could
deepen the understanding of PDGM. However, it was not a focus of the SQELT project, particularly because of its comprehensive
toolbox approach, which should render SQELT results applicable in any European (and possibly also non-European) country.
Open Questions and Limitations of the SQELT Case Study
As usual, there are specific limitations to the SQELT project and case study on which this Guideline is
based: the SQELT project was limited in time (duration of 36 months) and funding (Erasmus+ co-funding);
particularly, the project's time window was too narrow to implement evidence-informed organisational
development (EIOD) related to PDGM in the case study universities: the core activities of the last three
main steps of the benchlearning model – Integration, Action, Maturity – (see Figure 2) can only be
addressed after the lifetime of the SQELT project, which ended in November 2020. In particular,
effectively tracing SQELT-informed interventions and their effects (follow-up) is not possible within the
36 months of project lifetime. Accordingly, due to these time restrictions in combination with the
complexity of the intended outcomes, the SQELT impact analysis, which was planned as a before-after
comparison, was replaced by two explorative assessments of the then available PI sets by various relevant
stakeholder groups from the partnership universities and from other universities, QM units and experts.
Another limitation is set by the fluid stakeholder participation in HEIs, particularly of students.
In addition, there are well-known limitations of SWOT analysis (cf. Leiber et al. 2018, pp. 353-354) and
limitations of benchlearning which must be accounted for. As for the latter, for example, there are
dangers of viewing benchlearning as a one-time project; of focusing on quantitative output data; of
misunderstanding it as self-mirroring; of emulating or mimicking competitors; and of fostering a rat race.
Further restrictions to effective benchlearning are an organisation's lack of readiness and flexibility to
implement change; lack of transparency and communication; and fear of detecting and exposing weaknesses
(and threats). Whether these limitations occur in the context of the SQELT case study project can only be
determined later (after the end of the project).
Another contingent limitation of the SQELT case study and therefore of this Guideline lies in the fact that the
benchlearning area of ethics of PDGM requires a deeper investigation. For example, it could be interesting
to analyse how different European countries and national higher education systems interpret, handle and
adopt the GDPR (EC 2016). On the level of the six HEIs of the case study from Aveiro, Birmingham, Ghent,
Kraków, Krems and Milan, however, this analysis was carried out and documented in Intellectual Output O8
of the SQELT project and Appendix 5 of this Guideline.
References and Literature Overview
Accreditation Council (2013) Rules for the Accreditation of Study Programmes and for System Accreditation
[in German]. Decision of the Accreditation Council of 08 December 2009, last modified on 20 February 2013,
Drs 20/2013 (Bonn, Accreditation Council). Available at
http://www.akkreditierungsrat.de/fileadmin/Seiteninhalte/AR/Beschluesse/AR_Regeln_Studiengaenge_aktuell.pdf
(accessed 11 September 2020).
Adam, S. (2013) The central role of learning outcomes in the completion of the European Higher Education
Area 2013-2020. Journal of the European Higher Education Area, 2013-2, 1-35.
Åkerlind, G.S. (2004) A new dimension to understanding university teaching. Teaching in Higher Education,
9(3), 363-375.
Albareda-Tiana, S., Vidal-Raméntol, S. & Fernández-Morilla, M. (2018) Implementing the sustainable
development goals at University level. International Journal of Sustainability in Higher Education, 19(3),
473-497.
Alghamdi, M. & Alanizan, S. (2018) Performance Indicators, motivations and barriers in online distance
courses: a case study at Arab Open University. International Journal of Teaching and Education, 6(2),
46-60. Available at https://pdfs.semanticscholar.org/0627/1a17fd86a2301005cb763ba56c572cb63168.pdf
(accessed 18 September 2020).
Alhassan, I., Sammon, D. & Daly, M. (2018) Data governance activities: a comparison between scientific
and practice-oriented literature. Journal of Enterprise Information Management, 31(2), 300-316.
Ali, L., Hatala, M., Gasevic, D. & Jovanovic, J. (2012) A qualitative evaluation of evolution of a
learning analytics tool. Computers and Education, 58(1), 470-489.
Anderson, L.W. & Krathwohl, D.R. et al. (eds.) (2013) A Taxonomy for Learning, Teaching, and Assessing: A
Revision of Bloom's Taxonomy of Educational Objectives. Abridged Edition. Boston (MA): Pearson Higher
Education.
Arnold, K.E. (2010) Signals: applying academic analytics. EDUCAUSE Quarterly, 33(1). Available at
https://er.educause.edu/articles/2010/3/signals-applying-academic-analytics (accessed 11 September
2020).
Association of Governing Boards (AGB) (2010) The most common performance indicators for institutions
and their boards. Washington, D.C.: AGB. Available at https://agb.org/trusteeship-article/the-most-common-
performance-indicators-for-institutions-and-their-boards/ (accessed 11 September 2020).
Ball, R. & Halwachi, J. (1987) Performance indicators in higher education. Higher Education, 16, 393-405.
Behle+0DKHU60HDVXULQJ³7HDFKLQJ([FHOOHQFH´DQG³/HDUQLQJ*DLQ´LQWKH8QLWHG.LQJGRP
Beiträge zur Hochschulforschung, 40(2), 48-67.
Bellina, L., Tegeler, M., Müller-Christ, G. & Potthast, T. (2018) Education for Sustainable Development in
Higher Education Teaching [in German]. BMBF Project "Nachhaltigkeit an Hochschulen: entwickeln – vernetzen
– berichten (HOCH-N)". Bremen und Tübingen. Available at
https://www.hochn.uni-hamburg.de/-downloads/handlungsfelder/lehre/hoch-n-leitfaden-bne-in-der-hochschullehre.pdf
(accessed 11 September 2020).
%HQQHWW5.DQH66WXGHQWV¶LQWHUSUHWDWLRQVRIWKHPHDQLQJVRITXHVWLRQQDLUHLWHPVLQWKH1D
tional Student Survey. Quality in Higher Education, 20(2), 129-164.
Berk, R.A. (2005) Survey of 12 strategies to measure teaching effectiveness. International Journal of
Teaching and Learning in Higher Education, 17(1), 48-62.
Biesenbender, S. & Hornbostel, S. (2016a) The research core dataset for the German science system:
challenges, processes and principles of a contested standardization project. Scientometrics, 106, 837-847.
Biesenbender, S. & Hornbostel, S. (2016b) The research core dataset for the German science system: de-
veloping standards for an integrated management of research information. Scientometrics, 108, 401-412.
Blair, E. & Noel, K.V. (2013) Improving higher education practice through student evaluation systems: is the
student voice being heard? Assessment & Evaluation in Higher Education, 39(7), 879-894.
Blank, R.K. (1993) Developing a system of education indicators: selecting, implementing, and reporting
indicators. Educational Evaluation and Policy Analysis, 15(1), 65-80.
Blömeke, S., Zlatkin-Troitschanskaia, O., Kuhn, C. & Fege, J. (2013) Modelling and measuring competencies
in higher education: tasks and challenges. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn & J. Fege
(eds.), Modeling and Measuring Competencies in Higher Education. Tasks and Challenges (pp. 1-10).
Rotterdam: Sense Publishers.
Bocconi, S., Kampylis, P. & Punie, Y. (2012) Innovative Learning: Key Elements for Developing Creative
Classrooms in Europe. Luxembourg: Publications Office of the European Union.
Borden, M.H. & Coates, H. (2017) Learning Analytics as a counterpart to surveys of student experience.
New Directions for Higher Education, 179, 89-102.
Brigandt, I. (2010) Scientific reasoning is material inference: combining confirmation, discovery, and
explanation. International Studies in the Philosophy of Science, 24(1), 31-43.
Buyarski, C., Murray, J. & Torstrick, R. (2017) Learning Analytics across a statewide system. New Direc-
tions for Higher Education, 179, 33-42.
Caeiro, S., Sandoval Hamón, L.A., Martins, R. & Bayas Aldaz, C.E. (2020) Sustainability assessment and
benchmarking in higher education institutions – A critical reflection. Sustainability, 12, 543 (30 p.).
Camilleri, M.A. & Camilleri, A.C. (2018) The performance management and appraisal in higher education. In
C. Cooper (2018) Driving Productivity in Uncertain and Challenging Times. (University of the West of Eng-
land, 5th September). British Academy of Management, UK.
Camp, R.C. (1994) Benchmarking [in German]. München: Hanser.
Campbell, D.F.J. & Carayannis, E.G. (2013) Epistemic Governance in Higher Education. Quality Enhance-
ment of Universities for Development. (SpringerBriefs in Business.). New York, NY: Springer. Available at
http://www.springer.com/business+%26+management/organization/book/978-1-4614-4417-6 (accessed 18
September 2020).
Campbell, D.F.J. & Pantelić (2020) Processes of learning and processes of innovation. In E.G. Carayannis (ed.), Encyclopedia of Creativity, Invention, Innovation and Entrepreneurship (Living Edition) (pp. 1-6). New York, NY: Springer. Available at https://link.springer.com/referenceworkentry/10.1007/978-1-4614-6616-1_200098-1 (accessed 18 September 2020).
Campbell, J.P., DeBlois, P. & Oblinger, D. (2007) Academic analytics. A new tool for a new era. Available at
https://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era (accessed 11 Septem-
ber 2020).
Cathcart, A., Greer, D. & Neale, L. (2014) Learner-focused evaluation cycles: facilitating learning using
feedforward, concurrent and feedback evaluation. Assessment and Evaluation in Higher Education, 39(7),
790-802.
Cave, M., Hanney, S., Henkel, M., & Kogan, M. (1997) The Use of Performance Indicators in Higher Educa-
tion: The Challenge of the Quality Movement. London: Jessica Kingsley.
Centre for Higher Education Development (CHE). (2018) U Multirank Indicator Book 2018. Available at
https://www.umultirank.org/export/sites/default/press-media/documents/Indicator-Book-2018.pdf (accessed
11 September 2020).
Center for Postsecondary Research (2018) Engagement Insights. Survey Findings on the Quality of Under-
graduate Education. National Survey of Student Engagement (NSSE), Annual Results 2018. Bloomington:
Center for Postsecondary Research (CPR). Available at https://files.eric.ed.gov/fulltext/ED594729.pdf (ac-
cessed 11 September 2020).
Chahar, B. & Hatwal, V. (2018) A study of performance management system in higher education institution
with special reference to academicians. Journal of Emerging Technologies and Innovative Research
(JETIR), 5(6), 350-353.
Challice, G., Compton, S. & Vickers, N. (2018) 2017 Student Experience Survey. Methodological Report.
Melbourne: Social Research Centre. Available at https://www.qilt.edu.au/docs/default-source/gos-re-
ports/2017/2017-ses-methodology-report.pdf?sfvrsn=3048e33c_2 (accessed 11 September 2020).
Chalmers, D. (2008) Indicators of University Teaching and Learning Quality. Strawberry Hills: Australian Learning and Teaching Council. Available at https://www.researchgate.net/publica-
tion/265248222_INDICATORS_OF_UNIVERSITY_TEACHING_AND_LEARNING_QUALITY (accessed 11
September 2020).
Cheng, M. (2016) Quality in Higher Education. Developing a Virtue of Professional Practice. Rotterdam:
Sense Publishers.
CISCO (2016) Digitising Higher Education. To Enhance Experiences and Improve Outcomes. San Jose:
CISCO.
Coccoli, M., Guercio, A., Maresca, P. & Stanganelli, L. (2014) Smarter universities: a vision for the fast-
changing digital era. Journal of Visual Languages and Computing, 25, 1003-1011.
Corrin, L., Kennedy, G., French, S., Buckingham Shum, S., Kitto, K., Pardo, A., West, D., Mirriahi, N. & Colvin, C. (2019) The Ethics of Learning Analytics in Australian Higher Education. A discussion paper. Available at https://www.siyaphumelela.org.za/documents/5cc17266a030a.pdf (accessed 4 July 2020).
Cunha, J.M. & Miller, T. (2014) Measuring value-added in higher education: possibilities and limitations in
the use of administrative data. Economics of Education Review, 42, 64-77. Available at
http://dx.doi.org/10.1016/j.econedurev.2014.06.001 (accessed 11 September 2020).
Daniel, B. (2015) Big Data and Analytics in higher education: opportunities and challenges. British Journal
of Educational Technology, 46(5), 904-920.
DDCMS [Department for Digital, Culture, Media and Sport] (2018) Data Ethics Framework. London: United
Kingdom. Available at https://www.gov.uk/government/publications/data-ethics-framework/data-ethics-
framework (accessed 4 July 2020).
De Freitas, S., Gibson, D., du Plessis, C., Halloran, P., Williams, E., Ambrose, M., Dunwell, I. & Arnab, S.
(2015) Foundations of dynamic learning analytics: using university student data to increase retention. British
Journal of Educational Technology, 46(6), 1175-1188.
DeGEval [Society for Evaluation] (2016) Standards for Evaluation [in German]. Mainz: DeGEval. Available
at https://www.degeval.org/fileadmin/Publikationen/Kurzversion_der_Standards_fuer_Evaluation_-_Revi-
sion_2016.pdf (accessed 11 September 2020).
Douglas Williams, J. (1995) The challenge of developing new educational indicators. Educational Evalua-
tion and Policy Analysis, 17(1), 113-131.
Drachsler, H. & Greller, W. (2016) Privacy and analytics: it's a DELICATE issue – a checklist for trusted
learning analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge
(ACM), 89-98.
Du Toit-Brits, C. (2018) Towards a transformative and holistic continuing self-directed learning theory.
South African Journal of Higher Education, 32(4), 51-65.
Eaton, S. (2018) Innovating integrity in online learning contexts. Journal for Research and Practice in Col-
lege Teaching, 3(2), 98-103.
EC [EU Commission] (2016) General Data Protection Regulation (GDPR). As of 4 May 2016; entry into
force 25 May 2018. Available at https://gdpr-info.eu/ and https://eur-lex.europa.eu/legal-con-
tent/EN/TXT/PDF/?uri=CELEX:32016R0679 (accessed 17 September 2020).
ECPDM [SQELT Ethical Code of Practice for (Performance) Data Management] (2020) Ethical Code of
Practice for (Performance) Data Management. Available at https://www.evalag.de/en/research/sqelt/intellec-
tual-outputs/ (accessible after end of SQELT project).
Elouazizi, N. (2014) Critical factors in data governance for Learning Analytics. Journal of Learning Analytics,
1(3), 211-222.
Elton, L. (2004) Goodhart's Law and performance indicators in higher education. Evaluation and Research
in Education, 18 (1-2), 120-128.
ENQA (2015) Standards and Guidelines for Quality Assurance in the European Higher Education Area.
Brussels: European Association for Quality Assurance in Higher Education (ENQA).
Fernie, S. & Thorpe, A. (2007) Exploring change in construction: supply chain management. Engineering,
Construction and Architectural Management, 14(4), 319-333.
Findler, F., Schoenherr, N., Lozano, R. & Stacherl, B. (2019) Assessing the impacts of higher education in-
stitutions on sustainable development. An analysis of tools and indicators. Sustainability, 11, 59, 1-19.
Freytag, P.V. & Hollensen, S. (2001) The process of benchmarking, benchlearning and benchaction. The
TQM Magazine, 13(1), 25-33.
Fritz, J. (2011) Classroom walls that talk: Using online course activity data of successful students to raise
self-awareness of underperforming peers. Internet and Higher Education, 14(2), 89-97.
Fritz, J. (2017) Using analytics to nudge student responsibility for learning. New Directions for Higher Edu-
cation, 179, 65-75.
Fuentes, R., Fuster, B. & Lillo-Bañuls, A. (2016) A three-stage DEA model to evaluate learning-teaching
technical efficiency: key performance indicators and contextual variables. Expert Systems with Applications,
48, 89-99.
García-Aracil, A. & Palomares-Montero, D. (2010) Examining benchmark indicator systems for the evaluation of higher education institutions. Higher Education, 60(2), 217-234.
Gibbs, A., Kennedy, D. & Vickers, A. (2012) Learning outcomes, degree profiles, Tuning project and com-
petences. Journal of the European Higher Education Area, 2012-1, 71-88.
Gover, A., Loukkola, T. & Sursock, A. (2015) ESG Part 1: Are Universities Ready? Brussels: EUA.
Greller, W. & Drachsler, H. (2012) Translating learning into numbers: a generic framework for learning ana-
lytics. Educational Technology & Society, 15(3), 42-57.
Guerrero, M., Cunningham, J.A. & Urbano, D. (2015) Economic impact of entrepreneurial universities' activities: An exploratory study of the United Kingdom. Research Policy, 44, 748-764.
Gunn, A. (2018) Metrics and methodologies for measuring teaching quality in higher education: developing
the Teaching Excellence Framework (TEF). Educational Review, 70(2), 129-148.
Harrison, H., Birks, M., Franklin, R. & Mills, J. (2017) Case study research: Foundations and methodological
orientations. Forum: Qualitative Social Research, 18(1): Art. 19. Available at http://www.qualitative-re-
search.net/index.php/fqs/article/view/2655/4079 (accessed 2 October 2020).
Harvey, L. (2003) Student feedback. Quality in Higher Education, 9(1), 3-20.
HEC [Higher Education Commission] (2016) From Bricks to Clicks. The Potential of Data and Analytics in
Higher Education. London: Policy Connect.
Hickel, J. (2019) Is it possible to achieve a good life for all within planetary boundaries? Third World Quar-
terly, 40:1, 18-35.
Hickel, J. & Kallis, G. (2019) Is green growth possible? New Political Economy, published online 17 April 2019 (18 pages).
Higher Education Authority (2018a) Higher Education System Performance Framework. Dublin: Higher Ed-
ucation Authority (HEA). Available at https://www.education.ie/en/Publications/Education-Reports/higher-
education-system-performance-framework-2018-2020.pdf (accessed 11 September 2020).
Higher Education Authority (2018b) The Irish Survey of Student Engagement (ISSE). Results from 2018.
Dublin: Higher Education Authority (HEA)/ Irish Universities Association/ Technological Higher Education
Association/ Union of Students in Ireland. Available at http://hea.ie/assets/uploads/2018/11/ISSE-Report-
2018-TEXT-Tag-A.pdf (accessed 11 September 2020).
+5.>*HUPDQ5HFWRUV¶&RQIHUHQFH@Higher Education in a Digital Age: Rethinking Information Com-
petency ± Redirecting Processes. %RQQ*HUPDQ5HFWRUV¶&RQIHUHQFH
Huertas, E., Biscan, I., Ejsing, C., Kerber, L., Kozlowska, L., Marcos Ortega, S., Lauri, L., Risse, M.,
Schörg, K. & Seppmann, G. (2018) Considerations for Quality Assurance of E-Learning Provision. Report
from the ENQA Working Group VIII on Quality Assurance and E-Learning. Brussels: ENQA. Available at
https://enqa.eu/indirme/Considerations%20for%20QA%20of%20e-learning%20provision.pdf (accessed 17
September 2020).
,&2,QIRUPDWLRQ&RPPLVVLRQHU¶V2IILFHGuide to the General Data Protection Regulation (GDPR).
Available at https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protec-
tion-regulation-gdpr/ (accessed 4 July 2020).
IUSE [Indiana University School of Education] (2018) NSSE Engagement Indicators. Indiana University
Bloomington. Available at https://nsse.indiana.edu/nsse/survey-instruments/engagement-indicators.html
(accessed 11 September 2020).
Johnson, J.A. (2017) Ethics and justice in Learning Analytics. New Directions for Higher Education, 179, 77-
87.
Johnson, L., Adams Becker, S., Estrada, V. & Freeman, A. (2014) NMC Horizon Report: 2014 Higher Education Edition. Austin: The New Media Consortium.
Johnson, L., Smith, R., Willis, H., Levine, A. & Haywood, K. (2011) The 2011 Horizon Report. Austin,
Texas: The New Media Consortium.
Kells, H.R. (1992) Performance Indicators for Higher Education: A Critical Review. Education and Employment Division, Population and Human Resources Department. Washington, DC: The World Bank.
Keshavarz, M. (2011) Measuring course learning outcomes. Journal of Learning Design, 4(4), 1-9.
Khatri, V. & Brown, C.V. (2010) Designing data governance. Communications of the ACM, 53(1), 148-152.
Kim, C.K. & Chang, H.-C. (2019) The current state of data governance in higher education. Proceedings of
the Association for Information Science and Technology (ASIS&T), 55(1), 198-206. Available at:
https://onlinelibrary.wiley.com/doi/epdf/10.1002/pra2.2018.14505501022 (accessed 18 April 2019).
Kinash, S., Naidu, V., Knight, D., Judd, M.-M., Sid Nair, C., Booth, S., Fleming, J., Santhanam, E., Tucker,
B. & Tulloch, M. (2015) Student feedback: a learning and teaching performance indicator. Quality Assur-
ance in Education, 23(4), 410-428.
Kleimann, B. (2019) (German) Universities as multiple hybrid organisations. Higher Education, 77, 1085-
1102. Available at https://doi.org/10.1007/s10734-018-0321-7 (accessed 4 July 2020).
Klemenčič, M. (2015) What is student agency? An ontological exploration in the context of research on student engagement. In M. Klemenčič, S. Bergan & R. Primožič (eds.), Student Engagement in Europe: Society, Higher Education and Student Governance (pp. 11-29). Strasbourg: Council of Europe Publishing.
Komatsu, H. & Rappleye, J. (2018) Will SDG4 achieve environmental sustainability? CASGE Working Paper No. 4, 2018. Available at http://dx.doi.org/10.14507/casge4.2018 (accessed 23 March 2020).
Krämer, J. & Müller-Naevecke, C. (2014) Compendium Competences ± Formulate Competence Goals for
Higher Education [in German]. Münster: University of Applied Sciences Münster.
Krakower, J.Y. (1985) Assessing Organizational Effectiveness: Considerations and Procedures. Boulder,
CO: NCHEMS.
Kruse, A. & Pongsajapan, R. (2012) Student-centered Learning Analytics. CNDLS Thought Papers. Availa-
ble at https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.475.106&rep=rep1&type=pdf (accessed
11 September 2020).
Lane, J.E. (2014) Building a Smarter University: Big Data, Innovation, and Analytics. Albany: SUNY Press.
Laredo, P. (2007) Revisiting the Third Mission of universities: Toward a renewed categorization of university
activities. Higher Education Policy, 20(4), 441-456.
SQELT ± ERASMUS+ Project 2017-20 ± Intellectual Output O11: PDGM in HE L&T: A Guideline 23/95
Ledermüller, K. & Fallmann, I. (2017) Predicting learning success in online learning environments: self-reg-
ulated learning, prior knowledge and repetition. Zeitschrift für Hochschulentwicklung (ZFHE), 12(1), 79-99.
Leiber, T. (2016) Persönlichkeitsentwicklung als elementares Bildungsziel. Methodische Optionen der Um-
setzung und Bewertung im Hochschulbereich. die hochschullehre. Interdisziplinäre Zeitschrift für Studium
und Lehre, 2, 21 S. Available at http://www.hochschullehre.org/wp-content/files/diehochschul-
lehre_2016_leiber.pdf (accessed 11 September 2020).
Leiber, T. (2018) Impact evaluation of quality management in higher education: a contribution to sustainable
quality development in knowledge societies. European Journal of Higher Education, 8 (3), 235-248.
Leiber, T. (2019a) A general theory of learning and teaching and a related comprehensive set of perfor-
mance indicators for higher education institutions. Quality in Higher Education, 25(1), 76-97.
Leiber, T. (2019b) Organizational change and development through quality management in higher education
institutions: theory, practice, and recommendations for change agents. In R.G. Hamlin, A.D. Ellinger & J.
Jones (eds.), Evidence-Based Initiatives for Organizational Change and Development (pp. 316-341). Hershey
(PA): IGI Global.
Leiber, T. (2020) Performance data governance and management in learning and teaching: Basic elements
and desiderata in the light of a European case study. Documentation of the 2019 Annual Meeting of the
German Society for Higher Education Research (GfHf). Available at www.gfhf2019.de (accessed 30 Novem-
ber 2020).
Leiber, T. & Meyer, W. (2019) Ethics in evaluation and ethical aspects of the standards for evaluation [in
German]. In J.U. Hense, W. Böttcher, M. Kalman & W. Meyer (eds.), Evaluation: Standards in different
fields of action. Uniform quality standards despite heterogeneous practice? [in German] (pp. 87-104). Mün-
ster: Waxmann.
Leiber, T., Stensaker, B. & Harvey, L. (2015) Impact evaluation of quality assurance in higher education:
methodology and causal designs. Quality in Higher Education, 21 (3), 288-311.
Leiber, T., Stensaker, B. & Harvey, L. (2018) Bridging theory and practice of impact evaluation of quality
management in higher education institutions: a SWOT analysis. European Journal of Higher Education,
8(3), 351-365.
Liebowitz, J. (2017) Thoughts on recent trends and future research perspectives in Big Data and Analytics
in higher education. In B. K. Daniel (ed.), Big Data and Learning Analytics in Higher Education: Current
Theory and Practice (pp. 7-18). Cham: Springer.
Liu, D.Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A.J. (2017) Data-driven personalization of stu-
dent learning support in higher education. In A. Peña-Ayala (ed.), Learning Analytics: Fundaments, Applica-
tions, and Trends (pp. 143-169). Cham: Springer.
Lockyer, L., Heathcote, E. & Dawson, S. (2013) Informing pedagogical action: aligning learning analytics
with learning design. American Behavioral Scientist, 57(10), 1439-1459.
Lodge, J.M. & Bonsanquet, A. (2013) Evaluating quality learning in higher education: re-examining the evi-
dence. Quality in Higher Education, 20(1), 3-23.
Lonn, S., McKay, A. & Teasley, S.D. (2017) Cultivating institutional capacities for Learning Analytics. New
Directions for Higher Education, 179, 65-75.
Macfadyen/3'DZVRQ60LQLQJ/06GDWDWRGHYHORSDQµHDUO\ZDUQLQJV\VWHP¶IRUHGXFDWRUV
A proof of concept. Computers and Education, 54(2), 588-599.
Macfadyen, L.P. & Dawson, S. (2012) Numbers are not enough. Why e-learning analytics failed to inform
an institutional strategic plan. Educational Technology and Society, 15(3), 149-163.
Malikowski, S., Thompson, M. & Theis, J. (2007). A model for research into course management systems:
bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.
Martin, M. & Parikh, S. (2017) Quality Management in Higher Education: Developments and Drivers. Results from an International Survey. New trends in higher education. Paris: UNESCO. Available at
http://unesdoc.unesco.org/images/0026/002602/260226E.pdf (accessed 11 September 2020).
Mawere, G.E. (2018) Perceptions of students and lecturers on online module and lecturer evaluation at
Great Zimbabwe University. International Journal of Research in IT and Management, 8(12), 6-15.
Mayring, P. (2000) Qualitative Content Analysis [28 paragraphs]. Forum Qualitative Sozialforschung / Fo-
rum: Qualitative Social Research 1(2), Art. 20. Available at http://nbn-resolving.de/urn:nbn:de:0114-
fqs0002204 (accessed 11 September 2020).
McGonigal, K. (2005) Teaching for transformation: from learning theory to teaching strategies. Newsletter
on Teaching of the Center for Teaching and Learning of Stanford University, 14(2), 4 p.
Melo, A.I. & Figueiredo, H. (2020) Performance management and diversity in higher education: an introduc-
tion. Tertiary Education and Management, 26, 247-254.
Menon, M.E., Terkla, D.G. & Gibbs, P. (eds.) (2014) Using Data to Improve Higher Education. Research,
Policy and Practice. Rotterdam: Sense Publishers.
Molas-Gallart, J., Castro-Martínez, E. (2007) Ambiguity and conflict in the development of 'Third Mission'
indicators. Research Evaluation, 16(4), 321-330.
Morais, C., Alves, P. & Miranda, L. (2017) Performance indicators in higher education with learning analyt-
ics. Journal on Advances in Theoretical and Applied Informatics, 3(2), 12-17.
MUNZ [Massey University of New Zealand] (2015) Massey University Policy Guide. Data Management Pol-
icy. Available at http://www.massey.ac.nz/massey/fms/PolicyGuide/Docu-
ments/ITS/Data%20Management%20Policy.pdf?403D1D0BFF1107EDEDF5C487687BB868 (accessed 18 April 2019).
Neves, J. & Hillman, N. (2018) 2018 Student Academic Experience Survey. Oxford: AdvanceHE/Higher Ed-
ucation Policy Institute (hepi). Available at https://www.hepi.ac.uk/wp-content/uploads/2018/06/STRICTLY-
EMBARGOED-UNTIL-THURSDAY-7-JUNE-2018-Student-Academic-Experience-Survey-report-2018.pdf
(accessed 11 September 2020).
NFES [National Forum on Education Statistics] (2010) The Forum Guide to Data Ethics. Available at
https://nces.ed.gov/pubs2010/2010801.pdf (accessed 4 July 2020).
Norris, D., Baer, L., Leonard, J., Pugliese, L. & Lefrere, P. (2008) Action analytics: measuring and improving
performance that matters in higher education. EDUCAUSE Review, 43(1), 42-67.
OECD (2012) A Guiding Framework for Entrepreneurial Universities. Available at
https://www.oecd.org/site/cfecpr/EC-OECD%20Entrepreneurial%20Universities%20Framework.pdf (ac-
cessed 16 September 2020).
OECD-AHELO (2013) AHELO – Feasibility Study Report, 3 vols. Paris: OECD. Available at
https://www.oecd.org/education/skills-beyond-school/AHELOFSReportVolume3.pdf (accessed 11 Septem-
ber 2020).
Ong, V. K. (2016) Business intelligence and Big Data Analytics for higher education: cases from UK higher
education institutions. Information Engineering Express, 2(1), 65-75.
Open University (2014) Policy on Ethical Use of Student Data for Learning Analytics. Available at
http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-
policy (accessed 11 September 2020).
Otto, B. (2011) A morphology of the organisation of data governance. ECIS 2011 Proceedings. Paper 272.
Available at http://aisel.aisnet.org/ecis2011/272 (accessed 9 May 2019).
Pistilli, M.D. (2017) Learner analytics and student success interventions. New Directions for Higher Educa-
tion, 179, 43-52.
Pollard, E., Williams, M., Williams, J., Bertram, C., Buzzeo, J., Drever, E., Griggs, J. & Coutinho, S. (2013)
How Should We Measure Higher Education? A Fundamental Review of the Performance Indicators. Part One: The Synthesis Report. Brighton: Institute for Employment Studies/HEFCE. Available at
https://dera.ioe.ac.uk/18967/2/2013_ukpireview2.pdf (accessed 11 September 2020).
Pollitt, C. (1990) Measuring university performance: Never mind the quality, never mind the width? Higher
Education Quarterly, 44(1), 60-81.
Prinsloo, P. & Slade, S. (2014) Student data privacy and institutional accountability in an age of surveil-
lance. In M.E. Menon et al. (eds.), Using Data to Improve Higher Education. (pp. 197-214). London: Sense
Publishers.
QUELIT (2016) On the way to sustainable quality enhancement in learning and teaching. Comprehensive
set of performance indicators based on the ESG and an integrative comparison of the AHELO study, pro-
gram accreditation and the Creative Classroom Research Model. INQAAHE Research Project. Available at
https://www.evalag.de/en/research/ (accessed 11 September 2020).
Ramsden, P. (1991) A performance indicator of teaching quality in higher education: The Course Experi-
ence Questionnaire. Studies in Higher Education, 16(2), 129-150.
Ramsden, P. (1993) Theories of learning and teaching and the practice of excellence in higher education.
Higher Education Research and Development, 12(1), 87-97.
Richardson, J.T.E. (2005) Students' approaches to learning and teachers' approaches to teaching in higher education. Educational Psychology, 25(6), 673-680.
Ridder, H.-G. (2017) The theory contribution of case study research designs. Business Research, 10, 281-
305.
Rieckmann, M. & Bormann, I. (2020) Higher Education Institutions and Sustainable Development. Imple-
menting a Whole-Institution Approach. Basel: MDPI.
RNE [German Council for Sustainable Development] (2018) The Sustainability Code for Higher Education
Institutions. Berlin: RNE. Available at: https://www.hochn.uni-hamburg.de/-downloads/handlungs-
felder/nhb/2018-05-14-rne-kodex-eng.pdf (accessed 11 September 2020).
Rogers, T., Dawson, S. & Gasevic, D. (2016) Learning Analytics and the Imperative for Theory-Driven Research. In The SAGE Handbook of E-learning Research (pp. 232-250). Thousand Oaks: Sage.
Rowe, K. & Lievesley, D. (2002) Constructing and using educational performance indicators. Available at
https://research.acer.edu.au/learning_processes/11 (accessed 11 September 2020).
Rubel, A. & Jones, K.M.L. (2016) Student privacy in learning analytics: an information perspective, The In-
formation Society. An International Journal, 32, 143-159.
Rushkoff, D. (2010) Program or be Programmed: Ten Commands for a Digital Age. New York: OR Books.
Saeys, T. (2016) Integrated student lifecycle management. Higher education: gaining the edge through out-
standing student experiences. IT White paper. Diegem: Itelligence Business Solutions. Available at
https://itelligencegroup.com/wp-content/usermedia/4806_ITEL_WhitePaper_Education_WEB_fin.pdf (ac-
cessed 11 September 2020).
Scheffel, M. (2017) The Evaluation Framework for Learning Analytics. Maastricht: Datawyse (SIKS Disser-
tation Series No. 2017-34). Available at https://pdfs.seman-
ticscholar.org/43c8/912c4fac6c4273afd72d056052168b08afd2.pdf (accessed 11 September 2020).
Scheffel, M., Drachsler, H., Stoyanov, S. & Specht, M. (2014) Quality indicators for learning analytics. Edu-
cational Technology and Society, 17(4), 117-132.
Schneider, R., Szczyrba, B., Welbers, U. & Wildt, J. (2009) Change of Teaching and Learning Cultures [in
German]. Bielefeld, Bertelsmann.
Sclater, N. (2014) Code of Practice for Learning Analytics. A literature review of the ethical and legal issues.
London: JISC. Available at http://www.wojde.org/FileUpload/bs295854/File/07rp_54.pdf (accessed 4 July
2020).
Sclater, N. (2017) Learning Analytics Explained. London: Routledge.
Sclater, N. & Bailey, P. (2015) Code of Practice for Learning Analytics (JISC). Available at
https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics (accessed 11 September 2020).
Sclater, N., Peasgood, A. & Mullan, J. (2016) Learning Analytics in Higher Education. A Review of UK and
international Practice. Full Report. Bristol: JISC.
Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E. & Hirst, T. (2013) Innovating Pedagogy
2013: Open University Innovation Report 2 (No. 9781780079370). Milton Keynes: UK. Available at
http://www.open.ac.uk/blogs/innovating/ (accessed 11 September 2020).
Shavelson, R.J., Zlatkin-Troitschanskaia, O. & Mariño, J.P. (2018) Performance indicators of learning in
higher education institutions: an overview of the field, in E. Hazelkorn, H. Coates and A.C. McCormick
(eds.), Research Handbook on Quality, Performance and Accountability in Higher Education (pp. 249-263).
Cheltenham: Elgar Publishing.
Siemens, G. (2011a) Call for Papers of the 1st International Conference on Learning Analytics & Knowledge
(LAK 2011). Available at https://tekri.athabascau.ca/analytics/ (accessed 11 September 2020).
Siemens, G. (2011b) Learning Analytics: the emergence of a discipline. American Behavioral Scientist,
57(10), 1380-1400.
Slade, S. & Prinsloo, P. (2013) Learning analytics: ethical issues and dilemmas. American Behavioral Sci-
entist, 57(10), 1510-1529 (doi: 10.1177/0002764213479366).
Social Research Centre (2019) Quality Indicators for Learning and Teaching (QILT). Law Courts: Social Re-
search Centre (SRC). Available at https://www.qilt.edu.au/ (accessed 11 September 2020).
South Africa Higher Education Performance Indicators (2009 - 2013) Available at https://afri-
caopendata.org/dataset/south-africa-higher-education-performance-indicators-2009-2013 (accessed 11
September 2020).
SPJSGM [S P Jain School of Global Management] (2018) Management of Student Performance and Data
Policy and Procedures. Available at https://www.spjain.ae/hubfs/policies-pdf/management-of-student-performance-data-21-08-18.pdf?t=1536390925636 (accessed 18 April 2019).
SQELT [Sustainable Quality Enhancement in Higher Education Learning and Teaching] (2020) Sustainable
quality enhancement in higher education learning and teaching. Integrative core dataset and performance
data analytics. Erasmus+ Strategic Partnership. Available at https://www.evalag.de/sqelt/ (accessed 30 No-
vember 2020).
SQELT-MIO (2020) SQELT Main Intellectual Outputs. Erasmus+ Strategic Partnership. Available at
https://www.evalag.de/en/research/sqelt/intellectual-outputs/ (accessed 30 November 2020).
Stafford-Smith, M., Griggs, D., Gaffney, O., Ullah, F., Reyers, B., Kanie, N., Stigson, B., Shrivastava, P., Leach, M. & O'Connell, D. (2017) Integration: the key to implementing the Sustainable Development Goals. Sustainability Science, 12, 911-919.
Stea, A., Donche, V. & van Petegem, P. (2014) Understanding differences in teaching approaches in higher
education: An evidence-based perspective. Reflecting Education, 9(1), 21-35.
Tapia-Fonllem, C., Fraijo-Sing, B., Corral-Verdugo, V. & Ortiz Valdez, A. (2017) Education for sustainable
development in higher education institutions: Its influence on the pro-sustainability orientation of Mexican
students. SAGE Open, January-March 2017, 1-15. Available at https://jour-
nals.sagepub.com/doi/pdf/10.1177/2158244016676295 (accessed 17 September 2020).
7D\ORU-,PSURYLQJSHUIRUPDQFHLQGLFDWRUVLQKLJKHUHGXFDWLRQ7KHDFDGHPLFV¶SHUVSHFWLYH. Jour-
nal of Further and Higher Education, 25(3), 379-393.
Taylor, J. (2014) Informing or distracting? Guiding or driving? The use of performance indicators in higher
education. In M.E. Menon, D.G. Terkla & P. Gibbs (eds.) Using Data to Improve Higher Education (pp. 7-
24). Rotterdam: Sense Publishers.
Taylor, J. & Taylor, R. (2003) Performance indicators in academia: an x-efficiency approach? Australian
Journal of Public Administration, 62(2), 71-82.
Thille, C. & Zimmaro, D. (2017) Incorporating Learning Analytics in the classroom. New Directions for
Higher Education, 179, 9-17.
Tempelaar, D., Rienties, B. & Nguyen, Q. (2017) Adding dispositions to create pedagogy-based Learning Analytics. Zeitschrift für Hochschulentwicklung (ZFHE), 12(1), 15-35.
Tranberg, S., Hasselbalch, G., Kofod Olsen, B. & Byrne, S. (2018) Data Ethics. Principles and Guidelines
for Companies, Authorities and Organisations. Aarhus: AKAPRINT.
UNESCO (2013) International Standard Classification of Education. Fields of education and training
2013(ISCED-F 2013) ± Detailed field descriptions. Paris: UNESCO. Available at
https://unesdoc.unesco.org/ark:/48223/pf0000235049 (accessed 16 September 2020).
UNESCO (2014) UNESCO Roadmap for Implementing the Global Action Programme on Education for Sus-
tainable Development. Paris: UNESCO. Available at https://sustainabledevelopment.un.org/content/docu-
ments/1674unescoroadmap.pdf (accessed 11 September 2020).
UNESCO (2017) Education for Sustainable Development Goals. Learning Objectives. Paris: UNESCO.
Available at https://www.unesco.de/sites/default/files/2018-08/unesco_education_for_sustainable_develop-
ment_goals.pdf (accessed 11 September 2020).
UNGA [United Nations General Assembly] (1948) Universal Declaration of Human Rights (UDHR). Availa-
ble at http://www.un.org/en/universal-declaration-human-rights/ (accessed 18 September 2020).
UNGA [United Nations General Assembly] (2008) Universal Declaration of Human Rights. Adopted and proclaimed
by UN General Assembly Resolution 217 A (III) of 10 December 1948. Text: UN Document A/810, p.
71 (1948). Refugee Survey Quarterly, 27(3), 149-182.
UNSW Sydney [University of New South Wales Sydney] (2017) Data Governance Policy. Available at
https://www.gs.unsw.edu.au/policy/documents/datagovernancepolicy.pdf (accessed 18 April 2019).
UWL [University of West London] (2018) Learning Analytics Policy. Available at
https://www.uwl.ac.uk/sites/default/files/Departments/About-us/uwl_learning_analytics_policy_2018_0.pdf
(accessed 21 May 2019).
Wächter, B., Kelo, M., Lam, Q. K. H., Effertz, P., Jost, C. & Kottowski, S. (2015) University Quality Indica-
tors: A Critical Assessment. Brussels: Directorate General for International Policies, Policy Department B:
Structural and Cohesion Policies, Culture and Education.
Webber, M., Lynch, S. & Oluku, J. (2013) Enhancing student engagement in student experience surveys: a
mixed method study. Educational Research, 55(1), 71-86.
Whiteley, S. (2016) Creating a coherent performance indicator framework for the higher education student
lifecycle in Australia. In: R. Pritchard, A. Pausits & J. Williams (eds.), Positioning Higher Education Institu-
tions. From Here to There (pp. 143-160). Dordrecht: Sense Publishers.
:LOGW-³7KHVKLIWIURPWHDFKLQJWROHDUQLQJ´± Theses on the change of learning culture in modular
study structures [in German], in Ehlert, H. & Welbers, U. (Eds.), Quality Assurance and Study Reform [in
German] (pp. 168-178). Düsseldorf: Grupello.
Williamson, B. (2018) Number crunching: transforming higher education into 'performance data'. USSbriefs,
40, 1-8.
Yarkova, T. & Cherp, A. (2013) Managing the sacred? Perspectives on teaching excellence and learning
outcomes from an international postgraduate university. European Journal of Higher Education, 3(1), 24-39.
Yorke, M. (1991) Performance indicators: Towards a synoptic framework, Higher Education, 21, 235-248.
Yorke, M. (1998) Performance indicators relating to student development: Can they be trusted? Quality in
Higher Education, 4(1), 45-61.
Yorke, M. (2009) 'Student experience' surveys: some methodological considerations and an empirical in-
vestigation. Assessment & Evaluation in Higher Education, 34(6), 721-739.
Zilvinskis, J., Willis, J. & Borden, V.M.H. (2017) An overview of Learning Analytics. New Directions for Higher
Education, 179, 9-17.
Zimmerman, P.-A., Eaton, R. & van de Mortel, T. (2017) Beyond orientation: evaluation of student lifecycle
activities for first-year Bachelor of Nursing students. Collegian, 14, 611-615.
Zlatkin-Troitschanskaia, O., Pant, H.A., Kuhn, C., Toepper, M. & Lautenbach, C. (2016) Measurement of
Academically Mediated Competencies of Students and Graduates. An Overview of the National and Inter-
national Research Status [in German]. Wiesbaden: Springer.
Zlatkin-Troitschanskaia, O., Fischer, J., Lautenbach, C. & Pant, H. A. (2018) Studying today, qualifying further
tomorrow – Implications for continuing education from competence research [in German]. Weiterbildung, 4, 30-33.
Zlatkin-Troitschanskaia, O., Pant, H. A. & Coates, H. (Eds.) (2018) Assessing Student Learning Outcomes
in Higher Education. Abingdon: Routledge.
Zlatkin-Troitschanskaia, O., Shavelson, R. J. & Pant, H. A. (2018) Assessment of learning outcomes in
higher education ± International comparisons and perspectives. In C. Secolsky & B. Denison (Eds.). Hand-
book on Measurement, Assessment and Evaluation in Higher Education (2nd ed.). New York: Routledge.
Zlatkin-Troitschanskaia, O., Toepper, M., Pant, H. A., Lautenbach, C. & Kuhn, C. (Eds.) (2018). Assess-
ment of Learning Outcomes in Higher Education ± Cross-national Comparisons and Perspectives. Wiesba-
den: Springer.
Acknowledgment
The work on this Guideline on Performance Data Governance and Management in Higher Education Learning
and Teaching was carried out in the context of the project "Sustainable Quality Enhancement in Higher
Education Learning and Teaching" (acronym: SQELT; http://www.evalag.de/sqelt), which was co-funded by
the European Commission between 2017 and 2020. The SQELT consortium would like to thank the Euro-
pean Commission for valuable financial support which made the SQELT project possible. The consortium
partners would also like to express their gratitude to all engaged experts, in particular (in alphabetical or-
der): Maarja Beerkens (Leiden University, The Netherlands), Philipp Pohlenz (University of Magdeburg,
Germany), Cláudia Sarrico (CIPES, Matosinhos, Portugal), Bjørn Stensaker (University of Oslo, Norway).
They enriched the project by their critical analyses, contributions in internal meetings and presentations at
the SQELT International Evaluation Workshop in Krems, Austria (1-2 July 2019). The experts also added to
the exploitation and dissemination of the project's results by conference presentations and papers. Finally,
our thanks go to the participants of focus group discussions and the online survey for their willingness to
engage with the SQELT project's results.
This publication reflects the views only of the SQELT project partners. The creation of this resource has
been (partially) funded by the ERASMUS+ grant program of the European Union under Grant no. 2017-1-
DE01-KA203-…. Neither the European Commission nor the project's national funding agency DAAD
can be held responsible for the content or liable for any losses or damage resulting from the use of this
resource.
Appendix 1: SQELT Template of PDGM Policy
Appendix 1 presents a template for Performance Data Governance and Management (PDGM) Policy at
HEIs.
Template21
Performance Data Governance and Management Policy
(PDGMP)
of [insert name of higher education institution]
With Focus on Performance Data of Learning and Teaching,
including Learning Analytics, to be Accompanied by Supporting Documents
Policy Statement
Version: 1.0
Approved by: [insert name(s) of approving person(s)]
Approval date: [insert approval date]
Effective date: [insert effective date]
Next review: [insert date of next review]
Purpose
Performance Data Governance and Management Policies (PDGMPs) are collections of prin-
ciples that describe the rules to steer and manage the integrity, security, quality, and usage
of performance data22 during their lifecycle.
PDGMPs also define the roles and responsibilities of institutional/organisational staff, con-
tractors, and consultants with internal and external parties in relation to data access, re-
trieval, storage, disposal, and backup of institutional/organisational data assets.
This PDGMP is adapted, as far as necessary, to higher education institutions (HEIs), a specific type of complex
multiple-hybrid social organisation. Accordingly, the purpose of this PDGMP is to:
• Define the roles and responsibilities for different data creation and usage types, cases
and/or situations, and to establish clear lines of accountability;
• Develop good quality practices for effective data management, data (quality) lifecycle and
data protection;
• Protect the HEI's data against internal and external threats; particularly assure protection
of privacy, academic freedom, intellectual property, information security and compliance;
• Ensure that the HEI's data handling complies with applicable laws, regulations, exchange
and standards;
• Ensure that a data trail is effectively documented within the processes associated
with accessing, retrieving, exchanging, reporting, managing and storing of data.
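The last purpose above, documenting a data trail, can be illustrated as an append-only audit log. The following sketch is a hypothetical illustration only: the field names and the `record` helper are assumptions, not part of the SQELT template or of any particular institution's systems.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """One entry per data event; all field names are illustrative."""
    actor: str      # the Data User performing the action
    action: str     # e.g. "access", "retrieve", "export", "store"
    dataset: str    # identifier of the data asset touched
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# An append-only log keeps every data action traceable and auditable.
audit_log: list[AuditEntry] = []

def record(actor: str, action: str, dataset: str) -> AuditEntry:
    """Append one audit entry and return it."""
    entry = AuditEntry(actor, action, dataset)
    audit_log.append(entry)
    return entry

record("data.steward@example.edu", "export", "lt-performance-2020")
```

In practice such a trail would live in the institution's IT systems rather than in application memory; the point of the sketch is only that each access, retrieval, exchange, report, or storage event leaves one traceable record.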
21 This template is strongly inspired by and closely follows UNSW Sydney (2017). Further sources used include Alhassan et al.
2018; Johnson 2017; Khatri & Brown 2010; Kim & Chang 2019; Leiber 2019a; MUNZ 2015; Otto 2011; Prinsloo & Slade 2014;
SPJSGM 2018; UWL 2018.
22 Throughout this document, "data" generally refers to numerical data and qualitative information. For example, relevant "data"
may be generated by and derived from performance indicators, which comprise quantitative as well as qualitative indicators
(Leiber 2019a). Furthermore, the focus is on "data" which are relevant for and related to institutional performance.
Scope
This PDGMP applies to, but is not limited to, all institutional performance data of learning and
teaching at [insert name of HEI]. Management of data associated with academic research activity,
Third Mission and University administration will be covered by respective policies (e.g.
Research Data Management Policy; Third Mission Data Management Policy; Administration
Data Management Policy) which will address the specific requirements at more detailed
levels.
This PDGMP covers, but is not limited to, institutional performance data in any form, including
print, electronic, audio visual, backup and archived data.
This PDGMP applies to all staff, contractors and consultants of [insert name of HEI].
Policy Provisions
Background Information
Institutional data, including performance data of learning and teaching, is a strategic asset of [insert name of
HEI], and the appropriate governance for management and use of these data is critical to the institution's
operations. Lack of governance can lead to operational inefficiencies and could expose [insert name of
HEI] to unwanted risks.
This PDGMP is guided by principles that should be adhered to in order to support improvements in managing and
securing data across the institution. This PDGMP may be a component of an even more comprehensive
Data Governance Framework, which allows the institution to better leverage its data quality activities, business
processes and capabilities through improving the oversight, guidance and quality of data. Such a framework
would have to be approved and endorsed by a Data Governance Steering Committee.
Policy Framework and Principles
The following framework outlines the principles and minimum standards that guide the data governance and
management procedures of [insert name of HEI] and must be adhered to by all institutional staff. The Data
Policy Framework may be subdivided and consists of the four areas of Data Governance and Ownership,
Data Quality and Integrity, Data Classification and Security, and Data Terms and Definitions (see Figure 3).
Figure 3: Data Policy Framework (source: UNSW Sydney 2017, p. 2)
The University is committed to the ethical use of all institutional data. The principles and minimum standards
that guide the data governance and management procedures of [insert name of HEI] are informed by
the University's mission and values as set out in the University Strategy (e.g. Structure and Development
Plan) and Mission Statement(s).
Governance and Ownership
It is indispensable that the roles of stakeholders in Data Governance and Management (DGM) are clearly
defined and ownership and responsibilities are explicitly ascribed to individual actors and organisational
units. Accordingly, the data governance roles and responsibilities of [insert name of higher education institu-
tion] are defined in the following Table.
Data Governance Role
Data Governance Responsibility
Data Governance Steering Committee (DGSC)
The Data Governance Steering Committee is the strategic decision-making body for data governance and oversees the implementation of the Data Management Framework. The DGSC is the cross-functional committee that makes policy decisions and includes senior representation for core services, records management, privacy officer and technical stakeholders. The DGSC is a University-wide committee, with members consisting of Data Executives, Data Stewards and designated Data Users, senior academic and professional staff. The DGSC has oversight of the Data Governance Programme and is responsible for approving and endorsing the procedures related to the Data Governance Policy. The DGSC also assures appropriate data processes are used in all of the University's data-driven decisions.
The DGSC is responsible for the overall strategy of the Data Governance of [insert name of HEI].
The DGSC members are responsible for (MUNZ 2015)
• defining the University's strategic priorities,
• 'approving and endorsing any data policies, processes, standards and guidelines',
• 'planning and sponsoring data management projects and services',
• 'communicating and promoting the value of data assets',
• 'managing and resolving data related issues that cannot be resolved by the data custodian',
• 'monitoring compliance to policies, standards and regulatory legislation'.
Chief Data Officer
The Chief Data Officer is responsible for the overall management of the Data
and Information Governance of [insert name of HEI].
Data Executive
A Data Executive supported by a Data Owner has the responsibility for the
management of data assigned within their portfolio. They are expected to
undertake any strategic work on data management.
Data Owner
The Data Owner has operational responsibilities in assisting Data Stewards
with day-to-day data administration activities, including, but not limited to,
developing, maintaining, distributing, and securing institutional data. Data Owners are
expected to have high-level knowledge and expertise in the content of data
within their responsible area. This role is also the organisational Data Cus-
todian.
Data Owners are delegated by a Data Executive, and are responsible for
ensuring effective local protocols are in place to guide the appropriate use
of their data asset. Access to, and use of, institutional data will generally be
administered by the appropriate Data Owner. Data Owners (or a delegated
Data Steward) are also responsible for ensuring that all legal, regulatory,
and policy requirements are met in relation to the specific data or information
asset. This includes responsibility for the classification of data in accordance
with the Data Classification Standard23.
Data Owners are responsible for ensuring that data conforms to legal,
regulatory, exchange, and operational standards.
The Data Owner must ensure the process for the administration of data is
in accordance with the Data Quality Lifecycle (refer to Addendum 1).
23 A supporting document.
Chief Data Officers, Data Executives and Data Owners are responsible for
(MUNZ 2015)
• 'assigning Data Stewards for data in their area of responsibility and allowing time for them to complete relevant tasks',
• 'managing and resolving data related issues that cannot be resolved by the Data Steward',
• 'authorising security classification of assigned data, e.g. public, restricted',
• 'authorising access to assigned data and its usage in other systems',
• 'identifying and registering personally identifiable information (PII) contained in data sources',
• 'applying and managing the ethical processes (where required)',
• 'ensuring that data is fit-for-purpose including defining data quality levels, metrics, business rules and facilitating data integration'.
Data Steward
A Data Steward is a member of the Executive who oversees the capture, maintenance and dissemination of data for a particular organisational unit. Data Stewards are responsible for assuring the requirements of the Data Governance Policy and the Data Governance Procedures are followed within their organisational unit. Data Stewards do the day-to-day management of the operational requirements, data quality, compliance with requirements, conformance to policies and standards, security controls, and identifying and resolving data issues. They undertake any operational work on data management.
Every data area must have one or more Data Stewards, who are responsible for the quality and integrity, implementation and enforcement of data management within their Division, Faculty, Centre or research project.
The Data Steward will classify and approve the access, under delegation from a Data Owner, based upon the appropriateness of the user's role and the intended use. Where necessary, approval from the Data Executive/Data Owner may be required prior to authorisation of access.
Data Stewards are responsible for (MUNZ 2015)
• implementing and monitoring access to corporate data according to approved organisational rules and processes,
• 'implementing and maintaining data quality requirements for assigned datasets',
• 'identifying and helping to resolve data issues, risks and errors, escalating to Data Custodian when required',
• 'arranging appropriate training for staff to ensure that data is captured and used accurately and appropriately',
• 'providing input into data policies, standards and procedures',
• 'providing advice and permission for data when it is republished, duplicated or integrated with other systems',
• 'ensuring that all data that is not a record has an appropriate retention and disposal schedule',
• 'championing the implementation of data management standards and processes'.
Data Custodian
A Data Custodian is a member of the Executive who oversees the capture, maintenance and dissemination of data for a particular organisational unit. Data Custodians are responsible for assuring the requirements of the data structure, storage, processing, access and security within their organisational unit. They undertake any operational work on data management.
Data Custodians are responsible for (MUNZ 2015)
• 'assigning Data Stewards for data in their area of responsibility and allowing time for them to complete relevant tasks',
• 'managing and resolving data related issues that cannot be resolved by the Data Steward',
• 'authorising security classification of assigned data, e.g. public, restricted',
• 'authorising access to assigned data and its usage in other systems',
• 'identifying and registering personally identifiable information (PII) contained in data sources',
• 'applying and managing the ethical processes (where required)',
• 'ensuring that data is fit-for-purpose including defining data quality levels, metrics, business rules and facilitating data integration'.
Data Specialist (Information Technology Services)
Data Specialists are business and technical Subject Matter Experts (SMEs) in relation to the data or information asset. The SMEs under the Management and Operations category (refer to Addendum 2) are Business or Information Technology Specialists who will be responsible for providing ongoing support to [insert name of HEI] operational systems, data or informational assets.
Information Technology Services are responsible for (MUNZ 2015)
• 'promoting the value of University data for enterprise use while facilitating sharing, integration and security',
• 'providing data architecture services that leverage a data reference model, manage an enterprise data classification and create an inventory of corporate data assets',
• 'documenting, managing and distributing data models (conceptual, logical and/or physical) of University data including relevant metadata and data integration',
• 'designing and managing processes for maintaining the integrity, accuracy, timeliness, consistency, standardisation and value of data including master data management',
• 'providing advice and support for the data stewards, data custodians and the data governance committee members',
• 'designing, managing and implementing data management policies, standards and frameworks in consultation with relevant key stakeholders',
• 'creating and/or reviewing database designs that align with relevant standards ensuring the principles are applied, e.g. access and security',
• 'providing storage infrastructure and operational database support including managing the database management system platform, backing up databases and monitoring and improving database performance',
• 'providing guidance on use and implications of the different storage options to ensure it is fit-for-purpose',
• 'ensuring appropriate procedures and systems are in place to support business continuity and disaster recovery'.
Data User
A Data User is any staff member, contractor, consultant, or authorised agent who accesses, inputs, amends, deletes, extracts, and analyses data in the institution's IT system to carry out their day-to-day duties. Data Users are not generally involved in the governance process but are responsible for the quality assurance of data. Appropriate security and approval are required from Data Stewards to maintain the quality and integrity of the data.
Data Users are responsible for (MUNZ 2015)
• 'complying with data management policies, processes and standards',
• 'ensuring that all data access and usage is relevant and appropriate to the work being undertaken',
• 'observing any access controls or security restrictions that apply to the data to which they have access, and only working within the data access boundaries that they have been permitted',
• 'preventing any unauthorised access to data to which they have access rights and ensuring that confidential or restricted data is always protected without disclosure to any unauthorised persons'.
Data Researcher
Original research data and primary materials generated in the conduct of research at [insert name of HEI] are owned and retained by [insert name of HEI], subject to any contractual, statutory, ethical, or funding body requirements. Researchers are permitted to retain a copy of the research data and primary materials for future use, subject to any contractual, statutory, ethical or funding body requirements.
Quality and Integrity
Data Researchers and Data Users must ensure appropriate procedures are followed to uphold the quality and integrity of the data they access.
Data records must be kept up to date throughout every stage of institutional performance and operations, in an auditable and traceable manner. Data should only be collected for legitimate uses and to add value to [insert name of HEI] and its various stakeholders. Extraction, manipulation and reporting of data must be done only for institutional purposes. In general, these include learning and teaching, research, third mission, and administration.
Where appropriate, before any data (other than publicly available data) is used or shared outside [insert name of HEI], verification with the Data Steward is required to ensure the quality, integrity and security of data will not be compromised.
Data shall be retained and disposed of in an appropriate manner in accordance with the University's Recordkeeping Policy24 and associated procedures under applicable governmental laws and regulations.
Classification and Security
Staff, contractors and consultants should refer to the Data Classification Standard25 and the Data
Handling Guideline26 for further information.
Appropriate data security measures (see Data Classification Standard) must be adhered to at all
times to assure the safety, quality and integrity of University data.
Personal use of institutional data, including derived data, in any format and at any location, is
prohibited.
Records stored in an electronic format must be protected by appropriate electronic safeguards and/or
physical access controls that restrict access only to authorised user(s). Similarly, data in the Univer-
sity Data repository (databases etc.) must also be stored in a manner that will restrict access only to
authorised user(s).
This Policy applies to records in all formats (paper, digital or audio-visual) whether registered files,
working papers, electronic documents, emails, online transactions, data held in databases or on tape or
disks, maps, plans, photographs, sound and video recordings, or microforms.
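The restriction of access to authorised users described above can be sketched as a simple clearance check. The levels 'public' and 'restricted' are mentioned in this Guideline; the 'confidential' tier, the role names, and the clearance mapping below are hypothetical assumptions that an actual Data Classification Standard would define.

```python
# Ordered classification levels: higher rank means more sensitive data.
CLASSIFICATION_RANK = {"public": 0, "restricted": 1, "confidential": 2}

# Hypothetical clearances a Data Steward might assign per governance role.
ROLE_CLEARANCE = {
    "data_user": "public",
    "data_steward": "restricted",
    "data_owner": "confidential",
}

def may_access(role: str, data_classification: str) -> bool:
    """A role may access data classified at or below its clearance level."""
    clearance = ROLE_CLEARANCE.get(role, "public")  # default to lowest tier
    return CLASSIFICATION_RANK[clearance] >= CLASSIFICATION_RANK[data_classification]

# Example: a Data User may read public data but not restricted data.
assert may_access("data_user", "public")
assert not may_access("data_user", "restricted")
```

The ordering of levels is the essential design choice here: because clearances are comparable, a single rule ("at or below clearance") replaces a case-by-case access matrix.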
Terms and Definitions
The definitions and terms used to describe different types of data should be applied consistently or refer to
the relevant Glossary of the institution.
Addendum: Additional Principles of Learning Analytics
Here, additional principles of Learning Analytics can be integrated into the PDGM Policy; such principles
are presented in Appendix 4.
Policy Review
This PDGMP will be reviewed and updated every three (3) years from the approval date, or more frequently
if appropriate. In this regard, any staff members who wish to make any comments about the Policy may for-
ward their suggestions to the Responsible Officer.
Further Assistance
Any staff member who requires assistance in understanding this Policy should first consult their nominated
supervisor who is responsible for the implementation and operation of these arrangements in their work
area. Should further assistance be needed, the staff member should contact the Responsible Officer for
clarification.
24 A Recordkeeping Policy is a set of rules to control document, information and data (quality) lifecycle in an organisation, from
the moment it is created or received, until it is stored for historical reference or destroyed.
25 A supporting document.
26 A supporting document.
Accountabilities
Responsible Officer
[insert function and/or name of the HEI's PDGM responsible officer, e.g. Director
Planning and Performance of Learning and Teaching]
Contact Officer
Chief Data Officer
Supporting Information
Legislative Compliance
7KLV3ROLF\VXSSRUWVWKH8QLYHUVLW\¶VFRPSOLDQFHZLWKWKHIROORZ
ing legislation: [insert respective legislation information, if appli-
cable].
Supporting Documents
Metadata Documentation (e.g. data dictionary, metadata standards, the-
sauri)
Data Classification Standard and Data Handling Guideline, comprising but
not restricted to Performance Indicators for Learning and Teaching, includ-
ing their Definition, Theoretical Background and Operational Description
IT Security Policy ± Information Security Management System (ISMS)
IT Security Standards
Ethics Framework of PDGM, including but not restricted to the University's
Performance Data Ethics Framework, Risk Management Framework
(Further) Definitions and Acronyms
To establish operational definitions and facilitate ease of reference, the following terms are defined:
Business Intelligence
A set of theories, methodologies, architectures, and technologies that transform
and integrate raw data into meaningful and useful information, providing
the business community with easy access to data that supports decision
making.
Business Subject Matter
Experts (SMEs)
Refer to Data Specialist.
Chief Data Officer (CDO)
Senior officer of the University responsible for Data and Information Govern-
ance.
Data (corporate, institu-
tional)
"Institutional data" can be understood as facts, figures or individual pieces of
information that are captured through the operation of the University and can
be words, numbers or images etc. Data is the raw detail that can be used to
represent information or, from which, information can be derived. All data
needs to be managed regardless of whether it is structured or unstructured.
In a slightly more specified sense, institutional data is the representation
of facts, concepts or instructions in a formalised (consistent and agreed)
manner suitable for communication, interpretation or processing by hu-
man or automatic means. Typically comprised of numbers, words or im-
ages. The format and presentation of data may vary with the context in
which it is used.
Data is not Information until it is used in a particular context for a par-
ticular purpose.
In the context of this policy this term includes all institutional data including
research, administrative, and learning and teaching artefacts.
Data Access or Re-
striction
The right to read, copy, or query data.
SQELT ± ERASMUS+ Project 2017-20 ± Intellectual Output O11: PDGM in HE L&T: A Guideline 38/95
Data Area
A Data Area is a term used to denote a subset of institutional data that is
the responsibility of a team including Data Owner and Data Stewards. This
could include an entire IT system (e.g. Human Resources system) or a
business area such as Identity and Access Management, or a Research
project. It may include data that is the responsibility of University Divisions,
such as Finance, HR, Library, Learning and Teaching, Third Mission, and
Research etc.
Data Creator
Individuals and organisational units creating the data.
Data Executive
A Senior Executive with planning and decision-making authority for
part or all of the University's institutional data.
Data Governance
Data governance is the strategic and formal management of data assets
through the establishment of processes, maintenance of common stand-
ards and exercising positive control; it includes managing, improving,
monitoring and protecting an organisation's data and information. This
then supports business intelligence and master data management initia-
tives, facilitates migration of legacy data, meets compliance and legisla-
tive requirements and improves corporate flexibility and business agility.
Data Integrity
Refers to the accuracy and consistency of data over its entire lifecycle.
Data Management
Data management is the function that develops, manages and executes
policies, processes, standards and frameworks that collect, protect, de-
liver, and enhance the value of data and information assets to meet the
data availability, quality and security needs of the University.
Data Quality Lifecycle
Refers to the process for planning, creating, managing, storing, imple-
menting, protecting, improving and disposing of all institutional data of
the University.
Data Quality
Refers to the degree to which data is fit for its intended uses, typically
assessed against criteria such as accuracy, completeness, consistency,
timeliness and validity.
Data Reference Model
A data reference model is a data architecture framework that enables
information sharing and reuse through standard description and identification
of common data. It promotes good data management practices. The reference
model may, for example, leverage the framework developed by the US
Federal Government.
Data Security
Refers to the safety of (institutional) data in relation to the following crite-
ria: Access control; Authentication; Effective incident detection, reporting
and solution; Physical and virtual security; and Change management
and version control.
Information Security
Management System
(ISMS)
,QUHVSRQVHWRWKH8QLYHUVLW\¶V Data Classification and Handling require-
ments, the ISMS provides Information Security governance and sets out
people, process and technology related controls to assure the confidential-
LW\LQWHJULW\DQGDYDLODELOLW\RIWKH8QLYHUVLW\¶V data. The ISMS is a re-
sponse to the Universitys Data Classification and Handling requirements.
Moreover, the deployment and measurement of ISMS controls provides
input into the risk management process enabling informed business deci-
sions.
Management Board (MB)
The senior executive team of the University.
Metadata
Metadata is structured data that describes and/or enables finding, managing, controlling, understanding or preserving other information over time. Metadata includes, but is not restricted to, characteristics such as the content, context, structure, access, and availability of the data.
Revision History
Version | Approved by | Approval date | Effective date | Sections modified
1.0 | [insert name(s) of approving person(s)] | [insert approval date] | [insert effective date] | [insert name(s) of modified section(s)]
1.1 | [insert name(s) of approving person(s)] | [insert approval date] | [insert effective date] | [insert name(s) of modified section(s)]
… | … | … | … | …
Personally Identifiable Information (PII)
Personally Identifiable Information is data contained in University systems that is private information as defined by applicable privacy and data protection laws and regulations.
Record (Institutional Record)
Metadata records stored in any digital format; any document or other source of information compiled, recorded or stored in written form or on film, or by electronic process, or in any other manner or by any other means.
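The glossary's notion of metadata (structured data describing other data by content, context, structure, access and availability) can be sketched as a simple record type. This is a minimal illustration only: the class name, field names and example values are assumptions for this sketch, not part of any standard or of the Guideline itself.

```python
from dataclasses import dataclass

# A minimal sketch of an institutional metadata record, following the
# glossary's listed characteristics (content, context, structure, access,
# availability). All field names and values are illustrative assumptions.
@dataclass
class MetadataRecord:
    content: str        # what the described data are about
    context: str        # organisational unit / process that produced them
    structure: str      # format or schema of the underlying data
    access: str         # who may access the described data
    availability: str   # where and how the data can be retrieved

record = MetadataRecord(
    content="Graduate employment survey 2020",
    context="Quality Management Office",
    structure="CSV, one row per respondent",
    access="PD stewards and PD custodians",
    availability="institutional data warehouse",
)
print(record.content)
```

Such a record could accompany any institutional dataset so that it remains findable and interpretable over time, as the glossary definition requires.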
Addendum 1: Data Quality Lifecycle
The Data Quality Lifecycle refers to the processes for planning, creating, managing, storing, implementing,
using, protecting, improving, archiving and disposing of all institutional data, including performance data of
L&T (Figure 4).
Figure 4: Data Quality Lifecycle (source: UNSW Sydney 2017, p. 7)
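The lifecycle stages named above can be sketched as an ordered enumeration. This is an illustrative sketch only: the linear ordering and the helper function are assumptions for the example, since in practice stages such as "using" and "improving" recur rather than follow one another strictly.

```python
from enum import Enum
from typing import Optional

# The Data Quality Lifecycle stages named in Addendum 1, sketched as an
# ordered enumeration; the strictly linear ordering is an illustrative
# assumption made for this example.
class LifecycleStage(Enum):
    PLANNING = 1
    CREATING = 2
    MANAGING = 3
    STORING = 4
    IMPLEMENTING = 5
    USING = 6
    PROTECTING = 7
    IMPROVING = 8
    ARCHIVING = 9
    DISPOSING = 10

def next_stage(stage: LifecycleStage) -> Optional[LifecycleStage]:
    """Return the next stage in the sketched linear ordering, or None at the end."""
    ordered = list(LifecycleStage)  # Enum preserves definition order
    i = ordered.index(stage)
    return ordered[i + 1] if i + 1 < len(ordered) else None

print(next_stage(LifecycleStage.USING).name)
```

A HEI adapting this sketch would replace the linear succession with whatever stage transitions its own data governance policy defines.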
Addendum 2: (Performance) Data Governance Roles and Responsibilities
Figure 5 gives an overview of one possible scheme of how strategic, operational and support functions and responsibilities of PDGM may be distributed within a HEI.
Figure 5: Data governance roles and responsibilities: management and operations (source: UNSW Sydney
2017, p. 8)
Complementing the above considerations, Table 7 presents some additional information about domains, domain-related decisions and potential roles of responsibility in the context of performance data and PDGM.
Table 7: Framework for (performance) data decision domains, adopted with revisions from (Khatri & Brown 2010, p. 149)
PDG(M) domain: Data principles and responsibilities – clarifying the role of performance data (PD) as an asset and the responsibilities
Domain decisions:
- What are the uses of PD for the organisation (i.e., the university)?
- What are the mechanisms for communicating organisational uses of PD on an ongoing basis?
- What are the desirable behaviours for employing PD as assets?
- How are the opportunities for sharing and reuse of PD identified?
- How does the regulatory environment influence the organisational uses of PD?
Potential roles or locus of responsibility:
- PD owner, individual and organisational
- PD producer/supplier
- PD processor and dresser (e.g., ranker)
- PD steward
- PD custodian
- PD consumer
- Organisational PD committee/council

PDG(M) domain: Data quality, including data processes and technology – establishing the requirements of intended use of PD
Domain decisions:
- What are the standards for PD quality with respect to accuracy, timeliness, completeness and credibility?
- What is the strategy for establishing and communicating PD quality?
- How will PD quality, as well as the associated strategy, be evaluated?
Potential roles or locus of responsibility:
- PD owner, individual and organisational
- PD subject matter expert
- PD quality manager
- PD quality analyst

PDG(M) domain: Data interpretation – establishing the semantics of PD to make it interpretable
Domain decisions:
- What is the program for documenting the semantics of PD?
- How will PD be consistently defined and modelled so that it is interpretable?
- What is the plan to keep different types of meta-PD up to date?
Potential roles or locus of responsibility:
- Organisation PD architect
- Organisation PD modeller
- PD modelling engineer
- PD architect
- Organisation architecture committee

PDG(M) domain: Data access – specifying access requirements of PD
Domain decisions:
- What is the organisational value of PD?
- How will risk assessment be conducted on an ongoing basis?
- How will assessment results be integrated with the overall compliance monitoring efforts?
- What are PD access standards and procedures?
- What is the program for periodic monitoring and audit for compliance?
- How is security awareness and education disseminated?
- What is the program for backup and recovery?
Potential roles or locus of responsibility:
- PD owner, individual and organisational
- PD beneficiary
- Chief information security officer
- PD security officer
- Technical security analyst
- Organisation architecture development committee

PDG(M) domain: Data (quality) lifecycle – determining the definition, production, retention and retirement of PD
Domain decisions:
- How is PD stored and inventoried?
- What is the program for PD definition, production, retention, and retirement for different types of PD?
- How do the compliance issues related to legislation affect PD retention and archiving?
Potential roles or locus of responsibility:
- Organisation PD architect
- Information chain manager
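The decision-domain/role mapping in Table 7 can be sketched as a simple lookup structure. This is an abbreviated illustration only: the dictionary keys and the shortened role lists are assumptions for the sketch, not a complete rendering of the table.

```python
# A sketch of Table 7's decision-domain/role mapping as a lookup structure;
# keys and the abbreviated role lists are illustrative only.
domain_roles = {
    "data principles": ["PD owner", "PD steward", "PD custodian", "PD committee"],
    "data quality": ["PD owner", "PD quality manager", "PD quality analyst"],
    "data interpretation": ["PD architect", "PD modeller"],
    "data access": ["PD owner", "PD security officer", "CISO"],
    "data lifecycle": ["PD architect", "Information chain manager"],
}

def responsible_roles(domain: str) -> list:
    """Look up which roles hold responsibility for a PDG(M) decision domain."""
    return domain_roles.get(domain.lower(), [])

print(responsible_roles("Data access"))
```

A HEI could extend such a structure into a full responsibility matrix, adding per-domain decision questions alongside the roles.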
Appendix 2: SQELT Performance Data Management (PDM) Systems
Model
For a Performance Data Management (PDM) systems model – with a focus on L&T – several considerations should be taken into account:
1. Teaching is easier than learning: "Teaching" appears to represent the easier task, in the sense that performance data on teaching are already more and better established. Quality criteria and expectations for teaching already exist, are well elaborated and have been implemented for a longer period of time, so that routines for assessing them are in place.
2. Learning is more complicated than teaching: Learning is obviously the more complicated and less consensual aspect of L&T. To begin with, learning is not restricted to teaching but can also refer to other aspects and activities, such as research. L&T, i.e. learning in connection with teaching, emphasises learning with regard to teaching, or learning that is based on teaching. Without teaching data, therefore, a modelling of learning in teaching is not possible. Different metaphors may apply here: teaching can be regarded as a type of primary data (to a certain extent), and learning can be seen as meta-data (also to a certain extent).
3. A gradual conversion from teaching to learning: In a certain sense, there is a grey area of overlap between complex performance data of teaching and performance data of learning (based on teaching). Decisions must therefore be taken – by the concrete HEI itself, or by actors, decision-makers, analysts or researchers from outside the institution – on where and when performance data of teaching convert into performance data of learning. For clarification: learning is even more complex (comprehensive, aggregated) than teaching alone. Performance data in connection with learning therefore try to create an overview over time, assess whether changes have occurred, and evaluate whether such changes qualify as improvements (or not). Phrased as a simple formula: learning may be depicted as a process of innovation which leads to (or results in) some type of betterment or improvement (Campbell & Pantelić).
4. The organisational context of PDM: The systems model of PDM of course brings the concrete organisation into play, i.e. the concrete HEI. There are different options for how and where such a PDM may be set up, implemented and further developed within the organisation; PDM may be rooted in only one organisational unit, or may cross-connect different organisational units of the same institution. A systems model of PDM requires additional attention:
4.1. Data inform, but do not govern: L&T performance data can inform the management of L&T data, as well as decision-making and governance, but L&T performance data cannot (automatically) generate decision-making and governance by themselves. Good governance and decision-making should refer to performance data in L&T, but it is not the data that create the decision.
4.2. Epistemic governance within a learning organisation: A HEI should regard itself as a learning organisation. This also must be taken into account for a systems model of PDM. The implication is that data are not only defined, but that the underlying conceptual understanding of the data is also made explicit. For example, the governance approach of "Epistemic Governance" (Campbell & Carayannis 2013) requires this explicitly. Particularly with regard to learning (and here even more so than for teaching), it must be demonstrated why data on learning really qualify as performance data on learning. But of course, performance data on teaching also need a conceptual explanation.
4.3. Quality assurance and quality development (quality enhancement) of performance data management: A systems model of management of performance data in teaching and learning implies that structures and processes of quality assurance and quality development (in connection with organisational development) are in place. Quality assurance can reflect on the accuracy of the L&T data (and indicators). Quality development can reflect on how to improve L&T data (and indicators), in the sense of progressing toward next-stage or next-generation data with advanced requirement purposes. The aspect of learning marks the one great frontier here.
4.4. Management and governance of performance data: Epistemic governance also highlights that there should be an explicit understanding of how a systems model of PDM in L&T relates to Performance Data Governance and Management (PDGM) based on L&T. In other words: are performance data of L&T being used for governance, and if so, in which way? There should be no surprises for faculty, staff, students and graduates at a HEI. Good governance requires fair and transparent conditions (which are not changed or altered in unfair ways). Data protection is a necessary standard for quality. Well-balanced interactions of PDM and PDGM require good designs of an integrated quality assurance and quality development, so that governance of a PDM of L&T can really contribute to further organisational development in HEIs.
Figure 6 shows an exemplary visualisation of a Digital Performance Data Management (DPDM) system that
is implemented at one of the SQELT partner universities, University of Aveiro. In this case example DPDM
is embedded into a 100% wireless digitalised infrastructure and a 100% digitalised BackOffice.
Figure 6: DPDM system model at the University of Aveiro – Information system and software solutions (adopted from the university's presentations at SQELT Transnational Project Meetings)
At the base is a digital system for the management and monitoring of networks and systems (for data processing), which underlies the documentation of management activities and workflow in the university. The latter are realised by several basic performance areas, depicted as rectangular vertical columns in Figure 6. Among them are, for example, SIG-UA (Management of Georeferenced Infrastructures), e-LEARNING, PACO (the Academic Online Portal) and SInBAD (Information System for the Library and Digital Archives). All these are accessible by authorised users via the interaction platform ContactUA.
As shown in Figure 6, these building blocks constitute the FrontOffice, which is 100% available. On top, there is the digitalised PI system, which receives relevant information and data from all DPDM system levels. Included in this DPDM system is a distribution of responsibilities and related (justified) accessibilities of the various data and information. Typical stakeholders involved at the various DPDM system levels are also schematically indicated in Figure 6 (left and right of the pyramid).27
As already mentioned in the main text, within the SQELT case study the benchlearning area of (D)PDM systems could not be fully exploited for lack of available resources. Accordingly, there is room for users of this Guideline to improve on this subject.
27 This DPDM system was explained in more detail by the project partner University of Aveiro at various SQELT Transnational
Project Meetings.
Appendix 3: SQELT Comprehensive PI Set for L&T in Higher Educa-
tion
Pragmatic Understanding of Performance Indicators and Working Definitions
About the Indispensability and Pragmatic Understanding of Performance Indicators
Underlying the present piece of work is the following pragmatic understanding of performance indicators
(PIs):
Performance indicators 'represent qualitative and quantitative information and data which indicate functional qualities ("performance") of institutional, organisational or individual performance providers'. Thus, 'PIs provide information about the degree to which quality performance objectives are being met' (Leiber 2019a, p. 77).
In other words, a performance indicator gives an indication of some performance (or performance pre-con-
dition) of an individual or an organisation, for example, in the context or framework of a project, programme,
product or other initiative. Typically, a performance indicator is related to points of reference such as stand-
ards and goals against which the measured value of the indicator and thus the achieved degree of perfor-
mance or success is assessed.28
PIs, particularly qualitative PIs, can be used to monitor performance and performance capacity (and aspects of these) for comparative purposes, to facilitate the assessment of institutional operations, and to provide evidence for quality assurance and improvement. More complex PIs, in particular, rely to a greater extent on points of reference (e.g. objectives, assessments, comparators). Accordingly, such PIs are generated by more complex procedures and related measures (e.g. structured surveys and interviews, focus group discussions, expert assessments) and are intended to provide a complexity-adequate indication of a state or process.
Depending on the complexity of the activity, project, programme or organisation under scrutiny, the performances to be looked at can be very different, and PIs can therefore cover a wide range of measures of different complexity: from pure performance figures (numerical values; quantitative PIs)29 to complex qualitative performance information based on the measurement and collection of qualitative information (qualitative PIs)30 (Cave et al. 1997). Usually, by convention, a PI only refers to past performances (the measurement being descriptive or lagging), while an 'indicator' of future performance may be called a prognosticator, which cannot be based mainly on factual achievements.
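The assessment of a quantitative PI against a point of reference, as described above, can be sketched in a few lines. The PI is the student-workplace ratio mentioned in footnote 29; the function names, figures and target value are hypothetical illustrations, not values prescribed by the Guideline.

```python
# A minimal sketch: compute a quantitative PI (student workplaces per
# student, cf. footnote 29) and assess it against a point of reference
# (a target value). All names and numbers are illustrative assumptions.
def workplaces_per_student(workplaces: int, students: int) -> float:
    """Return the ratio of student workplaces to the student population."""
    if students <= 0:
        raise ValueError("student population must be positive")
    return workplaces / students

def meets_target(measured: float, target: float) -> bool:
    """Assess the measured PI value relative to a standard or goal."""
    return measured >= target

ratio = workplaces_per_student(workplaces=1200, students=8000)
print(meets_target(ratio, target=0.10))
```

The comparison step is where the "point of reference" enters: the same measured value can indicate success or failure depending on the standard the institution has set.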
In this sense, performance data management models based on PIs represent one specific modelling per-
spective which seems to be indispensable for any systematic approach to quality assurance (QA) and qual-
ity development in HEIs because
- 'PIs reflect the quality goals ("targeted performance") of institutions, institutional units and programmes, and without setting such goals it seems impossible to systematically improve quality' (Leiber 2019a, p. 77);
- 'PIs open the way to objectify communication and operationalisation of quality relevant features and, in the case of quantitative PIs, measure them' (Leiber 2019a, p. 77);
- PIs can be used in various performance models in HEIs, such as quality audit, accreditation and performance reporting. In this way, PIs can be used by HEIs to provide information for internal QA
28 It should be noted that a clearer and more condensed, and at the same time widely agreed, definition of the term "performance indicator" in higher education L&T is currently not available. However, this is not a serious problem, because definitions are not empirically true or false, or more or less reliable, but merely fulfil pragmatic functions of conceptual clarity and of facilitating communication.
29 An example of such a quantitative performance indicator is: Number of student workplaces held in a university's facilities in relation to the student population of the university and/or per subject field and/or per study programme.
30 An example of such a qualitative performance indicator is: Students' learning gain in reflective competences (according to relevant quality criteria to be identified, e.g. systemic thinking, forward thinking, critical thinking, self-perception competence) that could be assessed by (satisfaction) surveys of students, surveys of teaching staff and assessment reports by experts/peers other than students and teaching staff, including the lawful protection of the use of students' personalised data for Learning Analytics.
(e.g., monitor performance for comparative purposes; facilitate assessment of institutional operations), external QA (such as accreditations, audits, evaluations), accountability needs and reporting purposes (e.g., to the government, HEI council, broader public), and rankings/ratings.
It should be recognised, however, that PIs will usually only 'depict trends and uncover interesting questions', but 'they do not objectively provide explanations which reflect the complexity of higher education or permit conclusions to be drawn' (Chalmers 2008). PIs can thus be considered markers or pointers to more complex and multi-layered realities. Instead, 'multiple sources of both quantitative and qualitative information' are needed, and it is 'imperative that indicators should only be interpreted in light of contextual information concerning institutional operation and with the assumption and purpose for which the information is being used made explicit' (ibid.).
Suggesting a comprehensive31 set of qualitative and quantitative PIs neither implies that QA (comprising quality development) is reduced to a tick-box-like checking of pre-determined fixed features, nor that monitoring and checking PIs would exhaust the QA process. In particular, PIs are necessarily vague and fuzzy to a certain degree, in the sense that they cannot be completely precise with respect to conceptualisation and operationalisation and at the same time remain applicable to different HEIs. In other words, PIs must be interpreted and operationalised, and both procedures can usually be carried out in a variety of ways, depending on various possible adjustments to the context.
Even more than that, in general any list of PIs will be fallible in several ways: First, there is always the possibility that elements of the set are empirically inadequate. Second, there is always a tendency for the modelling to be markedly under-complex compared to the modelled entities and their dynamics. Third, PI model sets will usually be systematically incomplete, like any list of normative statements, because we cannot foresee all the individual cases.
About the Quality of Performance Indicators
Following Denise Chalmers, the measurement, monitoring and evaluation of L&T quality in HEIs should involve PIs
'which are significant in informing individual and institutional performance and, where feasible, also significant on a common national or sector-wide level. A useful PI is one that informs the development of strategic decision-making, resulting in measurable improvements to desired educational outcomes following implementation' (Chalmers 2008).32
In other words, as a rule PIs, if adequately applied, are core elements of (summative or formative) quality evaluation procedures. Therefore, the principles or methodological standards of evaluation can be applied to characterise proper PIs, i.e. their "fitness for purpose": PIs must be useful, appropriate, fair and precise (DeGEval, 2016):
- Usefulness
PIs should be useful, i.e. they should inform the user in a way that can improve decisions. To be useful, the different goals of PIs, i.e. the information and knowledge requirements of the users, must be clarified in advance. In addition, usefulness also depends on the competences and credibility of those using PIs in assessments and evaluations.
- Appropriateness
The procedures for obtaining data and information for PIs should be appropriate. As a rule, instead of being used in isolation, PIs must be used as a group, thus grasping the multi-facetedness and interconnectedness of performance issues.
31 'Comprehensiveness' also implies 'significance on a common sector-wide scale'.
32 Some may object that this is a huge requirement for PIs and would therefore drastically limit the number of possible PIs. On the other hand, it should be borne in mind that every contribution to strategic decisions, though possibly small, including improvement potential, must be taken into account. In particular, "strategies" are not solely "big strategies" but also comprise smaller ones, for example the strategy to improve the relevant library offers of specific requested literature.
The understanding of PIs should be holistic, i.e. PIs should provide data and information concern-
ing the L&T environment, teaching competences and processes, learning competences and pro-
cesses, as well as learning outcomes and learning gains including their assessment.
- Fairness
The collection of data and information for PIs should be planned and carried out in a way that pro-
tects the rights, safety and dignity of the persons involved.
- Precision
Survey methods and data sources should be selected in such a way that the reliability of the data
obtained and its validity in relation to answering the performance measurement questions are en-
sured according to professional standards. The technical standards should be based on the quality
criteria of empirical research. The sources of information and data used for PIs should be docu-
mented with adequate accuracy to assess the reliability and appropriateness of the information and
data.
About a More Integrative Approach to L&T
It can be argued that an integrative approach to L&T is required, one which takes the competence and learning outcomes orientation seriously while, at the same time, not overlooking other L&T domains. Moreover, it also entails that inputs, processes, outputs and outcomes of L&T are jointly considered (Chalmers 2008). Such an approach may be based on the following distinction of four L&T subdomains, which pragmatically differentiate the area of L&T into four (interlocking) sub-areas in order to facilitate the analytical compilation and presentation of the comprehensive PI set:
- Teaching competences and processes
This sub-area of L&T focuses on capturing the teaching processes, i.e. teachers' competences, skills and actions (which are, of course, related to learning processes).
- Learning competences and processes
This sub-area of L&T focuses on capturing the learning competences and actions of students
(which are, of course, related to teaching processes).
- Learning outcomes and learning gain and their assessment
This sub-area of L&T focuses on the outcomes and impact of L&T that are realised by the students
including the assessment processes for measuring these outcomes. The differentiation of this sub-
area is justified by the following three aspects: the sub-area addresses the main objective of L&T, it
has recently become the focus of quality assessment and it is particularly difficult to grasp.
- L&T environment
This sub-area of L&T comprises the framework conditions and inputs to L&T in institutional and or-
ganisational matters, staff and students etc.
These four constitutive domains should be taken into consideration to generate a comprehensive view of L&T quality issues, because the L&T quality of (higher) education is multi-causally determined by the quality of inputs (L&T environment) and processes (L&T competences and processes), and is characterised by the quality of outcomes (learning outcomes/gain and their assessment). As already mentioned, these four domains (and their PIs, see Tables 8-11 below) are usually not strictly separable from each other and should therefore always be considered jointly. This holistic perspective also emerges clearly from the Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG) (ENQA, 2015).
'The first associated task is to optimise the L&T environment (ENQA, 2015, Standard 1.6), while the second task consists in implementing the "shift from teaching to learning", which provides more active roles for learners and participatory approaches. This second task comprises: (1) a student-centred approach, whereby students and their learning processes are adequately considered as the core targets of improving quality in L&T (ENQA, 2015, Standard 1.3); (2) changed roles for teachers (ENQA, 2015, Standard 1.5), who improve their teaching competences to de-emphasise the traditional focus on instructing passive students and give stronger emphasis to the proper arrangement of learning environments and design of learning situations and learning advice; (3) the constructive alignment of L&T to learning objectives and outcomes and their effective assessment (ENQA, 2015, Standard 1.2), which also includes some alignment of academia (for example, objectives of academic and artistic qualifications) to society (for example, objectives of employability, citizenship and personality development); (4) the promotion of self-organised and active learning (for example, problem-based learning; research-oriented learning; lifelong learning); (5) the conjunction of knowledge acquisition and acquisition of learning strategies; (6) the consideration of motivational, volitional and social aspects of learning' (Leiber 2019a, p. 79).
A Comprehensive Set of Performance Indicators for Learning and Teaching: General Remarks
and Working Definitions
The tentatively comprehensive33 set of more than 80034 explicitly listed PIs for L&T, which is presented below in Tables 8-11, emerged from critical reflection on the pertinent literature and related sources, the most influential of which are the following: (Accreditation Council, 2013; Åkerlind, 2004; Bocconi et al., 2012; Chalmers, 2008; CHE, 2018; IUSE, 2018; Keshavarz, 2011; Krämer & Müller-Naevecke, 2014; Lodge & Bonsanquet, 2014; OECD-AHELO, 2013, pp. 41 ff., 54 ff.; Ramsden, 1991; Whiteley, 2016; Yarkova & Cherp, 2013; Yorke, 1991; Yorke, 1998; Zlatkin-Troitschanskaia et al., 2016). Other sources are explorative surveys, interviews and focus group discussions with stakeholders (students; teachers; quality managers; representatives of the leadership) of the six SQELT partner universities – University of Aveiro (Portugal); Birmingham City University (United Kingdom); Ghent University (Belgium); Jagiellonian University Kraków (Poland); Danube University Krems (Austria); University of Milan (Italy) – (SQELT, 2018) and of eight German public universities from the federal state of Baden-Wuerttemberg in the context of an INQAAHE Research Project (QUELIT, 2016). Underlying the approach and identification of PIs for L&T is a general theory of L&T which motivates and justifies at least some of the PIs.35
Accordingly, the study took an iterative approach, in the sense that the results of the main project development steps continuously informed the following project steps, analyses and syntheses. Methodologically, a qualitative and conceptual hierarchy-based analysis of textual (and linguistic) descriptions of performance areas, and especially performance indicators, at HEIs was used. The textual and linguistic descriptions came from scholarly literature, focus group discussions with members of project-external HEIs, and discussions among project members. Establishing the final comprehensive PI set (SQELT Intellectual Output O9) required a number of iterations (exemplified by SQELT Intellectual Outputs O4, O5 and O6), all of which either aimed at increasing the relevance and accuracy of the PIs or had a specific thematic focus (e.g. incorporating certain performance areas and types; including Learning Analytics; considering data ethics; reflecting dimensions of sustainability education).
The mentioned comprehensiveness of the proposed PI set deserves some explanatory comments: "Comprehensiveness" is not meant to denote "perfection" or "actual completeness" or similar. Rather, the comprehensiveness of the PI set refers to the signature that the list of PIs is presented as a set which is non-exhaustive and can always be expanded and made more precise. Furthermore, it should be acknowledged that PIs also require continuous further development, addressing something like an "evolution of PIs".
33 To be quite clear: "comprehensive" is not synonymous with "complete", "finished" or "perfect".
34 There is generally a certain degree of flexibility with regard to the exact specification of this number, which stems from the fact that the concrete description of a performance indicator includes pragmatic decisions about how detailed or "deep" a performance process and its results can or shall be analysed and then described as such. This interpretational depth of differentiation is differently pronounced for different PIs. For example, there is hardly any room for interpretation for the PI "Number of books per book title held in library per student population of subject fields and/or per study programmes" (see the PI tables below), while there is a great deal of room for interpretation for the PI "Students' learning gain in social competences (according to relevant quality criteria to be identified, e.g. team, communication and leadership competences; empathy; ability to cooperate; ability to solve conflicts) that could be assessed by (satisfaction) surveys of students, surveys of teaching staff and assessment reports by experts/peers other than students and teaching staff, including the lawful protection of the use of students' personalised data for Learning Analytics" (see Table 10).
35 First steps of a corresponding analysis were developed in (Leiber, 2019a), which will be followed up in the near future.
Among other things, this includes that any users of the proposed PI set, such
as HEIs, have to deal creatively with the set, in view of the potential fallibilities, complexity and institutional profile-driven preferences, and potentially further framework conditions. It is important to understand that the proposed "comprehensive PI set" is definitely not a PI set suggested for all HEIs, nor one suggested to be fully adopted by any particular HEI whatsoever. Rather, the proposed comprehensive PI set is meant to represent a broad and wide range of PIs from which any particular HEI will normally choose a subset relevant to its needs, such as its institutional and subject-mix profile, core development areas, technical challenges, capacity issues, challenges of strategic maturity, governmental expectations, etc.
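The recommended selection of an institution-specific subset from the comprehensive PI set can be sketched as a simple filtering step. The PI entries and the profile filter below are hypothetical examples; a real selection would draw on Tables 8-11 and the institution's profile.

```python
# A sketch of choosing an institution-specific subset from a comprehensive
# PI set; the PI entries and the chosen subdomains are illustrative only.
pi_set = [
    {"name": "Student-staff ratio",
     "subdomain": "L&T environment", "type": "quantitative"},
    {"name": "Learning gain in reflective competences",
     "subdomain": "Learning outcomes", "type": "qualitative"},
    {"name": "Use of student-centred teaching methods",
     "subdomain": "Teaching competences", "type": "qualitative"},
]

def select_pis(pis, subdomains):
    """Keep only PIs whose subdomain matches the institution's chosen profile."""
    return [pi for pi in pis if pi["subdomain"] in subdomains]

chosen = select_pis(pi_set, {"L&T environment", "Learning outcomes"})
print([pi["name"] for pi in chosen])
```

In practice the filter criteria would be richer (development areas, capacity, governmental expectations), but the principle is the same: the comprehensive set is a menu, not a mandate.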
General Comments on the SQELT PI Set
For clarification, a few additional general comments on the comprehensive SQELT PI set, as depicted in Tables 8-11, seem to be in order:
• These PIs are either qualitative or quantitative in nature; in particular, there is no epistemological and methodological reduction to exclusively quantitative PIs.
• The PIs have been developed in accordance with the ESG (ENQA, 2015), i.e. at least broadly, the ESG’s quality assurance areas and commitments are accounted for.36
• It should be noted that there is (almost) no single PI which is not open to interpretation. To give just a few examples: even counting the student population and the number of subject fields of an HEI can be a complicated and, in its details, contested task due to the lack of uncontested international standards and guidelines. A similar reasoning applies, for example, to counting teaching staff.
• Another source of possible dispute arises from the understanding of the authors of the SQELT comprehensive PI set that it is hard in practice to explicate every PI in all the details required for its concrete application under specific circumstances. In other words, for certain PIs the user has to give a reasonable interpretation to the PI as formulated in the PI set. For example, in the case of the PI “Satisfaction survey of students about quality of physical and virtual library services (according to relevant quality criteria)”, one of the main tasks left to the user is to identify the relevant quality criteria of physical and virtual library services. Further examples of this type can be found in the PI list in Tables 8-11.
• Many of the PIs listed below comprise several options, indicated by formulations such as “per HEI and/or per department/institute and/or per subject field and/or per study programme”. Any user of the PI can make choices from these options according to their own considerations and preferences.
• The linguistic description of (qualitative) PIs is caught in the tension between a very precise and detailed (and therefore more extensive) description on the one hand and a concise but therefore more abstract (or, some may call it, vague) description on the other. Often, how to find and keep the balance between these two extremes is not unambiguous or completely uncontested. Therefore, the authors of the SQELT PI set do not claim that it is perfect in its entirety. Some of the PIs presented could benefit from linguistic fine-tuning, and it will also be opportune and necessary for users of the set to (qualitatively) interpret the PIs suggested.
• PIs differ – as stated earlier in this report – in, for example, their level of preciseness, their level of “measurability” and their level of “proximity” to the real-world performance processes. It is not easy to achieve a clear consensus that all PIs in the present list(s) (Tables 8-11) achieve the same level of descriptive accuracy.
• Many of the suggested PIs may be further analysed, while some of them may also require deeper analysis and operational interpretation, such as spelling out in more detail the related assessment procedures, evaluation criteria37 and exact ways of calculating or “building” the PIs. Another aspect is to look more deeply into the pedagogical characteristics and technological options of relevant learning processes.
36 For some readers, it may seem worthwhile to check this accordance in more detail.
• It is legitimate to ask which of the PIs in the SQELT comprehensive PI set may be conceived as ‘generic’ or ‘standard’ or ‘of sector-wide relevance’ in higher education L&T. In view of the large scope of the SQELT comprehensive PI set, and the complexity of many of its PIs, this question probably has no simple answer valid for all HEIs. From the perspective of the SQELT project, it can be suggested that the comprehensive PI set may – in time – contribute to a system or sector consensus on a generic or standard set.
• At the same time, however, the hope of a conclusive, clarifying consensus within the sector (or sub-sectors) should not be overstated, because the empirical corroboration and justification of PIs for complex, multiple-hybrid social organisations such as HEIs is a rather challenging task. To give a real-life example: when HEI representatives are surveyed about which PIs they see as more or less important, and about whether PIs are applied in their institution – regularly, occasionally, or not at all – interviews, focus group discussions and other approaches show that, in general, it is not easy for HEI members to answer these questions. The relevance of certain PIs varies with subject fields (disciplines or sub-disciplines), institutional levels, profiles and development goals, and it is often difficult to maintain an overview of the corresponding QA activities, monitoring and performance measurements throughout the whole institution (i.e., its different faculties, departments, etc.). Therefore, it must be accepted that answers tend to be vague and fuzzy to some (probably often undetectable) extent.
37 For example, all complex PIs represented by “student satisfaction surveys about …” and “teaching staff satisfaction surveys about …” require specifications of the criteria to be surveyed, i.e. the items that are intended to be measured and monitored.
SQELT PIs for Teaching Competences and Processes
In Table 8, the SQELT project’s PIs that are mainly related to teaching competences and processes are listed, including their measures/performance measurement methods, where appropriate. To facilitate overview in a pragmatic way, the PIs of this area are ordered according to performance types and performance sub-types. This also makes it easier to check which performance types are covered by the listed PIs.
Table 8: Comprehensive set of PIs for L&T (“Performance Indicator Set(s)”): performance area of teaching competences and processes
Performance types | Performance sub-types | PIs and their measures/performance measurement methods
Teaching staff workload
Official teaching commitment in average hours per week per semester, trimester or year, per subject field38 and/or study programme
Teaching staff workload (according to relevant quality criteria to be identified, e.g. number
of teaching hours per semester week; number of courses) that could be assessed by satis-
faction surveys39 of relevant groups40 of teaching staff (e.g. of a subject field, study pro-
gramme)
Quality of teaching staff, teaching and teaching staff engagement
Teaching skills
Proportion of teaching staff who participated in pedagogical training (according to relevant
quality criteria to be identified, e.g. didactics of Transformative and Holistic Continuing Self-
Directed Learning (THCSDL)41)
Proportion of teaching staff who participated in support activities for their adaptation of
technology-enhanced L&T (e.g. e-learning, flipped classroom) (according to relevant qual-
ity criteria to be identified)
Proportion of teaching staff who participated in peer support systems for teaching staff (ac-
cording to relevant quality criteria to be identified)
Proportion of teaching staff who participated in teaching observation (according to relevant
quality criteria to be identified)
Teaching staff recruitment
Quality of teaching courses of recruitment candidates for teaching staff (according to rele-
vant quality criteria to be identified, e.g. didactics of Transformative and Holistic Continuing
Self-Directed Learning (THCSDL)) that could be assessed by (satisfaction) surveys of stu-
dents and teaching staff
Quality of recruitment procedures (according to relevant quality criteria to be identified, e.g.
procedural responsibilities; recruitment and selection process; recruitment quality criteria)
for lecturers/associate professors/full professors (according to relevant quality criteria to be
identified, e.g. teaching skills, pedagogic skills, research activities) that could be assessed
by (satisfaction) surveys of students42, surveys of teaching staff and assessment reports by
experts/peers43 (other than students and teaching staff)44 (SUSTEX)
Publications and presentations
Number and/or percentage of non-refereed publications during a specified period (e.g.
three years) per FTE (full-time-equivalent) member of teaching staff and/or per subject field
and/or per study programme
Number and/or percentage of refereed publications during a specified period (e.g. three
years) per FTE (full-time-equivalent) member of teaching staff and/or per subject field
and/or per study programme
Number and/or percentage of double-blind refereed publications during a specified period
(e.g. three years) per FTE (full-time-equivalent) member of teaching staff and/or per sub-
ject field and/or per study programme
38 Subject fields may be identified according to the classification in (UNESCO 2013) or any other appropriate classification.
39 In the following, the notion of survey generally comprises online and paper-and-pencil questionnaires with closed and open questions, structured interviews and focus group discussions; in other words, “survey” is not restricted or reduced to quantitative survey questionnaires.
40 The understanding that generally relevant groups should be selected for surveys and other data acquisition procedures applies wherever required throughout this PI set without being explicitly mentioned.
41 See e.g. (Du Toit-Brits 2018).
42 Comment: Some SQELT partners had/have concerns about the inclusion of students in the assessment of teaching staff recruitment. A counter-argument is that, in general, students should not be excluded from participation when it comes to teaching staff recruitment. Of course, such assessment must be organised adequately; for example, beginning students and students who are not engaged in HEI organisation cannot be expected to contribute well to such an assessment.
43 Here, as well as at similar places throughout this PI set, it is up to the user of the PI to decide in which form and context such assessment is carried out: for example, the assessment may be an integrated part of an accreditation, or it may be carried out as an individual evaluation of a study programme. Also, the user of the PI has the choice of who exactly these “experts/peers” may be, if any, who may be involved in addition to, or in replacement of, the before-mentioned stakeholders.
44 In the following and throughout this PI set, these three basic appropriate ways of assessment are abbreviated by the acronym
SUSTEX.
Number and/or percentage of non-refereed presentations at academic conferences during
a specified period (e.g. three years) per FTE (full-time-equivalent) member of teaching staff
and/or per subject field and/or per study programme
Number and/or percentage of refereed presentations at academic conferences during a
specified period (e.g. three years) per FTE (full-time-equivalent) member of teaching staff
and/or per subject field and/or per study programme
Number and/or percentage of double-blind refereed presentations at academic confer-
ences during a specified period (e.g. three years) per FTE (full-time-equivalent) member of
teaching staff and/or per subject field and/or per study programme
Teaching staff competences
Teaching staff’s subject-matter competences (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s methodological competences (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s vocational training competences (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s digital skills competences (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s social competences (e.g. team, communication and leadership competences) (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s respect and interest for students (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s encouraging of students’ autonomous, critical thinking and acting (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s didactics competences and pedagogical knowledge and skills (according to relevant quality criteria to be identified, e.g. didactics competences in Transformative and Holistic Continuing Self-Directed Learning (THCSDL) and knowledge of teaching models and learning processes) that could be assessed by satisfaction surveys of students
Teaching staff’s sensitivity to course level and progress (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s fostering of sustainability values (social, ecological, economic) (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s feedback to students (e.g. on work in progress, tests, completed assignments) (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Teaching staff’s expertise and competences in continuing education and life-long learning (according to relevant quality criteria to be identified) that could be assessed by satisfaction surveys of students
Quality of teaching courses (according to relevant quality criteria to be identified, e.g. embedding of courses in curriculum, meaningful structures, options for participation, imparting knowledge and skills, preparation of teacher) that could be assessed by students’ (satisfaction) surveys and/or teaching staff peer review and/or by participating observation of teaching staff
Academic content and structure of courses offered
Appropriateness of objectives of courses’ content (according to relevant quality criteria to be identified) that could be assessed by SUSTEX
Contemporaneity and timeliness of courses’ content (according to relevant quality criteria to be identified) that could be assessed by SUSTEX
Methods of course delivery, and the quality and quantity of the demands made of students
(according to relevant quality criteria to be identified) that could be assessed by SUSTEX
Compatibility of studies with working (according to relevant quality criteria to be identified)
that could be assessed by SUSTEX
Cutting-edge teaching
Use of current research in informing teaching and curricula content (according to relevant
quality criteria to be identified) that could be assessed by SUSTEX
Organisation of course sessions
Organisation of course sessions/flexible learning (exemplary quality criteria include flexibil-
ity in the requirements, time and location of study, teaching, assessment and certification)
that could be assessed by SUSTEX
Special teaching staff competences in medicine
Quality of bedside teaching (e.g. concerning mentoring, suitability of rooms and variety of diagnostic techniques applied) (according to relevant quality criteria to be identified) that could be assessed by patient surveys and/or SUSTEX and/or peer review and participating observation by teaching staff
Mutual integration of pre-clinical/theoretical and clinical/practical courses including experi-
ence with patient contact (according to relevant quality criteria to be identified) that could
be assessed by SUSTEX
Quality of skills labs and training centres (exemplary quality criteria include maintenance, accessibility, technical facilities, mentoring) that could be assessed by SUSTEX
Overall quality of the student experience of teaching
Overall quality of study programmes, courses and students’ experience of teaching (exemplary quality criteria include: structure of study programme based on the contemporary state of knowledge and research; quality and relevance of course requirements; teaching based on the contemporary state of knowledge and research; achievability of L&T goals) that could be assessed by satisfaction surveys of students
Contact with work environment
Internships/practical experience/work experience
Number and/or percentage of study programmes (Bachelor, Master, doctoral/PhD) that
have compulsory internships
Number and/or percentage of compulsory internships per study programme (Bachelor,
Master, doctoral/PhD)
Number and/or percentage of hours connected to the internships per study programme
(Bachelor, Master, doctoral/PhD)
Number and/or percentage of ECTS credits connected to the internships per study pro-
gramme (Bachelor, Master, doctoral/PhD)
Number and/or percentage of phases of practical experience and/or work experience
and/or external projects per study programme (Bachelor, Master, doctoral/PhD)
Inclusion of internships in the study programme curricula (Bachelor, Master, doctoral/PhD)
(according to relevant quality criteria to be identified) that could be assessed by SUSTEX
Inclusion of phases of practical experience and/or work experience and/or external projects
in the study programme curricula (Bachelor, Master, doctoral/PhD) (according to relevant
quality criteria to be identified) that could be assessed by SUSTEX
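Several of the quantitative PIs in Table 8 reduce to simple ratios over staff records, e.g. the proportion of teaching staff who participated in pedagogical training, or refereed publications over a specified period per FTE member of teaching staff. A minimal Python sketch with entirely hypothetical record fields:

```python
def proportion_trained(staff):
    """Proportion of teaching staff who participated in pedagogical
    training (one of the 'teaching skills' PIs in Table 8)."""
    trained = sum(1 for s in staff if s["pedagogical_training"])
    return trained / len(staff)

def publications_per_fte(staff):
    """Refereed publications during a specified period (e.g. three years)
    per FTE member of teaching staff, as in the publications PIs."""
    pubs = sum(s["refereed_publications"] for s in staff)
    fte = sum(s["fte"] for s in staff)
    return pubs / fte

# Hypothetical staff records for one study programme over a three-year period
staff = [
    {"fte": 1.0, "pedagogical_training": True,  "refereed_publications": 6},
    {"fte": 0.5, "pedagogical_training": False, "refereed_publications": 2},
    {"fte": 1.0, "pedagogical_training": True,  "refereed_publications": 4},
]
print(round(proportion_trained(staff), 2))    # → 0.67
print(round(publications_per_fte(staff), 2))  # → 4.8
```

The sketch also illustrates why such PIs remain open to interpretation: the result depends on how the denominator (headcount vs. FTE) and the reference period are defined, which is exactly the kind of choice left to the user of the PI set.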
SQELT PIs for Learning Competences and Processes
In Table 9, the SQELT project’s PIs that are mainly related to learning competences and processes are listed, including their measures/performance measurement methods, where appropriate. To facilitate overview in a pragmatic way, the PIs of this area are ordered according to performance types and performance sub-types. This also makes it easier to check which performance types are covered by the listed PIs.
Table 9: Comprehensive set of PIs for L&T (“Performance Indicator Set(s)”): performance area of learning competences and processes
Performance types | Performance sub-types | PIs and their measures/performance measurement methods
Quality learning and student engagement
Student workload
Student workload (according to relevant quality criteria to be identified, e.g. number of learning hours per semester week, number of courses) that could be assessed by SUSTEX45 and/or by Learning Analytics methodologies46, including, if required and lawfully protected (e.g. by the European General Data Protection Regulation (GDPR; EC 2016) or the SQELT Ethical Code of Practice for (Performance) Data Management (ECPDM 2020)), the students’ personalised data that are relevant to make use of the PI for Learning Analytics47 (PDRLA48)
Student interactions with learning content
Average duration per student interaction with course activities (e.g. solution of exercises, watching videos, listening to lectures, participation in working groups, etc.) that could be assessed by reports generated from Learning Management Systems (LMSs) and/or Learning Analytics tools49 per student and/or per study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Average duration per student interaction with course contents that could be assessed by reports generated from LMSs and/or Learning Analytics tools per student and/or per study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number of repetitive visits to learning contents (e.g. during online learning) that could be assessed by reports generated from LMSs and/or Learning Analytics tools per student and/or per study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Student motivation
6WXGHQWV¶dispositions, values and attitudes towards learning that could be assessed by
SUSTEX through collection of learner data and pedagogical descriptors (exemplary quality
criteria include learning-related emotions such as enjoyment, curiosity, frustration, or anxi-
HW\DQGWKHLULQWHUDFWLRQVVWXGHQWV¶DELOLW\LQGHDFWLYDWLQJQHJDWLYHOHDUQLQJHPRWLRQVVWX
GHQWV¶OHDUQLQJVWUDWHJLHV including WKHODZIXOSURWHFWLRQRIWKHXVHRIVWXGHQWV¶SHUVRQDO
ised data for Learning Analytics (PDRLA)
Student learning competences
Students’ competences with respect to learning and self-directed learning (SDL) (e.g. students’ knowledge and understanding of learning theories, own learning processes, problem-based learning, research-based learning, internships, online learning, mobile learning, blended learning) (according to relevant quality criteria to be identified) that could be assessed by SUSTEX, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Overall quality of learning experience
Overall quality of student learning experience (according to relevant quality criteria to be identified) that could be assessed by student satisfaction surveys, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
45 Student surveys may be based on, e.g., self-assessment, learning diary, think-aloud protocols.
46 Exemplary quality criteria include visualisation of student activity for promotion of self-regulated learning processes via Student Activity Meter; providing insight into individual and group interactions with the learning content via LOCO-Analyst.
47 ,QWKHIROORZLQJDQGWKURXJKRXWWKLV3,VHWWKLVFODXVHDERXWVWXGHQWV¶SHrsonal data protection is used in the abbreviated ver-
VLRQ³LQFOXGLQJWKHODZIXOSURWHFWLRQRIWKHXVHRIVWXGHQWV¶SHUVRQDOLVHGGDWDIRU/HDUQLQJ$QDO\WLFV´
48 PDRLA = Personalised data required for Learning Analytics.
49 Such as BlackBoard, Moodle, Desire2Learn (e.g. individual user tracking, course-based); social network analysis generated from Learning Analytics tools such as SNAPP (Social Networks Adapting Pedagogical Practice) (e.g. visualization of student relationships established through participation in LMS discussions); individual and group monitoring generated from Learning Analytics tools such as GLASS (Gradient’s Learning Analytics System) (e.g. visualization of student and group online event activity); discourse analysis generated from Learning Analytics tools such as COHERE (e.g. visualization of social and conceptual networks and connections).
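The LMS-based PIs above (average duration per student interaction, repeat visits) amount to simple aggregations over an event log. The following Python sketch assumes a hypothetical log schema of (pseudonymised student ID, activity, duration in minutes); real LMS exports differ, and pseudonymisation stands in here for the lawful protection of students’ personalised data (PDRLA) that the PI descriptions require.

```python
from collections import defaultdict

# Hypothetical LMS event log: (pseudonymised student ID, activity, minutes).
# Pseudonymised IDs are a stand-in for the PDRLA/GDPR-compliant handling
# required when such logs feed Learning Analytics.
events = [
    ("s-01", "exercise", 25), ("s-01", "video", 12), ("s-01", "video", 8),
    ("s-02", "exercise", 40), ("s-02", "video", 10),
]

def avg_interaction_duration(events):
    """Average duration (minutes) per interaction, per student - a simple
    reading of the 'average duration per student interaction' PI as it
    might appear in an LMS-generated report."""
    per_student = defaultdict(list)
    for student, _activity, minutes in events:
        per_student[student].append(minutes)
    return {s: sum(d) / len(d) for s, d in per_student.items()}

print(avg_interaction_duration(events))
# → {'s-01': 15.0, 's-02': 25.0}
```

The same grouping pattern yields the repeat-visit PI by counting events per content item instead of averaging durations.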
SQELT PIs for Learning Outcomes and Learning Gain and Their Assessment
In Table 10, the SQELT project’s PIs that are mainly related to learning outcomes and learning gain and their assessment are listed, including their measures/performance measurement methods, where appropriate. To facilitate overview in a pragmatic way, the PIs of this area are ordered according to performance types and performance sub-types. This also makes it easier to check which performance types are covered by the listed PIs.
Table 10: Comprehensive set of PIs for L&T (“Performance Indicator Set(s)”): performance area of learning outcomes and learning gain and their assessment
Performance types | Performance sub-types | PIs and their measures/performance measurement methods
Student success
Coursework performance
Personal student coursework grades and credit points earned, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Assessment/examination grades and credit points earned during the study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Percentage of credit points awarded in service-learning activities (e.g. students in community service activities and social work) in relation to the total number of credit points, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Structure and form
Grades of students’ final examinations of the study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of Bachelor’s degrees awarded per year per HEI and/or per subject field and/or department/institute and/or study programme
Number and/or percentage of Master’s degrees awarded per year per HEI and/or per subject field and/or department/institute and/or study programme
Number and/or percentage of long first degrees awarded per year per HEI and/or per sub-
ject field and/or department/institute and/or study programme
Number and/or percentage of doctoral/PhD (or equivalent) degrees awarded per year per
HEI and/or per subject field and/or department/institute and/or study programme
Number and/or percentage of doctoral/PhD (or equivalent) degrees awarded to interna-
tional doctorate/PhD candidates per year per HEI and/or per subject field and/or depart-
ment/institute and/or study programme
Percentage of final examinations (Bachelor/Master/PhD) conducted face-to-face or online
Completion of study units
Number and/or percentage of students who did not complete the programme modules they had started, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the first year of study, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the undergraduate programmes within the planned programme duration (Bachelor graduation on time), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the undergraduate programmes (Bachelor graduation), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the graduate programmes within the planned programme duration (Master graduation on time), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the graduate programmes (Master graduation), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete their long first degree (= more than four years) within the planned programme duration (long first-degree graduation on time), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete their long first degree (= more than four years) (long first-degree graduation), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the doctoral/PhD (or equivalent) programmes within the planned programme duration (postgraduate graduation on time), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Number and/or percentage of students who did not complete the doctoral/PhD (or equivalent) programmes (postgraduate graduation), including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)
Drop-out
Number and/or percentage of students who left their study programme per semester
and/or per year per HEI and/or per subject field and/or per department/institute and/or per
study programme
Number and/or percentage of students who intend to exit their study programme per year per HEI and/or per subject field and/or per department/institute and/or per study programme, including the lawful protection of the use of students’ personalised data for Learning Analytics (PDRLA)