
Organizational Learning. A Framework for Public Administration


Abstract

In order to face the challenge of effective organizational learning in our public policies, we need to address three pressing questions: 1. How does learning work in our public organizations? 2. What promising practices can we implement to advance learning in public organizations? 3. What changes in public management are required to combine learning with the growing demands of performance and accountability? This book is an attempt to address those questions in a systematic and empirical manner. The answers presented in this volume are the result of a four-year empirical research project conducted in Polish ministries and study visits in public institutions of twelve countries of the Organization for Economic Co-operation and Development.
Scholar Publishing House
62 Krakowskie Przedmieście Str.,
00-322 Warsaw, Poland
tel./fax 22 826 59 21, 22 828 95 63, 22 828 93 91
First edition
Typeset: WN Scholar (Jerzy Łazarski)
Printed by Wrocławska Drukarnia Naukowa PAN
Series: Ministerstwa Uczące Się
Series editor: Karol Olejniczak, PhD
Proofreading: Lucy Bindulska, Molly Huffman
Cover project: Katarzyna Juras
Cover illustration: ©
This publication is free of charge. It has been co-financed by the European Union,
European Social Fund. It is the result of the project titled "Ministerstwa Uczące Się (MUS) –
zestaw narzędzi diagnozy i wsparcia mechanizmów organizacyjnego uczenia się kluczowych
dla polityk publicznych opartych na dowodach" (Learning Ministries – a set of tools for
diagnosing and supporting the organizational learning mechanisms key to evidence-based
public policies).
The project was co-financed by the European Union and implemented within the framework
of the Operational Program "Human Capital", priority "Good Governance".
Copyright © Centre for European Regional and Local Studies, University of Warsaw
ISBN 978-83-7383-724-9
doi: 10.7366/9788373837249
List of Figures and Tables
Acknowledgements
Introduction
Karol Olejniczak
Why learning matters
What is organizational learning
The structure and value of this book
References
1 Discovering the learning mechanism
Karol Olejniczak, Jakub Rok, Łukasz Widła, Anna Domaradzka
1.1 Stage 1: Developing the theoretical framework
Analytical procedure and methods
Findings
1.2 Stage 2: Testing the framework in practice – a survey with ministry employees
Analytical procedure and methods
SEM – Structural Equation Modeling
Findings
1.3 Stage 3: Exploring learning in depth – interviews with leaders
Analytical procedure and methods
Findings
1.4 Conclusions – the organizational learning framework
1.5 References
2 Searching for inspiration. Practices from twelve countries
Stanisław Mazur, Adam Płoszaj, Karol Olejniczak
2.1 Examples from countries with a Weberian Model of public administration
France
Japan
Spain
Switzerland
2.2 Examples from countries with a Neo-Weberian Model
Norway
Sweden
2.3 Examples from countries with New Public Management
Australia
New Zealand
United Kingdom
2.4 Examples from countries with a Governance model
Canada
The Netherlands
The United States of America
2.5 Conclusions from the international comparison
2.6 References
3 Moving towards accountability for learning
Karol Olejniczak, Kathryn Newcomer
3.1 Accountability and performance
3.2 Changing the way we understand public interventions
3.3 Redefining failure and creating space for experimentation
3.4 Refocusing accountability to promote learning
3.5 Conclusions
3.6 References
Annexes
Annex 1. Survey items that measure the Mechanism of Organizational Learning
Annex 2. Interview protocol for interviews in Polish ministries
Annex 3. Interview protocol used during international study visits
List of Figures and Tables
Figure 1. Stages of the research process
Figure 2. Framework of organizational learning – version 1
Figure 3. SEM modeling stages
Figure 4. Framework of organizational learning – version 2
Figure 5. The mechanism of organizational learning
Figure 6. Relations between processes and determinants of organizational learning
Figure 7. A more realistic logic model for planning and evaluating public interventions
Figure 8. A spectrum of reasons for errors in implementing public sector interventions
Table 1. Overview of main literature strands
Table 2. Coding categories and coding results
Table 3. Results of the mixed-method analysis – sources of knowledge
Table 4. Results of the mixed-method analysis – characteristics of feedback
Table 5. The practical utility of the framework – a tool for monitoring organizational learning
Table 6. Processes of organizational learning
Table 7. Determinants of organizational learning
Table 8. Trajectories: Public Service Newsletter
Table 9. Coaching in public administration – a guide
Table 10. Database of hyari-hatto incidents
Table 11. Nemawashi and ringi decision-making process
Table 12. Knowledge Management 2.0
Table 13. SimpA practice
Table 14. Partnerforum
Table 15. Program of acquiring specialists for public administration
Table 16. Research committees
Table 17. Training focused mainly on the development of 'soft' skills and qualifications
Table 18. Talent management
Table 19. Mental models
Table 20. Performance improvement framework
Table 21. Growing leaders
Table 22. Developmental peer review
Table 23. Regulatory impact assessment – quality assurance mechanisms
Table 24. GCpedia
Table 25. Summaries of completed actions
Table 26. Laboratory of innovation
Table 27. Knowledge brokers
Table 28. Data-driven performance reviews
Table 29. Communities of practice
Table 30. Typology of action fields of organizational learning practices
Table 31. Common criteria used for assessing accountability in the public sector
Table 32. Contrasting expectations and behaviors associated with performance accountability and organizational learning
Table 33. Elements of the logic model
Table 34. Elements of the logic models – theories
Table 35. Transition from traditional performance accountability to accountability for learning
Acknowledgements
This book is the result of a four-year research project co-financed by the European
Union, European Social Fund. The project was conducted by the Centre for European
Regional and Local Studies – University of Warsaw (EUROREG) and the Malopolska
School of Public Administration, Cracow University of Economics.
Our core team consisted of (in alphabetical order): Robert Chrabąszcz, Anna Domaradzka,
Jowanka Jakubek-Lalik, Sylwia Kołdras, Bartosz Ledzion, Stanisław Mazur,
Michał Możdżeń, Karol Olejniczak (team leader), Adam Płoszaj, Jakub Rok, Dawid
Sześciło, Paweł Śliwowski, Łukasz Widła, Dominika Wojtowicz and Monika Wolska.
In our project we explored the nature and practice of organizational learning.
This book is itself a result of such a learning process. It would not have been possible
without the discussions, exchange of ideas and support given by members of different
institutions. We owe them all our sincere thanks.
Above all, we would like to thank the general directors of the ministries with
whom we began cooperation in 2010, and with whom we continued collaboration
despite institutional changes. We are grateful to Monika Dziadkowiec, Katarzyna
Szarkowska, Edyta Szostak and Magdalena Tarczewska-Szymańska, as well as Sławomir
Lewandowski, Adam Wojtas and Wojciech Kijowski. Their goodwill allowed
us to conduct a comprehensive diagnosis of the learning mechanisms in the Polish
administration, the first of its kind.
We would also like to thank all the departmental directors and staff of the Polish
ministries who participated in the research survey and in-depth interviews. In 2011,
we had the pleasure of studying the Ministry of the Interior and Administration,
the Ministry of Infrastructure, the Ministry of Regional Development and the
Ministry of the Environment, while in 2014, following numerous organizational
changes, a second diagnosis was performed in the Ministry of Administration and
Digitization, the Ministry of Infrastructure and Development and the Ministry of
the Environment. The information that we obtained allowed us to better understand
learning mechanisms and to create solutions appropriate to the Polish context. We are
also grateful to those people who helped us on a daily basis to coordinate our research
in the ministries: Ewa Dąbrowska, Elżbieta Kazaniecka and Andrzej Zbylut.
We especially appreciate our colleagues who supported us with their advice and
ideas throughout the whole process: Frans Leeuw, Lech Marcinkowski and Kathryn
Newcomer.
Our thanks go to all the staff of the public agencies and institutions in the
twelve countries that we visited. The experiences that they were willing to share
with us during interviews were an exceptional source of inspiration for innovations
introduced in Poland. We would especially like to recognize: John Butter, Katherine Dawes,
Robert Goldenkoff, Peter van Hoesel, Masahiro Horie, Masao Kikuchi, Peter van
der Knaap, Rick Kowalewski, Bill Maurer, Keiichi Muto, Wim van Nunspeet, Nancy
Potok, Paul Robben, Kimberly Vitelli, Jaap de Wit and Pieter Wouters.
Our sincere thanks to the directorate and staff of the eight Polish departments¹,
with whom we tested selected practices over the course of many months. We are
particularly grateful to: Dorota Błaszczyk, Joanna Gradowska, Dagmara Haba, Magdalena
Jabłonowska, Luiza Kaczmarek, Joanna Kossowska, Karolina Kulicka, Małgorzata
Kuźma, Jarosław Łuby, Alicja Ogonowska, Monika Pałasz, Małgorzata Stepaniuk,
Marek Śmietanko, Izabela Wereśniak-Masri, Grzegorz Ziomek and Izabela Żarczyńska.
We are grateful to them for giving us their time, for their active engagement in jointly
creating implementation scenarios and then implementing the changes. We would like
to thank them for creatively enriching our practices with their experience and ideas,
and for reflecting on the process of introducing improvements. We are confident that
our joint efforts will bear fruit in the daily work of their institutions. We believe that,
thanks to this close cooperation, we have been able to prepare a set of tools that will
provide essential support to other departments of Polish ministries as well as other
public institutions.
We would also like to express our thanks to the representatives of the Chancellery
of the Prime Minister, working as the Implementing Institution of our project.
We owe particular gratitude to the team led by Director Krzysztof Motyk – Marta
Lenart, Danuta Pietrzak and Artur Jopek. Their professional assistance allowed us to
implement the project efficiently, minimizing any risks that appeared.
Last but not least, we would like to thank the editing team of our publisher,
Wydawnictwo Naukowe Scholar, especially Łukasz Żebrowski, who supervised the
editing process of all Polish and English publications of our project, and Lucy Bindulska
and Molly Huffman, who edited this book.
On behalf of the Research Team
Karol Olejniczak and Stanisław Mazur
1 The Department of Navigation and the Department of Control of the Ministry of Transport,
Construction and Maritime Economy; the Department of Supra-regional Programs and the Human
Resources Management Office of the Ministry of Regional Development (currently all four departments
are contained within the Ministry of Infrastructure and Development); the Law Department and the
Department of Environmental Information of the Ministry of the Environment; the Department of
Public Administration and the Department of Information Society of the Ministry of Administration and
Digitization.
Introduction
Karol Olejniczak
The complexity of modern socio-economic issues turns our public policies into
a continuous trial-and-error process. Every time public managers and stakeholders
address a new policy issue, they have to use research results, experience and insight
to find out what works, for whom and in what context. This requires intense
organizational learning from public organizations. The challenge is especially demanding
for the administrations of countries that are undergoing modernization of their socio-
economic systems. Poland is an example of such a country.
In order to face the challenge of effective organizational learning, we need to
address three pressing questions:
1. How does learning work in our public organizations?
2. What promising practices can we implement to advance learning in public organizations?
3. What changes in public management are required to combine learning with the
growing demands of performance and accountability?
This book is an attempt to address those questions in a systematic and empirical
manner. The answers presented in this volume are the result of a four-year empirical
research project conducted in Polish ministries and study visits to public institutions
in twelve countries of the Organization for Economic Co-operation and Development
(OECD).
Why learning matters
Modern institutions of public administration face a number of challenges in the
process of developing and implementing public policies. First, since the majority of
modern socio-economic problems are complex and dynamic in nature, addressing
them effectively requires a flexible, multi-sector approach and therefore
the mobilization of different, broad coalitions of socio-economic actors. Public
organizations have to orchestrate this process, and they face a number of challenges
while running programs that are highly complex (Rogers, 2008).
Second, the development of science and modern technologies creates unprecedented
capacity for data collection, analysis and empirical research. The opportunity
for evidence-based public management (that is, using information for decision-
making) is clearly visible (Shillabeer et al., 2011). However, these developments have
also created information overload (Spira, 2011). Thus, public organizations face the
challenge of developing smart strategies to turn data into meaningful knowledge that
is useful in the public policy process (Hatry & Davies, 2011; Partnership for Public
Service).
Third, ascarcity of public financial resources urges governments to focus public
interventions on solutions that are effective. This leads to the challenge of identifying
and understanding the mechanisms behind successful interventions (Davies et al.,
2009). Public organizations have to look into the “black box” of intervention design
(Astbury, Leeuw, 2010) and learn the existing social or even behavioral mechanism of
change that underpins the effectiveness of regulations and public programs (Pawson,
2013; Sunstein, 2011).
Fourth, studies show that “mechanisms of change” are highly contextual (Pawson,
Tilley, 1997). They work for certain recipients, at certain times and under certain
conditions. This in turn means a need for permanent adaptation of developed
solutions through constant learning and responding.
All these challenges lead to a situation where every intervention can be treated as
an experiment, while policy-making becomes a “trial-and-error problem-solving
process” (Bardach, 2006, p. 350). In such conditions, the only effective strategy for
every public organization is continuous learning.
The issue of organizational learning seems even more pressing for the public
administration of countries that are undergoing transformation since, in addition to
the four issues described above, these administrations face a fifth challenge. They
are moving from a traditional, bureaucratic model of public administration to a new
paradigm of public management. This new approach is strengthened by the fact that
external aid programs (e.g. European Union funds) are driven by the logic of public
management, which requires both strategic planning and effective, development-
oriented use of resources. At this point, organizational learning becomes an important
asset to tap into while trying to make the best use of EU funds and live up to the new
standards of public policy.
What is organizational learning
The issue of using knowledge in organizations and building a competitive advantage
based on experience has been addressed by three different strands of literature:
Organizational Learning, the Learning Organization and Knowledge Management
(Easterby-Smith et al., 1999; Maier, 2007, pp. 19-93). A brief summary is presented in Table 1.
The first strand, "Organizational Learning", focuses on studying the learning processes
of and within organizations. Since its roots are in behavioral analysis, these
studies approach the learning phenomenon as a social process of interaction, information
flows and systems of feedback that gradually changes the mental models
(assumptions) shared by members of organizations (e.g., see Argyris & Schön, 1995;
Cyert & March, 1963).
The second strand, "the Learning Organization", concentrates on explaining
how to create and improve an organization so that it can reach its ideal: the capacity
to learn effectively, adapt, compete and prosper. These studies are normative in
nature, clearly assuming that there is a blueprint for a "learning organization" with
a set of characteristics that can be developed regardless of the sector or profile of an
organization's business (e.g., Pedler et al., 1997; Senge, 1990).
The third strand is "Knowledge Management". Its roots are in economics and
management; therefore, studies from this strand focus on "knowledge" as an asset,
a unique resource of competitive advantage (Nonaka & Takeuchi, 1995; Wiig, 1993).
They conceptualize the nature of knowledge and its different types, and explain how
knowledge is gathered, stored, shared and used to improve performance.
Although literature on organizational learning and knowledge management has
been steadily growing (Dierkes et al., 2001; Easterby-Smith & Lyles, 2011; Ma & Yu,
2010), it still has certain shortcomings, especially visible to public sector practitioners.
The majority of work has been devoted to private-sector organizations. These studies
offer limited insight into the organizational learning processes of public administration,
since management in public agencies differs from management in private firms
due to the distinctive nature of government (Hill & Lynn, 2008; Rainey & Chun, 2005).
As Easterby-Smith and Lyles point out (2011, p. 16), the literature still lacks empirical
research on actual learning processes. Examples of good empirical work focused on
learning in public agencies are limited (Lipshitz et al., 2007; Mahler & Casamayou,
2009; Moynihan & Landuyt, 2009). The literature is dominated by theoretical studies,
missing the practical considerations that public managers have to face every day in
their work on public policies.
On the one hand, the frameworks offered by academic literature are often complex
and difficult to operationalize and measure. On the other hand, the models offered
by consultants lack grounding in empirical evidence. This makes it challenging for
public managers to use them for organizational assessment and evidence-based
management.
Finally, all three strands tend to ignore the politics of power and control that
play out in organizations around knowledge resources, especially in
decision-making within bureaucratic organizations (Easterby-Smith & Lyles, 2011;
Grieves, 2008, p. 469; Prusak, 2001).
Initially, authors from the different literature strands tended to underline the
uniqueness of their approaches. However, in recent years, Organizational Learning and
Knowledge Management have begun to merge (Easterby-Smith & Lyles, 2011). For
practitioners – managers in public and private organizations – this divide has always
been quite artificial. Comparative research shows that public managers borrow ideas
and techniques from different strands without using, or even being aware of, the
different terminology applied in each strand (see chapter 3 of this book). Therefore,
in this book we take a pragmatic stand and derive information from all three strands
of research.
Building on the body of literature on organizational learning and knowledge
management, we propose the following sets of definitions. They will guide us
throughout the rest of this book.
We define ORGANIZATIONAL LEARNING as adaptation that is based on
the social process of reflection, which produces new insights, knowledge and associations
between past actions, the effectiveness of those actions, and future actions (Fiol & Lyles,
1985, p. 811; Lipshitz et al., 2007, p. 16).
Four aspects require special explanation (Argyris & Schön, 1995; Fiol & Lyles,
1985; March, 1991; Lipshitz et al., 2007):
• An adaptation can be reactive (responding to changes in the environment) or
proactive (taking initiative based on the analysis of observed trends).
• An adaptation can cover both incremental improvement (single-loop learning)
and substantial changes in the assumptions underlying policy interventions and
current organizational strategies, together with the exploration of new approaches
(double-loop learning, sometimes called "unlearning").
• An adaptation is based on evidence – mainly feedback about an organization's
performance (activities and their effects) – and the ability to reflect on that information.
• Reflection is a social process that involves teams who consciously and critically
review the relevance of the assumptions, objectives and routines shared by members
of the organization (so-called mental models).
We argue that in the process of learning, three TYPES OF KNOWLEDGE can be
produced and used (Alavi & Leidner, 2001; Ein-Dor, 2010; Ferry & Olejniczak, 2008;
Polanyi, 1966):
• contextual knowledge – knowledge about the context in which an organization
operates, its stakeholders, and trends in the given policy field;
• strategic knowledge – knowledge about the key objectives of the organization, its
mission and available resources;
• operational knowledge – know-how on procedures and effective processes.
These types of knowledge can be generalized and easy to codify (explicit
knowledge), or they can lie in the heads of personnel, rooted in experience and
context-specific (tacit knowledge).
Last but not least, we assume that organizational learning, as defined above, is
positively linked with performance. In other words, ORGANIZATIONAL LEARNING
IMPROVES THE PERFORMANCE of an organization, in both its strategic and
operational activities. This relation has been confirmed in the literature mentioned
earlier (Cavaleri & Seivert, 2005; Fugate et al., 2009; McNabb, 2007; Monavvarian
& Kasaei, 2007; Pee et al., 2010; Perez-Lopez et al., 2004; Wiig, 2002).
Table 1. Overview of main literature strands

Organizational Learning
Theoretical roots: Psychology; System thinking.
Focus: Studies the learning processes of and within organizations. Aims at description and understanding of the factors that influence learning. Studies are more theoretical and descriptive in nature.
Key motives in "classics" definitions: Multilevel nature of learning (people, teams, organizations); Feedback and feedback-loops; Relation with the environment; Value of experimentation and …; Change in cognition and ….
Classic works: (Argyris & Schon, …); (Cyert & March, 1963).
Top 5 quoted research articles, times cited in ISI: (March, 1991), 2985; (Powell et al., 1996), 1823; (Levitt & March, 1988), 1786; (Huber, 1991), 1714; (Brown & Duguid, 1991), 1554.

Learning Organization
Theoretical roots: System thinking; Management.
Focus: Focuses on an ideal type of organization that has the capacity to learn effectively, adapt, compete and prosper. Aims at understanding how to create and improve learning capacities. Studies are more practical and normative in nature.
Key motives in "classics" definitions: Positive role of learning; Ability to permanently change and adapt; Organization as a living …; Collective processes of learning.
Classic works: (Senge et al., …); (Pedler et al., 1997).
Top 5 quoted research articles, times cited in ISI: (Slater & Narver, 1995), 860; (Garvin, 1993), 569; (Kim, 1993), 303; (Simonin, 1997), 263; (Ferlie & Shortell, 2001), 238.

Knowledge Management
Theoretical roots: Economics; Management.
Focus: Sees knowledge as a unique resource of competitive advantage. Studies aim either at conceptualizing the nature of knowledge (more theoretical orientation) or at explaining how knowledge is gathered, stored, shared and used in improving performance (practical orientation).
Key motives in "classics" definitions: Knowledge as a resource; Types of knowledge (tacit vs. explicit); Stages of knowledge management: acquisition, distribution, application, storing; Strategies and tools of effective KM; Positive effects of knowledge management – innovation, competitive advantage.
Classic works: (Nonaka & Takeuchi, 1995); (Wiig, 1993).
Top 5 quoted research articles, times cited in ISI: (Alavi & Leidner, 2001), 1246; (Spender, 1996), 878; (Dyer & Nobeoka, 2000), 735; (Studer et al., 1998), 734; (Hansen et al., 1999), 733.

Source: Easterby-Smith & Lyles (2011, p. 3); Maier (2007, pp. 19-93); Örtenblad (2001); own review of definitions used in 75 key publications on KM, OL and LO; analysis of ISI database records (accessed January 2012).
The structure and value of this book
In our book we start by looking into public organizations to understand the
mechanism of organizational learning. We focus on the level of departments because they
are the basic organizational and functional structures of the ministries. In other words,
these are the places where the practical solutions of public intervention (regulations,
programs, etc.) are designed and executed. In Polish ministries, departments have
around 20 to 50 staff. They can be further divided into units – small teams of 5 to
10 people. Departments have clear and distinctive functions related either to policy
tasks (for example, within the Ministry of Infrastructure and Regional Development
these are the Department of Roads and Motorways, the Department of Competitiveness
and Innovation, and the Department for Spatial Development Policy) or to service
delivery within the Ministry (e.g. the Department of Human Resources, the IT
Department, the Legal Department).
In the first chapter of the book we address the question of how learning works in
the departments of ministries. In subsequent sections we present the stages of our
empirical discovery – from forming an initial theoretical model, through quantitative
verification, deepening with qualitative exploration, confronting Polish findings with
observations from other countries, to a final framework. In the conclusions of Chapter 1 we provide readers with a framework of organizational learning. This covers both the processes that constitute the learning cycle and the determinants that influence the effectiveness and quality of that process. Both researchers and practitioners will appreciate that our framework provides a clear synthesis in the form of a visual model combined with a robust means of measuring elements of the learning phenomenon in public organizations (with the use of tested survey questions). The presented framework can be used as a diagnostic tool for the monitoring and assessment of learning processes in public organizations. It also provides a map to visualize results and conduct data-driven discussion of an organization's condition.
In Chapter 2 we compare the public administrations of twelve countries: Australia,
Canada, Spain, Switzerland, France, Great Britain, Japan, the Netherlands, New
Zealand, Norway, Sweden and the United States. Based on interviews and a review of documents, we identify over 80 very practical techniques that could advance learning in public organizations. In the conclusions of the chapter we point out certain phenomena that are similar across different cultures and traditions. We also propose a typology that connects the identified practices with key elements of the organizational learning framework developed in Chapter 1. We hope that this chapter will be a valuable source of inspiration for civil servants across different countries.
In the final chapter of the book we look more broadly, beyond single departments,
at the context in which public organizations operate. We ponder what changes in
public management are required to promote learning. We focus on discussing
ways to overcome tension between organizational learning and narrowly defined
performance and accountability. The conclusions are directed to a wider audience of
both practitioners and researchers in the public sector. We hope that our arguments
will contribute to shifting current public sector philosophy towards “accountability
for learning”.
Alavi, M. & Leidner, D. (2001), “Knowledge management and knowledge management
systems: Conceptual foundations and research issues”, MIS Quarterly, 25(1), 107-136.
Argyris, C. & Schon, D.A. (1978), Organizational Learning: A Theory of Action Perspective,
Reading, MA: Addison-Wesley.
Argyris, C. & Schon, D.A. (1995), Organizational Learning II: Theory, Method, and Practice,
Reading, MA: FT Press.
Astbury, B. & Leeuw, F.L. (2010), “Unpacking black boxes: Mechanisms and theory building in
evaluation”, American Journal of Evaluation, 31(3), 363-381.
Bardach, E. (2006), “Policy dynamics”, in: Moran, M., Rein, M. & Goodin, R.E. (eds.), The
Oxford Handbook of Public Policy, Oxford, New York: Oxford University Press, pp. 336-366.
Brown, J. & Duguid, P. (1991), “Organizational learning and communities-of-practice: Toward
a unified view of working, learning and innovation”, Organization Science, 2(1), 40-57.
Cavaleri, S. & Seivert, S. (2005), Knowledge Leadership. The Art and Science of Knowledge-Based
Organization, Amsterdam, Boston: Elsevier.
Cyert, R. & March, J. (1963), Behavioral Theory of the Firm, Englewood Cliffs: Prentice-Hall.
Davies, H., Nutley, S. & Smith, P. (2009) (eds.), What Works? Evidence-based Policy and Practice
in Public Services, Bristol: The Policy Press.
Dierkes, M., Antal, A.B., Child, J. & Nonaka, I. (2001) (eds.), Handbook of Organizational
Learning and Knowledge, Oxford: Oxford University Press.
Dyer, J. & Nobeoka, K. (2000), “Creating and managing a high-performance knowledge-
sharing network: The Toyota case”, Strategic Management Journal, 21, 345-367.
Easterby-Smith, M. & Lyles, M. (2011), “The evolving field of organizational learning and
knowledge management”, in: Easterby-Smith, M. & Lyles, M. (eds.), Handbook of Organizational Learning and Knowledge Management, Chippenham: Wiley, pp. 1-20.
Easterby-Smith, M., Araujo, L. & Burgoyne, J.G. (1999), Organizational Learning and the
Learning Organization: Developments in Theory and Practice, London, Thousand Oaks,
New Delhi: Sage Publications Ltd.
Ein-Dor, P. (2010), “Taxonomies of knowledge”, in: Schwartz, D. & Te’eni, D. (eds.), Encyclopedia
of Knowledge Management, Second Edition (2 volumes), Hershey, New York: IGI Global,
pp. 1490-1500.
Ferlie, E. & Shortell, S. (2001), “Improving the quality of health care in the United Kingdom
and the United States: A framework for change”, Milbank Quarterly, 79(2), 281-315.
Ferry, M. & Olejniczak, K. (2008), The Use of Evaluation in the Management of EU Programmes
in Poland, Warsaw: Ernst & Young – Program “Sprawne Państwo”.
Fiol, M. & Lyles, M. (1985), “Organizational learning”, Academy of Management Review, 10(4),
Fugate, B.S., Stank, T.P. & Mentzer, J.T. (2009), “Linking improved knowledge management to
operational and organizational performance”, Journal of Operations Management, 27(3),
Garvin, D. (1993), “Building a learning organization”, Harvard Business Review, July-August,
Grieves, J. (2008), “Why we should abandon the idea of the learning organization”, The Learning
Organization, 15(6), 463-473.
Hatry, H. & Davies, E. (2011), AGuide to Data-Driven Performance Reviews, Washington D.C.:
IBM Center for The Business of Government.
Hill, C.J. & Lynn, L. (2008), Public Management: A Three-Dimensional Approach, Washington
D.C.: CQ Press.
Huber, G. (1991), “Organizational learning: The contributing processes and the literatures”,
Organization Science, 2(1), 88-115.
Kim, D. (1993), “The link between individual and organizational learning”, Sloan Management
Review, 35(1), 37-50.
Levitt, B. & March, J. (1988), “Organizational learning”, Annual Review of Sociology, 14, 319-
Lipshitz, R., Friedman, V.J. & Popper, M. (2007), Demystifying Organizational Learning,
Thousand Oaks: Sage Publications, Inc.
Ma, Z. & Yu, K.-H. (2010), “Research paradigms of contemporary knowledge management
studies: 1998-2007”, Journal of Knowledge Management, 14(2), 175-189.
Mahler, J.G. & Casamayou, M.H. (2009), Organizational Learning at NASA: The Challenger and
Columbia Accidents (Public Management and Change series), Washington, DC: Georgetown University Press.
Maier, R. (2007), Knowledge Management Systems. Information and Communication Technologies for Knowledge Management, Berlin: Springer.
March, J. (1991), “Exploration and exploitation in organizational learning”, Organization
Science, 2(1), 71-87.
McNabb, D.E. (2007), Knowledge Management in the Public Sector: A Blueprint for Innovation
in Government. New York: M.E. Sharpe.
Monavvarian, A. & Kasaei, M. (2007), “A KM model for public administration: The case of
Labour Ministry”, The Journal of Information and Knowledge Management Systems, 37(3),
Moynihan, D. & Landuyt, N. (2009), “How do public organizations learn? Bridging cultural
and structural perspectives”, Public Administration Review, 69(6), 1097-1105.
Nonaka, I. & Takeuchi, H. (1995), The Knowledge-Creating Company: How Japanese Companies
Create the Dynamics of Innovation, Oxford: Oxford University Press, USA.
Örtenblad, A. (2001), “On differences between organizational learning and learning organization”, The Learning Organization, 8(3), 125-133.
Partnership for Public Service (2011), From Data to Decisions. The Power of Analytics, Washington D.C.: IBM Center for The Business of Government.
Pawson, R. (2013), The Science of Evaluation: A Realist Manifesto, London: Sage Publications
Pawson, R. & Tilley, N. (1997), Realistic Evaluation, London: Sage Publications Ltd.
Pedler, M., Burgoyne, J. & Boydell, T. (1997), The Learning Company. A Strategy for Sustainable
Development (2nd edition), London: The McGraw Hill Company.
Peel, D. & Lloyd, G. (2008), “Re-generating learning in the public realm. Evidence-based policy
making and business improvement districts in the UK”, Public Policy and Administration,
23(2), 189-205.
Perez-Lopez, S., Peon, J. & Ordas, J. (2004), “Managing knowledge: the link between culture
and organizational learning”, Journal of Knowledge Management, 8(6), 93-104.
Polanyi, M. (1966), Tacit Dimension, New York: Doubleday & Co Inc.
Powell, W., Koput, K. & Smith-Doerr, L. (1996), “Interorganizational collaboration and the
locus of innovation: Networks of learning in biotechnology”, Administrative Science Quar-
terly, 41, 116-145.
Prusak, L. (2001), “Where did knowledge management come from?”, IBM Systems Journal,
40(4), 1002-1007.
Rainey, H.G. & Chun, Y.H. (2005), “Public and private management compared”, in: Ferlie, E.,
Lynn, L.E. & Pollitt, C. (eds.), The Oxford Handbook of Public Management, Oxford: Oxford
University Press, pp. 72-102.
Rogers, P. (2008), “Using programme theory to evaluate complicated and complex aspects of
interventions”, Evaluation, 14(1), 29-48.
Senge, P.M. (1990), The Fifth Discipline: The Art & Practice of The Learning Organization, New
York: Currency Doubleday.
Shillabeer, A., Buss, T.F. & Rousseau, D.M. (2011) (eds.), Evidence-Based Public Management:
Practices, Issues, and Prospects, Armonk, New York, London: M.E. Sharpe.
Simonin, B. (1997), “The importance of collaborative know-how: An empirical test of the
learning organization”, Academy of Management Journal, 40(5), 1150-1174.
Slater, S. & Narver, J. (1995), “Market orientation and the learning organization”, Journal of
Marketing, 59(July), 63-74.
Spender, J. (1996), “Making knowledge the basis of a dynamic theory of the firm”, Strategic
Management Journal, 17(Winter Special Issue), 45-62.
Spira, J.B. (2011), Overload! How Too Much Information is Hazardous to your Organization,
New Jersey: Wiley.
Studer, R., Benjamins, R. & Fensel, D. (1998), “Knowledge engineering: Principles and
methods”, Data & Knowledge Engineering, 25(1-2), 161-197.
Sunstein, C.R. (2011), “Empirically Informed Regulations”, The University of Chicago Law
Review, 78(4), 1349-1430.
Wiig, K.M. (1993), Knowledge Management Foundations: Thinking About Thinking – How
People and Organizations Represent, Create and Use Knowledge, Arlington: Schema Press.
Wiig, K. (2002), “Knowledge management in public administration”, Journal of Knowledge
Management, 6(3), 224-239.
1 Discovering the learning mechanism
Karol Olejniczak, Jakub Rok, Łukasz Widła, Anna Domaradzka
In this chapter we address the following question: How does learning work
in public organizations? In the course of the chapter we present the steps of our
empirical research that allowed us to gradually build and validate an organizational
learning framework. The final, validated version of the framework is offered in the
Conclusions of this chapter. It should help our reader understand what elements
form an organizational learning cycle, what factors influence its performance and
quality, and finally, how we can measure and monitor this phenomenon in our public organizations.
To answer the opening question we use a mixed-methods approach, both at the
level of research design and data analysis. As a research strategy we used a modification
of explanatory mixed-method design (a follow-up explanation model) (Creswell
& Clark, 2010, p. 72). Figure 1 illustrates our research process.
Figure 1. Stages of the research process
Source: own study.
[Figure content: (1) literature review – handbooks n=38, articles n=1016 – covering Organizational Learning, Learning Organization and Knowledge Management; (2) survey with staff of four Polish ministries (n=3394, response rate 51.3%), analyzed with SEM, factor analysis and linear regression; (3) interviews with heads of departments from four Polish ministries (n=71); (4) interviews in public institutions of 12 OECD countries.]
The structure of the chapter closely follows the sequence of our three analytical
stages, allowing us to show how adding new layers of data and different analytical
methods expanded our understanding and allowed us to develop a more comprehensive picture of the phenomenon of organizational learning.
In the next section we briefly present the theoretical framework of organizational
learning grounded in a literature review. Section two discusses the testing of the
theoretical framework using quantitative analysis of data from the survey with
ministry employees. In section three we expand our framework by adding qualitative
data drawn from two sources. We explore the perspective of the heads of studied
departments through in-depth interviews, and then we compare Polish specificity
with international practice, using qualitative data from study visits conducted in
12 OECD countries. Finally, in conclusion, we discuss the key findings and present
aframework for organizational learning in public administration.
But first, we need to understand why Poland constitutes a good subject for
public administration studies. This country can be seen as a European laboratory
of public intervention and modernization of public administration. During the last
25 years Poland has undergone a substantial systemic transformation from a socialist,
state-owned and centrally planned system to a dynamic market economy. Although
the system transformation has been almost completed (Morawski, 2010), Polish public
administration is still undergoing modernization. The strongest modernization impulse comes with European Union membership (Czaputowicz, 2008), mostly from
the implementation of EU-funded programs in the field of Regional Policy (Kozak, 2006).
During the last 10 years the Polish administration has been implementing the European
Union Cohesion Policy – a set of socio-economic development programs worth over
100 billion euro. In order to run EU-financed programs, a number of departments
in Polish ministries have had to adapt to a new set of skills and a new philosophy of
public management. At the same time, units not involved in EU programs work in
line with the traditional bureaucratic paradigm. This duality makes Polish ministries
an interesting case of administration under transformation. In our analysis we looked
for signs of this transformation in the field of organizational learning.
1.1 Stage 1: Developing the theoretical framework
The aim of the first stage of our research was to develop a theoretical framework
of organizational learning in public administration. For this purpose we conducted
an extensive literature review.
Analytical procedure and methods
Organizational learning constitutes a broad range of phenomena analyzed by
different strands of literature (see: Introduction). We performed an extensive literature search to pinpoint its driving characteristics for use in our framework. The
starting point for building a framework of organizational learning was a review of
handbooks and references in encyclopedias of management, public administration,
22 Karol Olejniczak, Jakub Rok, Łukasz Widła, Anna Domaradzka
governance, organization studies, knowledge management, organizational learning,
etc. (n = 38). This allowed us to get an overview of the field, identify classic literature
and avoid “citation amnesia” – a common shortcoming of bibliometric periodical
searches. What emerged from the overview were three main strands of literature:
Organizational Learning, Learning Organization, and Knowledge Management. We
further explored these three strands by applying a systematic review of the collection
of research articles in the Web of Science and SCOPUS databases. We focused our
search on empirical articles related to the public sector, published between 1990 and
2010. The result was a sample of 1016 documents. Based on a review of abstracts we
selected articles with clear empirical cases of both private and public administration
organizations (n = 252). To this sample we added 10 top-cited articles from each of
the three branches of literature (according to the Web of Science). This ensured that we
would not omit important sources in our analysis that were mostly theoretical in
nature. This analysis was supplemented by a review of 25 definitions from “classic”
publications in each field. For the content analysis we used MAXQDA software
and an initial coding strategy (Saldana, 2012, p. 100).
Based on the literature overview, for the purposes of our framework, we define
organizational knowledge as a result of the social process of verifying assumptions,
strategies and “theories in use” through interaction with an environment. This is
followed by reflection and adaptation. Here we follow the view of the majority of
authors from the organizational learning field (Argyris & Schon, 1995, pp. 3-30; Crossan et al., 1999; Levitt & March, 1988, p. 320; Lipshitz et al., 2007).
Further, we divide institutional learning in our framework into four basic elements:
knowledge, feedback, reflection and adaptation (or process of change). Apart from
learning processes, the framework includes a number of organizational learning
factors. These are the independent variables that can potentially have a significant
impact on the organizational learning process. A graphical version of the framework
is presented in Figure 2.
The starting point for an organizational learning framework is a taxonomy
of knowledge adapted from the knowledge management (KM) literature. We define
knowledge as “information in action”. Instead of distinguishing types according to the
form of knowledge (tacit vs. explicit), we make the distinction based on the content of
knowledge. The three types are (Alavi & Leidner, 2001, p. 113):
• Strategic knowledge – “knowing why we do things”: knowledge about the objectives
of the department, its mission and the effects expected from the department;
• Operational knowledge – “knowing how”: operational knowledge about tools and
procedures that allows us to act smoothly, on time and in accordance with regulations;
• Contextual knowledge – “knowing what/about”: knowledge about the environment
in which the department operates, understanding the trends, relations and causal
policy connections in the department’s field of expertise.
The second element in our framework is feedback. This is a central mechanism in
both the organizational learning (OL) and learning organization (LO) literature, as well
as in the latest approaches to knowledge management (KM). It allows an organization
to determine whether a particular activity or process worked or whether it should
be redefined (Sessa, London, 2006, p. 163). Based on the literature from psychology
and system thinking, we define feedback as any impulse that informs us about an
organization’s performance (Anderson & Johnson, 1997; Levy et al., 2006; Meadows,
2008). The psychology literature points to the fact that useful feedback should meet
four key criteria (Kluger & DeNisi, 1996). First, it is vital to acquire feedback from
diversified, external sources. Second, feedback should be collected on a regular basis.
Third, feedback formulated in a constructive and structured way is more useful.
Finally, positive feedback is considered more helpful than negative communication.
What follows feedback is a social process of reflection (Antal et al., 2001a, p. 5;
Ortenblad, 2001, p. 130). This takes the form of discussions, deliberation, and analysis.
Some authors refer to it as “inquiry” (Argyris & Schon, 1995), in which templates,
solutions and mental models used in particular organizations are tested and questioned (Fulmer & Keys, 2004).
Reflection can lead to eventual change in knowledge structure and volume. In
other words, it can change the mental models shared by members of the organization. This creates feedback-loops – a situation in which certain outputs of the system
(in this case the department’s activities) influence their environment and then inputs
from the environment are fed back into the system-organization (Bardach, 2006,
p. 339). The literature identifies three types of loops (also called types of adaptation or
orders of learning) (Antal et al., 2001b, p. 923; Argyris & Schon, 1995, pp. 27-30; Fiol
& Lyles, 1985):
• single-loop learning – a simple adjustment of actions, procedures and routines that
changes operational knowledge;
• double-loop learning – requiring in-depth inquiry that leads to substantial change
in the underlying assumptions, premises, values and key theories that were used
for a particular policy or action;
• deutero-learning – learning to learn, leading to adjustment in the sources and
structures used for information collection and analysis.
In our framework we distinguish a fourth type of loop, underlying the mission of
an organization. This is strategic loop learning (Bennet & Bennet, 2004, p. 442), which
leads to the adjustment of the main goals and the redefinition of departmental tasks.
The organizational learning factors were elaborated in a different way to the
learning processes. We took a more open approach and put forward only broad
groups of potential factors, instead of a list of detailed hypotheses. The clusters
included personnel, leaders, resources, organizational environment, and interactions
and relations. The reason for taking this approach was twofold. First, the literature
we reviewed described the context of different countries, and mostly – private
organizations. We assumed that the character of causal relations might be significantly
different in the case of the Polish public administration. Second, we wanted to
[Figure content: determinants of learning (independent factors) – organizational culture, management style, money and infrastructure, quality of law, stakeholders, breadth and intensity of networks – linked to processes of learning (dependent factors), including reflection.]
Figure 2. Framework of organizational learning – version 1
Source: own study.
keep amaximum level of openness, in order to take account of factors that are not
sufficiently explored in the international literature.
Each element of the framework was transformed into a set of survey questions,
inspired by earlier survey tools presented in the literature (Marsick & Watkins, 1999;
Perez-Lopez et al., 2004; Preskill & Torres, 1999). However, we adapted some of the
questions to the specific structure and characteristics of Polish ministries.
The framework presented above attempts to combine a cyclical approach (loops
of learning) and a linear approach (relations between the organizational learning
factors and processes of learning). Thus, it takes into account the cyclical nature of
organizational functioning, while simultaneously providing a starting point for practical strategies of organizational change by identifying cause-effect relations.
1.2 Stage 2: Testing the framework in practice –
a survey with ministry employees
The aim of this stage of our research was to empirically test the theoretical framework using quantitative data analysis. In other words, we wanted to verify whether
the theory rooted in the literature would prove its validity in practice.
Analytical procedure and methods
The source of data was a Computer-Assisted Web Interview (CAWI), conducted in
the period from March 7th to April 4th, 2011 among all employees (with the exception
of heads of departments) of four Polish ministries involved in the project: the Ministry
of Infrastructure, the Ministry of the Interior and Administration, the Ministry of
the Environment and the Ministry for Regional Development.2 The sample examined
consisted of all 3394 ministry employees and the rate of return of the questionnaire
was 51.3% (1741 respondents).
The quantitative tool – the CAWI questionnaire – was structured so that individual
questions were clustered into groups that constitute the broader dimensions, that
is, our analytical categories (see: Annex 1).3 Some of these were based on questions
taken from earlier studies on knowledge management in organizations and thus, as
such, they were verified within other research projects. Other questions were created
in consultation with practitioners and theoreticians of the Polish governmental
administration system. At the development stage of the questionnaire, we made sure
that most (about 90%) of the questions would have a coherent, five-point Likert scale.
The overall coherence of the questionnaire was verified in several ways. First of all,
we checked its face validity through discussion with project stakeholders. Then we
conducted pilot research that allowed us to collect feedback from the interviewees.
2 Ministries were selected as representations of different organizational and functional solutions
present in the Polish administrative system.
3 Annex 1 presents the questions from the CAWI questionnaire that were used to measure particular
analytical categories. Items are clustered into the categories according to the final version of the organizational learning framework.
Pilot data was analyzed with Cronbach’s alpha to make sure that the questionnaire
was coherent. The test results were very high – on average, the component scales
reached an alpha of 0.96.
In our research, we took advantage of both types of factor analysis: first, we attempted to recreate the assumed constructs (confirmatory analysis); then, if the first
approach failed, we approached the matter from an exploratory point of view and attempted
to identify new factors. When we had constructed new factors, we reverted back to
confirmatory analyses to see how these ‘new’ factors impacted one another.
In terms of the learning processes, the framework assumed the structure discussed
in the previous section (see: Figure 2). It anticipated three types of knowledge
(operational, strategic, contextual), a feedback stage, reflection and five types of
reactions (no reaction, double-loop learning, strategic learning, single-loop learning
and deutero-learning). These feedback-loop components were expected to exert an
impact on the state of the types of knowledge, and an indirect impact upon one another. Determinants
of organizational learning were also derived from a literature review, and consisted of
a broad set of phenomena related to intra- and inter-organizational characteristics.
SEM – Structural Equation Modeling
Prior to the commencement of modeling, the survey data was preprocessed. Namely,
the ‘blank’ answers and missing data were replaced with the average for a given variable.
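A minimal sketch of this kind of column-mean imputation, with a hypothetical two-item frame standing in for the real questionnaire data:

```python
import numpy as np
import pandas as pd

# Hypothetical Likert responses (1-5); NaN marks blank or missing answers
responses = pd.DataFrame({
    "q1": [4, 5, np.nan, 3],
    "q2": [2, np.nan, 3, 3],
})

# Replace each missing value with the average of its own variable (column)
imputed = responses.fillna(responses.mean())
```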
The next step was to construct the model coefficients themselves. According to
the information obtained at the pilot stage, not all factors that had their equivalents
in the first framework were reflected in the data. Initially we attempted to recover
these elements by building the original framework. However, it turned out that
most elements had not been built in the expected manner.4 Their factor loadings
were incoherent (some were very high, others – very low) or negative. Therefore,
we focused on factor analyses that would allow us to obtain the empirical constructs
reflected by the data.
For this purpose, we used factor analysis of the principal components with
orthogonal Equamax rotation. This preliminary analysis was aimed at checking
whether the data would group into other elements than those pre-determined within
the constructed theoretical framework. In this manner, we obtained ten factors –
components of the organizational learning process, which only partially matched
the elements from the theoretical framework (for instance, the knowledge-building
factors); others were entirely new constructs. The analysis consisted of two stages: the
first stage was the factor analysis that pertained to all components of the organizational
learning process, and this resulted in the determination of the ten factors. The second
stage consisted of the identification of explanatory factors. In the case of the latter, the
4 And, to be exact, that is why we conducted the pilot stage – we expected that our variables would settle
into consistent factors, and they finally did, although in the end we received different factors than we expected.
procedure was very similar to the identification of the process components; however,
this time, the analysis was performed for each focus area individually: separately for
groups of variables pertaining to different categories such as personnel, resources,
etc. As a result, a total of 26 explanatory factors regarding the learning process was identified.
On the basis of these factors, new structural models were built. We used confirmatory factor analysis to redefine the factors present in the data at the SEM level.
At this stage of analysis, we were not interested in correlations between individual
components of the organizational learning process, therefore we applied orthogonal
rotation, at the modeling level5, to de-correlate the individual factors. This in turn
allowed for the construction of partial models, containing, for instance, only the
knowledge- or adaptation-building factors.
In this way, we obtained alink between individual components of the learning
process and the determinants of this process. Thus, our analysis uncovered another
level of 26 factors which had indirect influence on the learning process and which we
describe as ‘determinants of the learning process’. Our overall approach is presented
in the figure below.
[Figure content: latent factors A and B linked to factor C – the explained learning process.]
Figure 3. SEM modeling stages
Source: own study.
Of these 26 factors (determinants), only a few appeared to be important for further
analysis. To determine which factors had significant explanatory power, we correlated
factors from the determinant side with elements of the learning process to see which
5 Orthogonal rotations make it possible to obtain uncorrelated factors. The advantage of this approach is the possibility to treat factors as unrelated.
of the determinants actually interacted with the core of our framework. Out of the
26 factors, only 7 were correlated relatively strongly (R2 > 30%), and these factors
passed on to further statistical analysis.
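The screening step can be illustrated with a simple correlation filter: determinants whose squared correlation with a learning-process factor exceeds 30% are retained. The threshold follows the description in the text, but the data below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
process = rng.normal(size=200)          # score on one learning-process factor
determinants = rng.normal(size=(200, 6))
determinants[:, 0] += 1.5 * process     # only determinant 0 is truly related

# Keep determinants whose squared correlation (R2) with the process exceeds 30%
r = np.array([np.corrcoef(process, determinants[:, j])[0, 1]
              for j in range(determinants.shape[1])])
selected = np.flatnonzero(r ** 2 > 0.30)
```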
To summarize, the quantitative analysis was conducted in several stages:
• analysis of missing data;
• factor analysis (exploratory) for learning processes;
• factor analyses (exploratory) for determinants of the learning process;
• creation of factors in the database on the basis of SEM analysis;
• analysis of average values of factors for individual departments in the context of
results obtained for individual ministries.
The structural models we created described well the common reality in the examined
fragment of the Polish public administration system. However, our objective was
not only to diagnose the processes responsible for learning, but also – and most
importantly – to verify the existence of individual processes in the specific ministries
and departments.
Using the CAWI method enabled us to maximize the number of respondents
who participated in the survey. The sample obtained was large enough to allow for
complex quantitative analysis leading to the building of an organizational learning
framework. The next section summarizes our findings at this stage.
As Figure 4 shows, our two-level factor analysis resulted in defining 10 dimensions
of learning. We examined the questions hidden behind each dimension and came up
with the four main issues constituting the learning process: reflection mechanisms,
knowledge base, adaptation processes and existing impulses.
Figure 4. Framework of organizational learning – version 2
Source: own study.
Discovering the learning mechanism
Compared to our initial, theoretical framework, the factors obtained referred mainly to states of reality;6 only some of them had a processual character. Moreover, the framework based on quantitative data failed to confirm the existence of the feedback mechanisms described in the literature. In-depth analysis proved that the theoretical factors associated with feedback could not be recreated from the gathered data. However, another, more general source of knowledge emerged – impulses. These included two types of impulses: "analyses and expert opinions" and "conferences and training".
Constructing anew framework was aimed not only at unveiling the processes of
learning in Polish ministries, but also at exploring the potential determinants of these
processes so that they constituted acoherent part of the framework. The analytical
procedure described above resulted in defining 7 organizational learning factors, i.e.:
mutual support, group cohesion, psychological safety, democratic leadership style
both at the level of heads of departments and heads of units, availability of analyses
and information, and quality of expertise.
To sum up, the quantitative analysis described above resulted in major changes to our theoretical framework. All four learning loops and feedback disappeared, and the complex cycle of learning was replaced with a static picture comprising 10 dimensions of organizational learning, grouped into 4 main categories, i.e. impulses, reflection, knowledge and adaptation. The first determinants of learning were established, emphasizing the characteristics of teams and leadership style.
The findings regarding the learning processes triggered the following questions. First, why was the picture of the learning cycle we obtained from quantitative analysis more static than processual, even in the area of impulses? Second, why didn't feedback mechanisms appear as a practice of obtaining knowledge in the examined ministries? In the case of both questions we supposed that this might be the result of some integral characteristic of Polish administrative institutions. Answering these questions called for the use of different methods that would allow us to verify the reasons for the mismatch between the theoretical framework and the quantitative results. The next stage, therefore, was to use qualitative data to verify and deepen our analysis.
1.3 Stage 3: Exploring learning in-depth – interviews with leaders
The overall aim of this stage of our analysis was to enrich the framework that emerged from the quantitative data gathered among ministry employees with the perspective of public administration leaders, both from Poland and from 12 OECD countries.
We began with interviews with the heads of the studied departments in Poland. In particular, we wanted to verify two main issues. First, what day-to-day practices are hidden behind the static, rather than processual, picture that emerged from our data? Second, is feedback gathering, as a practice of obtaining knowledge, really as rare as the quantitative analysis indicated?
6 Even if we take into account that some of our factors described processes (e.g. Adaptation or Reflection), we still only received static information about states rather than processes. Further analyses were designed to show the impact of individual factors emerging from the determinant side, allowing recognition of the process in terms of cause-and-effect analysis and of the influence of each factor on the elements of learning.
Next, we confronted the Polish situation with the first-hand experience of civil
servants in selected OECD countries. We broadened the picture obtained in this way
by interviewing academics specializing in public management in a given country. We
focused on checking if the reality of foreign administration was coherent with the
literature (occurrence of feedback and structured practices of organizational learning)
and therefore different from what we had observed in Poland. We were also looking
for particular practices supporting organizational learning (see: next chapter).
Analytical procedure and methods
The qualitative data collected in Poland consisted of 71 transcripts and notes from interviews with the heads of all the departments in the four ministries. Interviews were conducted using a structured interview protocol (see: Annex 2) over a period of two months in 2011.7 In order to address the questions presented at the beginning of this section, we used coding and an analytical procedure that consisted of six steps.
First, for each interview we applied an attribute coding that included: (1) The type
of department (Internal service provider vs. Merit – policy department) and (2) the
department’s relation to EU policy (Management of EU funds vs. National issues).
In the second step, two researchers performed random selective coding to develop a detailed coding list. For this purpose we used a combination of two coding strategies: structural coding with process coding (Saldana, 2012). Our starting list of phrases was very general and followed our initial division into three types of knowledge plus feedback (which could overlap with the types of knowledge). These were: (1) How do they obtain strategic knowledge? (2) How do they obtain operational knowledge? (3) How do they obtain contextual knowledge? (4) Which process is a feedback mechanism? Process coding uses gerunds to connote action in the data; it reveals routine actions that form wider tactics and strategies. This coding fitted well with the description of knowledge as a process. Moreover, it allowed us to focus our search on the possible dynamics that were missing from the quantitative analysis.
In the third step, each coder moved to the second coding cycle for the pilot data, in order to come up with more summative groupings. We applied pattern coding (Saldana, 2012, p. 209) in a search for repeated activities and similarities.
In step four, we built inter-coder agreement. The coding pattern of one overlapping interview was compared between the two coders; coherence was very high. Differences in coding were discussed and joint definitions were clarified. This procedure allowed us to increase the reliability of the research. At this stage we also decided to introduce code categories that would allow us to explore the characteristics and quality of learning practices, i.e. their structure, regularity, positive or negative character, and utility from the user perspective. The final list is presented in Table 2.
7 The interview scenario was constructed on the basis of the literature review. Interviews were conducted by members of the research team who participated in the development of the theoretical model as well as the survey and interview scenarios. The average length of an interview was 45 minutes.
In the fifth step, two researchers coded the whole data set (71 interviews), using the list of categories that emerged from the pilot coding. Again we combined two types of coding – this time provisional coding with magnitude coding. Provisional coding allows a "start list" of codes established prior to fieldwork and generated in the preliminary investigation (Saldana, 2012, p. 144). It focuses the inquiry and at the same time allows flexibility, because the list can be modified during the research. Magnitude coding allows assigning an intensity or frequency to particular phenomena (Saldana, 2012, p. 72). By applying this technique we were able to evaluate the extent to which each practice is structured (that is, regular and organized into procedures and routines). Each fragment of an interview was also coded with multiple codes (so-called simultaneous coding), e.g. type of knowledge, regularity, and source.
Table 2. Coding categories and coding results

Code | Definition | Number of coded segments
knowledge | knowing why | 399
knowledge | knowing how | 284
knowledge | knowing what/about | 249
feedback | An impulse acquired from an external source that provides an evaluative response to action undertaken by the recipient | 358
source | Sources of information acquired by a department; 16 sub-codes, including "other" | 1017
regularity | The regularity of obtaining knowledge from a given source; 3 sub-codes: high, medium and low | 457
positive or negative | The positive or negative character of given feedback; binary code – 2 sub-codes | 117
structured | A formalized and/or systematic process of acquiring knowledge from a given source of information; binary code – 2 sub-codes | 343
perceived utility | An explicitly stated opinion on the usefulness of a given source of information; 3 sub-codes: high, medium and low | 245

Source: own study.
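The simultaneous-coding logic described above can be sketched with a minimal data structure: each coded segment carries several codes at once, and tallies like the "Number of coded segments" column in Table 2 fall out of simple counting. The example segments below are hypothetical, not drawn from the project's transcripts.

```python
from collections import Counter

# Hypothetical coded interview segments; each segment carries multiple
# codes at once (simultaneous coding): knowledge type, source, regularity,
# and whether the acquisition practice is structured.
segments = [
    {"knowledge": "knowing why", "source": "heads of the ministry",
     "regularity": "low", "structured": False},
    {"knowledge": "knowing how", "source": "training",
     "regularity": "high", "structured": True},
    {"knowledge": "knowing how", "source": "external control",
     "regularity": "medium", "structured": True},
    {"knowledge": "knowing what/about", "source": "media",
     "regularity": "low", "structured": False},
]

# Tallies per code value, analogous to the counts reported in Table 2
knowledge_counts = Counter(s["knowledge"] for s in segments)
structured_share = sum(s["structured"] for s in segments) / len(segments)

print(knowledge_counts.most_common())
print(f"structured: {structured_share:.0%}")
```

Because every segment is tagged on several dimensions at once, the same four segments can later be cross-tabulated (e.g. structure by knowledge type) without recoding.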
In the final step, we applied a mixed-methods approach in order to draw quantified results from the qualitative data. We assessed the main features of the knowledge acquisition practices used across the entire sample, and broke down the results by two types of departments, i.e. those dealing directly with EU funds and the rest. However, quantitative analysis and interpretation of qualitative data has its limitations. The data were derived from structured questionnaires aimed at exploring key learning processes, and thus provided only partial information on the absolute frequency of a given phenomenon. Moreover, the narrative could be fragmented, with a given issue surfacing in several places over the course of an interview. To address these limitations we focused on relative values, e.g. comparing the performance of the two types of departments and using the code relations browser.
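The shift from absolute to relative values can be sketched as follows: instead of comparing raw segment counts (which depend on how fragmented each narrative was), we compare the share of each knowledge source within each department type. The segments and department labels are hypothetical illustrations.

```python
from collections import Counter, defaultdict

# Hypothetical coded segments, each tagged with the department's attribute
# code from step one (EU-funds department vs. the rest) and a source code.
segments = [
    ("EU-funds", "external audit"), ("EU-funds", "system of indicators"),
    ("EU-funds", "external audit"), ("other", "heads of the ministry"),
    ("other", "heads of the ministry"), ("other", "training"),
]

# Count source mentions within each department type
by_type = defaultdict(Counter)
for dept_type, source in segments:
    by_type[dept_type][source] += 1

# Convert counts to within-group shares, so the two department types
# can be compared despite unequal numbers of coded segments
shares = {
    dept_type: {src: n / sum(counts.values()) for src, n in counts.items()}
    for dept_type, counts in by_type.items()
}
print(shares["EU-funds"]["external audit"])  # 2 of 3 EU-funds segments
```

Normalizing within each group is what makes a statement like "external audit dominates in EU-funds departments" robust to one interviewee simply talking more than another.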
The second data set consisted of 114 transcripts and notes from in-depth interviews conducted during the study visits in 12 OECD countries.8 In each country a study visit comprised three inter-related parts: on-the-spot interviews, followed by desk research and a literature review on practices of organizational learning and knowledge management. We conducted semi-structured interviews with central governmental managers and academic experts to (1) establish the state of the art of organizational learning and knowledge management in each country, and (2) identify promising practices of organizational learning. Interview transcripts and relevant documents identified by our interviewees were analyzed with MAXQDA software, using a basic structural coding system (Saldana, 2012, pp. 84–87).
It should be noted that this part of our research did not aspire to be a systematic overview of OECD countries. Rather, it was designed as a set of national exploratory case studies. We were interested more in getting an idea of the range of existing solutions than in a review of practices in each organization. We focused mainly on identifying examples of day-to-day practices for (1) obtaining knowledge, (2) getting feedback, and (3) storing knowledge.
Applying the above-described procedures led us to a number of observations. First, we describe the emerging picture of organizational learning in Polish ministries. Then we move on to report the key observations from the study visits, which influenced the final version of the organizational learning framework.
We discovered that, in the case of Polish ministries, sources of knowledge are located mostly inside the administration, often inside the given institution (see: Table 3). The main channel for obtaining strategic knowledge is the heads of the ministry. Operational knowledge is drawn predominantly from training sessions and various control/audit activities. Mechanisms for acquiring contextual knowledge seem to be generally less frequent, with expert analyses and contacts with other units of public administration being the most common.
Mapping sources of feedback revealed a similar pattern. The majority of impulses obtained come from inside the public administration system, with heads of ministry and external control activities being the main sources. Typical external sources, i.e. stakeholders and clients, are responsible for only 12% of collected feedback. More than 70% of recorded feedback falls into the category of strategic knowledge, and a further quarter concerns operational knowledge.
8 The methodology of this step is described in detail in chapter 3 of this book.
Table 3. Results of the mixed-method analysis – sources of knowledge

Sources | Strategic knowledge | Operational knowledge | Contextual knowledge | Feedback
heads of the ministry | 37% | 2% | 8% | 20%
contacts within the ministry | 7% | 1% | 8% | 7%
contacts within public administration | 7% | 4% | 16% | 6%
recipients/clients | 6% | 1% | 4% | 7%
stakeholders | 5% | 0% | 6% | 5%
system of indicators | 9% | 1% | 1% | 9%
internal audit | 0% | 12% | 0% | 9%
external audit | 2% | 6% | 0% | 5%
external control | 4% | 14% | 1% | 14%
expert analyses and research | 3% | 6% | 19% | 3%
guidelines | 3% | 4% | 3% | 0%
internet | 0% | 3% | 4% | 0%
media | 2% | 1% | 6% | 3%
training | 0% | 20% | 5% | 0%
own experience and practice | 4% | 12% | 1% | 2%
other | 10% | 13% | 17% | 9%

Source: own study.
The regularity of feedback inflow is poor (see: Table 4), with almost half of the observed feedback falling into the low-regularity category. Systems of indicators were by far the most regular source of feedback, while impulses obtained from within the ministry were mostly of an incidental and ad hoc nature.
Structured feedback is rather rare, occurring in only 38% of the analyzed cases, and in less than a third of cases involving impulses regarding strategic knowledge. Systems of indicators, expert analyses and external controls tend to provide structured feedback more often than average, while contacts within the ministry and the public administration system rely mostly on unstructured communication.
Finally, negative feedback is more prevalent than positive feedback. This imbalance
is particularly evident in the case of communication within the ministry.
On the basis of these observations we can formulate three more general observations related to our framework of organizational learning. First, there are dynamics in the everyday learning of Polish departments. However, these processes are unstructured, irregular and – most of all – internal. Our survey questions focused on relations and interactions with the environment as the main channel of knowledge and learning. Polish ministries clearly miss this connection. That is why, in the quantitative model, the learning elements appeared as static categories.
Table 4. Results of the mixed-method analysis – characteristics of feedback

Characteristic of feedback | Share (no. of coded segments)
structured | 38% (72)
unstructured | 62% (117)
high regularity | 22% (42)
medium regularity | 32% (60)
low regularity | 46% (86)
negative | 58% (67)
positive | 42% (48)

Source: own study.
Second, feedback is present in Polish ministries, but its inflow from outside the ministry is very limited. Feedback is dominated by one source – the heads of the ministries (political appointees) – and it is directed solely to senior management (heads of departments). It is both unstructured and irregular, often taking the form of a simple message, e.g. "Well done" or "We have a problem". A statement from one of our interviews illustrates this issue well:
It [feedback] has never been formalized in any way. If I know that something is going wrong, it is usually thanks to some current feedback. But it has never happened in a systemic way. [pause] But on the other hand, from various conversations I know that I am positively evaluated. However, it is not like there are any specified criteria for this evaluation. [Interview – Poland]
As aresult, there is little concrete content to be passed from senior management
to the staff of the departments. That is why our quantitative analysis that explored the
staff’s point of view, did not register the presence of organizational feedback.
Third, it is worth assessing the usefulness of the observed feedback from atheo-
retical point of view. Feedback most useful for learning should share the following
characteristic (Kluger & DeNisi, 1996): be acquired from diversified sources external
to the organization, be collected on aregular basis, and formulated in aconstructive
and structured way. Comparing this list to the Polish situation we have to state that
none of these criteria is met. That means that in its current form the use of feedback
for learning is very limited.
The absence of structured, regular processes of learning and the lack of feedback from the environment led us to a final question: is this a typical trait of central administration, or just a peculiarity of the Polish public administration and something that could be improved? In order to solve this puzzle we moved to the last stage of our exploration – an international comparison.
As an outcome of the study visits conducted in selected OECD countries, we identified 78 interesting practices of organizational learning and knowledge management.9 We compared the results with the coded data from interviews with Polish senior civil servants and discovered only a few, quite isolated cases of similar practices in Poland.10 These findings allowed us to conclude that the absence of structured, regular processes of learning and the lack of feedback from the environment is indeed a peculiarity of a public administration in transition when compared to other countries with developed administration systems.
Analysis of the interviews conducted during the study visits allowed us to introduce further improvements to our organizational learning framework. First of all, many interviewees highlighted the role of feedback in the process of organizational learning. It turned out that in more mature administration systems, feedback is usually structured, may take many different forms, and is derived from a variety of sources. Thus, we decided to replace the narrow 'analyses and expertise' element (part of the impulses category) with a broader category of feedback.
Second, the qualitative analysis emphasized the dynamic nature of the learning process. The static categories derived from the former analytical step might be transformed into a logical sequence of steps that reflects the iterative and cyclical character of organizational functioning. Our interviewees pointed to the fact that only an on-going, cyclical process leads to the accumulation of knowledge and raises the organization's effectiveness.
Third, the analysis of international practices aimed at enhancing learning processes allowed us to elaborate new determinants of organizational learning. Describing feedback, our interviewees pointed to the key role of reference frameworks. These practical systems of goals and indicators serve as a compass in the everyday work of an organization, and allow impulses from external sources to be organized into a consistent message about the results of a department. Reflection upon incoming impulses turned out to be much more codified than in the Polish public administration. But these routines, checklists and procedures are not rigid. Instead, they are constantly redefined and adjusted, drawing on the experiences of the organization.
The question of the ability to fully tap the potential of organizational learning practices turned our attention to the individual traits of personnel. In the CAWI questionnaire, under the personnel theme, we included only questions regarding the characteristics of the work performed by a given person (workload, infrastructural barriers, etc.). Further statistical analysis proved these are not significant for organizational learning. However, the qualitative stage of analysis allowed us to elaborate three individual traits that raise the capability for organizational learning, i.e. critical thinking, goal-oriented thinking and system thinking.
9 Their short, unified descriptions in English are available at the project webpage:
10 These are namely: (1) a newsletter implemented in one of the four studied ministries, (2) a community of practice in the field of audit experts, (3) three cases of the use of performance budgeting for reflection on departmental performance, (4) the use of evaluation studies and their recommendations in a few departments dealing with EU funds, (5) the use of regulatory impact assessment in Polish administration (a new development, mentioned in only one of the interviews).
Interviews with heads of departments allowed us to look at the question of a department's resources and relations from a different perspective. The quantitative analysis, drawing on the knowledge of regular employees, failed to acknowledge the role of financial resources. It seems that the role of this issue is recognized only at the senior level, where the responsibility for allocating funds is located. Similarly, the importance of relations with both the remote and immediate environment (especially the relations between heads of departments and their political supervisors) is better reflected at the managerial level.
To sum up, this stage of analysis put feedback back among the elements of organizational learning, and allowed us to uncover the dynamic and cyclical nature of the organizational learning process. Major changes occurred in the part of the framework depicting the determinants of organizational learning. Eight new factors were elaborated, i.e. the reference framework, codification of practices, goal-oriented thinking, system thinking, critical thinking, relationships with the immediate environment, relationships with the remote environment, and financial resources (included under the broadened category of financial and technological resources). Together with the 7 factors elaborated in the quantitative stage, these 15 determinants were grouped under 6 thematic areas, i.e. personnel, teams, leadership style, resources, procedures and customs, and relationships with the external environment.
1.4 Conclusions – the organizational learning framework
Thanks to the research carried out in the Polish ministries, we know that organizational learning is a dynamic mechanism which consists of (1) a set of learning processes and (2) factors that support these processes.
These two elements together, and the relations between them, constitute the so-called learning mechanism (see: Figure 5). The definitions of all elements of the learning mechanism, i.e. learning processes and learning determinants, are presented in Tables 6 and 7. The description includes the role that each element plays in supporting the performance of an organization or its organizational learning processes.
Learning processes form an action cycle (the blue cycle in the center of Figure 5), which allows an organization to create new knowledge and, on the basis of this knowledge, to adapt to the challenges of a complex and dynamic reality. The cycle consists of four elements, i.e. impulses, reflection, knowledge and adaptation. In other words, a department obtains information from external sources (including feedback), which induces reflection. This eventually leads to the creation of new knowledge, which, in turn, serves as a basis for decisions altering the current activities of a department (i.e. adaptation). A department might then learn about the outcomes of this adaptation, drawing on feedback received from the external environment. Such a situation indicates that a full loop of the learning cycle has been completed.
The cyclical process described above should occur with regard to the particular projects, issues and tasks that a given department carries out. The performance of organizational learning depends both on the quality of the particular elements of the cycle (i.e. learning processes) and on the ability to combine them systematically.
Every organization, in order to carry out its activities and reach its objectives, needs human resources (staff, teams, leaders) and physical resources (infrastructure); it also utilizes various procedures and has relationships with its external environment. The proposed organizational learning framework takes all of these fields into account. Our focus is, however, only on those dimensions of organizational resources, procedures and relationships that influence the learning processes. These findings fit well with the results of recent research on the critical success factors of organizational learning in public administration (Pokharel & Hult, 2009; Barette et al., 2008). However, our framework provides a more comprehensive, multi-layer description of learning determinants, ranging from the individual level, through teams and the organizational level, to relations with the external environment. Furthermore, it includes both soft, cultural dimensions (customs, leadership style) and the 'hardware' of an organization (procedures, financial and technological resources).
Particular factors support only a part of the learning cycle. The study conducted in the Polish ministries allows us to indicate which processes are most likely to be influenced by a given factor. Knowing the relations between learning processes and the phenomena that support them, we can determine the set of factors that needs to be strengthened in order to enhance a given stage of the learning cycle (see: Figure 6).
It is worth noting that our framework resembles Kolb's classic model of experiential adult learning (Kolb, 1984), which treats an organization as a living organism. This approach might prove helpful for understanding, as well as measuring, different inter-organizational processes.
Table 5. The practical utility of framework – tool for monitoring organizational learning
The organizational learning framework has a nested structure. This means that: (a) a list of one
hundred survey items measures the frequency of certain behaviors in an organization; (b) survey
items are clustered to measure elements of the organizational learning mechanism; (c) these
elements are graphically arranged into wider categories: processes of learning and determinants
of learning.
So, looking at Figure 5 and Annex 1, consider this example. Two survey items comprise the
element labeled “Conferences and Training”, while five survey items construct the element called
“Feedback”. These two elements are grouped under the name “Impulses”, which in turn is one
of the four clusters (impulses, reflection, knowledge, adaptation) that build the most general
category called “Processes of Learning”.
This nested logic allows public managers to easily measure and monitor all aspects of organizational learning at the different levels of their agency. Namely, it allows:
(1) Collecting reliable data on the learning mechanism
Employees of an organization respond anonymously to survey items. Particular questions measure the frequency of certain behaviors in their organization that are important for organizational learning.
(2) Aggregating survey data and turning the data into information
Validated formulas allow: (a) aggregation of individual responses into elements of the learning framework; (b) demonstration of the condition of each element of the learning mechanism (e.g. system thinking, mutual support, feedback) on a 1–10 scale (1 = lowest intensity, 10 = highest).
(3) Visualizing and comparing the results of the organization
The Prezi template allows combining, visualizing and animating different layers of data: (a) showing the bigger picture on a single dashboard screen – the intensity of all processes and determinants of learning; (b) zooming in and out of each element of the mechanism (e.g. impulses – feedback; strategic knowledge) and seeing the results of all the survey items that built that element; (c) comparing and benchmarking the results of one's own organization with the average for Polish ministries, the mean for the whole organization (if the survey covered different units within the organization) or even, if the survey has been repeated, changes over time.
(4) Conducting a constructive, data-driven discussion about the condition of an organization
The agenda for a team meeting allows leaders and members of the organization: (a) to engage in conversation grounded in data; (b) to identify the reasons for the observed situation; (c) to discuss possible improvements in the organization and (d) to evaluate the effectiveness of implemented management solutions over time.
Please note that the survey questions are presented in Annex 1 of this book. The template for the on-line survey, all analytical formulas, Prezi templates, data for comparison and the agenda for discussion are available for download, free of charge, from the project web page:
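The nested aggregation logic of Table 5 can be illustrated with a minimal sketch. The item-to-element mapping, the 1–5 response scale and the linear rescaling to 1–10 are illustrative assumptions; the project uses its own validated formulas.

```python
# Minimal sketch of the nested aggregation: survey items -> elements.
# Assumes items are answered on a 1-5 frequency scale and rescaled
# linearly to the 1-10 presentation scale described in Table 5.

ELEMENTS = {
    # two items for "Conferences and Training", five for "Feedback",
    # mirroring the example in the text (item names are hypothetical)
    "conferences_and_training": ["item_01", "item_02"],
    "feedback": ["item_03", "item_04", "item_05", "item_06", "item_07"],
}

def element_score(responses: dict[str, list[int]], items: list[str]) -> float:
    """Average the per-item means, then rescale 1-5 -> 1-10."""
    item_means = [sum(responses[i]) / len(responses[i]) for i in items]
    mean = sum(item_means) / len(item_means)
    return 1 + (mean - 1) * 9 / 4  # linear map: 1 -> 1, 5 -> 10

responses = {  # anonymous answers from three employees per item (hypothetical)
    "item_01": [3, 4, 5], "item_02": [2, 3, 5],
    "item_03": [1, 1, 2], "item_04": [2, 2, 3], "item_05": [1, 2, 2],
    "item_06": [2, 1, 1], "item_07": [3, 2, 2],
}

for name, items in ELEMENTS.items():
    print(name, round(element_score(responses, items), 1))
```

The same pattern extends upward: element scores can in turn be averaged into the four process clusters (impulses, reflection, knowledge, adaptation), which is what makes the zoom-in/zoom-out dashboard view possible.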
In the conclusions of this chapter we presented a framework of organizational learning for public administration. This framework has been empirically developed and tested, and it relies on both qualitative and quantitative analyses. It combines the perspective of a public administration under transformation with that of mature administrative systems from leading OECD countries. It reflects the viewpoints of both regular employees (CAWI questionnaire) and senior management (in-depth interviews). It attempts to bridge the gap between the theoretical literature and everyday practice. The universal nature of the proposed framework helps to describe the mechanism of organizational learning in various public organizations, and to recreate the causal relations leading to the current state of this phenomenon.
The framework, as presented in Table 5, has high practical value. We believe that it could help the public administrations of countries in transition to begin thinking about organizational learning in a structured way. Senior management as well as staff would appreciate (as testing in the Polish ministries indicated) its usefulness in monitoring organizational learning in their agencies. The framework provides them with reliable data on the learning mechanism. It gives insight into the functioning of the different levels and aspects of a given organization without losing the bigger picture of the whole organization. Finally, it allows for making management decisions and testing organizational improvements based on analysis grounded in data (for detailed information see: The selection of management tools designed to support elements of the learning mechanism, and eventually advance the whole organizational learning process, is presented in the next chapter of this book.
Table 6. Processes of organizational learning

Conferences and training
What it is: Participation of employees of a department in conferences and training related to their area of work.
How it benefits the organization: Participation in conferences and training helps the organization to acquire new knowledge and to find inspiration and ideas for novel approaches to current challenges.

Feedback
What it is: All information from the external environment of a department assessing the efficiency, effectiveness and usefulness of the activities carried out by this department. It can take the form of analyses and expertise, monitoring data, principals’ assessments (e.g. by politicians), opinions of the stakeholders of a given policy, comparisons with other departments, etc.
How it benefits the organization: Feedback is like school grades – it allows us to understand whether things are going in the right direction, and whether the tasks implemented by a department bring the expected results and benefit their recipients.

Top-down reflection
What it is: Discussions and analyses of issues important for a whole department, conducted with the participation of the department’s management.
How it benefits the organization: Top-down reflection allows the employees of a whole department to think about incoming information, relate it to their own work and eventually translate it into knowledge useful in the specific context of their department.

Bottom-up reflection
What it is: Discussions and analyses of received impulses and of the current situation, which take place in a unit of a department, among its staff.
How it benefits the organization: Bottom-up reflection allows the employees of units and teams to think about incoming information, relate it to their own work and eventually translate it into knowledge useful in the specific context of their unit.

Contextual knowledge
What it is: Knowledge of the environment in which a department operates and of the subject matter related to its tasks, held by the staff of a department. The ability to explain the trends and possible causes of various phenomena in a given sector or policy area.
How it benefits the organization: Helps to explain what is happening around the department, in a given policy area, subject or sector, which is linked to the activities of the department. It also allows observed changes to be related to the situation of the department.

Strategic knowledge
What it is: Employees’ knowledge of the objectives of a department, of the expected effects of the activities undertaken by a department, and of its role in the ministry, in the system of public policies, and in society.
How it benefits the organization: Strategic knowledge explains to employees how their work contributes to the realization of the most important tasks of the organization. It allows them to guide their activities towards a common, overarching goal.

Operational knowledge
What it is: Technical, operational knowledge (know-how) associated with the use of different tools and operating methods, and with the application of effective processes and procedures.
How it benefits the organization: Allows employees to continuously improve established processes, procedures and operating methods. As a result, it improves workflow in a department.

Operational adaptation
What it is: Change in operational issues – the current working methods, procedures, ways of performing daily activities – made as a result of reflection on the impulses that reached a unit or a department.
How it benefits the organization: Thanks to operational adaptation, an organization changes the way it performs daily activities (doing the same things more efficiently). It improves the smoothness and efficiency of daily work.

Strategic adaptation
What it is: Change in the future directions of a department, in its tasks, or in the perception of the area in which a department operates. It occurs under the influence of reflection on the impulses that have reached the organization.
How it benefits the organization: With strategic adaptation, a department responds to the challenges and needs of the evolving environment. As a result, both the effectiveness and the usefulness of the activities performed by a department are improved.

Political adaptation
What it is: Change in the course of action, or revision of the purpose of a department, under the influence of a political or personnel change at the highest levels of the ministry.
How it benefits the organization: Thanks to political adaptation, a department adjusts to the vision put forward by the leaders of the organization.

40 Karol Olejniczak, Jakub Rok, Łukasz Widła, Anna Domaradzka
Table 7. Determinants of organizational learning

Goal-oriented thinking
What it is: The ability to use cause-effect reasoning, to perceive and define the activities of a department in the form of causal relationships: challenges and needs – inputs – processes – outcomes (positive changes).
How it supports the learning process: Thanks to this skill, employees can identify and obtain feedback from external sources concerning the most important outcomes of an organization’s activities. They can also use this ability to pursue critical reflection, draw conclusions and strengthen their strategic knowledge.

System thinking
What it is: Identifying relationships and interdependences, and perceiving the broader context in which public activities and different projects take place. Awareness of the dynamics of phenomena over time.
How it supports the learning process: Thanks to this skill, employees can identify and acquire information from external sources related to the context and longitudinal effects of a given activity. On this basis, they can pursue critical reflection and strengthen their strategic and contextual knowledge.

Critical thinking
What it is: The ability to ask questions, to formulate problems clearly, to build arguments, to evaluate evidence and its credibility, and the ability of logical inference.
How it supports the learning process: This ability allows employees to identify reliable sources of information, to build a clear argument during team discussions, and to base decisions on firm evidence. Depending on the topic, it can improve different kinds of knowledge (operational, strategic and contextual).

Mutual support
What it is: Support provided by co-workers in the face of emerging problems.
How it supports the learning process: It is a prerequisite of cooperation and fruitful discussions in the team. It also strengthens operational knowledge.

Group cohesion
What it is: Good relationships in the team, mutual kindness and a spirit of cooperation.
How it supports the learning process: It benefits team cooperation, and supports common reflection on improving one’s work and its effects.

Psychological safety
What it is: Freedom to express opinions (including critical ones), acceptance of the different views occurring among team members, lack of fear of risk-taking, and absence of deliberate disturbance of colleagues.
How it supports the learning process: It is necessary for the smooth functioning of the team. It enables collective reflection, and learning from the successes and failures of a department and its teams.

Democratic leadership style – heads of departments
What it is: A style of team management. Democratic leaders encourage staff to discuss and to put forward ideas, provide inspiration, and respect employees’ independence. They can also clear up misunderstandings between employees. Such leaders ensure that employees are informed of their roles and of the objectives and tasks of a department.
How it supports the learning process: Democratic leadership enables organizations to develop reflection on the effectiveness and efficiency of the department. It strengthens contextual and strategic knowledge.

Democratic leadership style – heads of units
What it is: A style of team management. Democratic heads of units encourage their staff to discuss and to put forward ideas, provide inspiration, and respect employees’ independence. They can clear up misunderstandings between staff members and ensure that employees are informed of their roles and of the objectives and tasks of a unit.
How it supports the learning process: Democratic heads of units enable their teams to develop reflection on the effectiveness and efficiency of the unit. They strengthen contextual, strategic as well as operational knowledge.

Availability of analyses and information
What it is: Availability and accessibility of the databases, publications and information used in everyday work.
How it supports the learning process: A source of inspiration and impulses. It supplies the reflection process with facts and – depending on the scope of the information – helps to build contextual, strategic or operational knowledge.

Financial and technological resources
What it is: The money available to a department for training, the commissioning of expertise reports and obtaining information, but also for the equipment used in everyday work.
How it supports the learning process: Financial resources allow participation in training, collecting feedback, and facilitating reflection processes. In turn, technological resources facilitate e.g. the collection and processing of knowledge, or communication between team members.

Reference framework
What it is: The function of a department translated into a set of practical information and indicators, by which the department monitors the effects of its activities and relates them to the broader goals set at the institution level. Thus, the reference framework sets a benchmark to evaluate the performance of a department. It should include the opinions of the clients of a department and its stakeholders.
How it supports the learning process: Allows the impulses from external sources to be organized into a consistent message about the results of a department. In simple terms, it tells us whether we have succeeded as an organization. Therefore, it creates a framework for meaningful top-down reflection and builds all kinds of knowledge.

Codification of practices
What it is: Well-established practices of commissioning research and expertise, of internal reflection in teams, and of knowledge sharing and information storing. The codification may take the form of an internal procedure, checklist, template, action scenario, manual, or custom.
How it supports the learning process: The codification of practices allows the organization to remember the modes of action which proved useful. Depending on the subject, it may support all three types of knowledge.

Relationship with immediate environment
What it is: The breadth and intensity of the contacts a department has with other departments within the same ministry, as well as relationships with political superiors.
How it supports the learning process: It allows useful feedback to be gathered from the immediate environment, i.e. from within the ministry.

Relationship with remote environment
What it is: The breadth and intensity of the contacts a department has with actors from beyond the ministry – stakeholders of a given policy, academics, consultants, experts, other ministries, think tanks, etc.
How it supports the learning process: Allows diverse and useful feedback to be gathered. The more sources a department has, the wider the perspective it may have. As a result, a department gains a more objective view and a deeper knowledge of the needs of stakeholders and of the appropriate lines of action.

Quality of expertise
What it is: The ability to obtain knowledge from independent experts, and the general assessment of the quality of external research and expertise.
How it supports the learning process: Expertise of high quality is an important source of feedback. It provides essential input to reflection processes in a department, and supports contextual and strategic knowledge.
Figure 5. The mechanism of organizational learning
Developed by: Karol Olejniczak, Łukasz Widła; graphic design: Tomasz Zuchniewicz
Figure 6. Relations between processes and determinants of organizational learning
1.5 References
Alavi, M. & Leidner, D. (2001), “Knowledge management and knowledge management systems: Conceptual foundations and research issues”, MIS Quarterly, 25(1), 107-136.
Anderson, V. & Johnson, L. (1997), Systems Thinking Basics: From Concepts to Causal Loops,
Waltham, MA: Pegasus Communications.
Antal, A.B., Dierkes, M., Child, J. & Nonaka, I. (2001a), “Introduction”, in: Dierkes, M., Antal,
A.B., Child, J. & Nonaka, I. (eds.), Handbook of Organizational Learning and Knowledge,
Oxford: Oxford University Press, USA, pp. 1–7.
Antal, A.B., Dierkes, M., Child, J. & Nonaka, I. (2001b), “Organizational learning and knowledge: Reflections on the dynamics of the field and challenges for the future”, in: Dierkes, M., Antal, A.B., Child, J. & Nonaka, I. (eds.), Handbook of Organizational Learning and Knowledge, Oxford: Oxford University Press, USA, pp. 921-939.
Argyris, C. & Schon, D.A. (1995), Organizational Learning II: Theory, Method, and Practice,
Reading, MA: FT Press.
Bardach, E. (2006), “Policy dynamics”, in: Moran, M., Rein, M. & Goodin, R.E. (eds.), The
Oxford Handbook of Public Policy. Oxford, New York: Oxford University Press, pp. 336-366.
Barrette J., Lemyre L., Corneil W. & Beauregard N. (2007), “Organizational learning among
senior public-service executives: An empirical investigation of culture, decisional latitude
and supportive communication”, Canadian Public Administration, 50(3), 333-354.
Bennet, A. & Bennet, D. (2004), “The partnership between organizational learning and
knowledge management”, in: Holsapple, C. (ed.), Handbook on Knowledge Management 1:
Knowledge Matters, Heidelberg: Springer, pp. 439-455.
Creswell, J.W. & Clark, V.L. (2010), Designing and Conducting Mixed Methods Research, Lon-
don: Sage Publications, Inc.
Crossan, M., Lane, H. & White, R. (1999), “An organizational learning framework: From
intuition to institution”, Academy of Management Review, 24(3), 522-537.
Czaputowicz, J. (2008) (ed.), Administracja publiczna. Wyzwania w dobie integracji europejskiej,
Warszawa: Wydawnictwo Naukowe PWN.
Fiol, M. & Lyles, M. (1985), “Organizational learning”, Academy of Management Review, 10(4),
Fulmer, R. & Keys, B. (2004), “A conversation with Chris Argyris: The father of organizational
learning”, in: Starkey, K., Tempest, S. & McKinlay, A. (eds.), How Organizations Learn:
Managing the Search for Knowledge, London: Thomson, pp. 16-28.
Kluger, A. & DeNisi, A. (1996), “The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory”, Psychological Bulletin, 119(2), 254-284.
Kolb, D.A. (1984), Experiential Learning: Experience as the Source of Learning and Development,
Englewood Cliffs: Prentice Hall.
Kozak, M. (2006), “System zarządzania europejską polityką regionalną w Polsce w pierwszym okresie po akcesji”, Studia Regionalne i Lokalne, 2(24), 75-97.
Levitt, B. & March, J. (1988), “Organizational learning”, Annual Review of Sociology, 14, 319-
Levy, P., Pogson, C. & Chau, S. (2006), “Feedback”, in: Rogelberg, S. (ed.), Encyclopedia of
Industrial and Organizational Psychology, Thousand Oaks: Sage Publications Inc.
Lipshitz, R., Friedman, V.J. & Popper, M. (2007), Demystifying Organizational Learning,
Thousand Oaks: Sage Publications, Inc.
Marsick, V.J. & Watkins, K.E. (1999), Facilitating Learning Organizations: Making Learning
Count, Aldershot: Gower Publishing Company.
Meadows, D.H. (2008), Thinking in Systems: A Primer, Vermont: Chelsea Green Publishing.
Morawski, W. (2010) (ed.), Modernizacja Polski. Struktury, agencje, instytucje, Warszawa: Wydawnictwa Akademickie i Profesjonalne.
Ortenblad, A. (2001), “On differences between organizational learning and learning organization”, The Learning Organization, 8(3), 125-133.
Perez-Lopez, S., Peon, J. & Ordas, J. (2004), “Managing knowledge: The link between culture
and organizational learning”, Journal of Knowledge Management, 8(6), 93-104.
Pokharel M.P. & Hult K.M. (2009), “Varieties of organizational learning: Investigating learning
in local level public sector organizations”, Journal of Workplace Learning, 22(4), 249-270.
Preskill, H. & Torres, D.R.T. (1999), Evaluative Inquiry for Learning in Organizations, Thousand Oaks: Sage Publications, Inc.
Saldana, J. (2012), The Coding Manual for Qualitative Researchers. 2nd edition, Los Angeles,
London: Sage Publications Ltd.
Sessa, V.I. & London, M. (2006), Continuous Learning in Organizations: Individual, Group, and Organizational Perspectives, New York: Psychology Press.
2 Searching for inspiration.
Practices from twelve countries
Stanisław Mazur, Adam Płoszaj, Karol Olejniczak
In this chapter we address the question: What practices could advance learning in
public organizations?
We based the search for inspirational solutions in the field of organizational
learning on research conducted in twelve countries of the Organization for Economic
Co-operation and Development (OECD). These countries are: Australia, France,
Spain, the Netherlands, Japan, Canada, Norway, New Zealand, the United States of
America, Switzerland, Sweden and the United Kingdom. The choice was dictated by
the wish to ensure representativeness, understood as the presence of different models of
public administration.
The set of solutions includes:
• practices derived from systems based on rules specific to the classical model of public administration (France, Japan, Spain, Switzerland);
• examples from systems which combine classical administration with a participatory and conciliatory approach, described in the latest literature as a neo-Weberian approach (Norway, Sweden) (Pollitt & Bouckaert, 2011);
• practices rooted directly in market-oriented new public management (Australia, New Zealand, the United Kingdom); and
• solutions derived from administrative systems reconciling a market-based approach to the management of public affairs with active civic engagement in public affairs (Canada, the Netherlands, the United States).
The rest of the chapter reflects this typology.
In all twelve countries, we collected data according to the same four-step research
procedure. In the first step, we identified the potentially most interesting institutions,
taking into account both the results of our review of sources and the opinions and
suggestions of experts from a given country cooperating with us. In the second step,
we made study visits. In each country, we conducted interviews with researchers
dealing with the issues of public administration (29 interviews) and with officials, i.e.
people experienced in knowledge management (75 interviews). In the course of the
study visits, a total of 114 interviews were conducted.11 In the third step, we extended
the analyses to additional sources of information identified by our respondents, as
11 The interview protocol is presented in Annex 3.
well as to our own review of the specialist literature. In the fourth step, we made a selection
of practices which – in our opinion – are worth promoting and disseminating. This
selection was dictated by the potential usefulness of the practices for the improvement of
learning mechanisms in the Polish public administration and/or by a high degree
of innovation. An important point of reference was also the learning model of
government agencies developed and tested by us, as described in the previous chapter
of this book.
The material we collected is intended as a source of inspiration and does not
pretend to be a systematic, comparative study of knowledge management. Our aim in
selecting the cases was to grasp the nature of the administrative systems from which
they were gathered. We are aware that the set of identified practices includes solutions
that are to some extent related to one another. However, we established that this
apparent similarity often conceals an abundance of solutions shaped by the cultural
context in which each solution is applied.
2.1 Examples from countries with a Weberian Model
of public administration
The knowledge management system in the French central administration is based
on solutions related to the task budget, implemented particularly intensively in recent
years. Other practices concern simple solutions such as the newsletter “Trajectoires: la
lettre de la fonction publique” (Trajectories: Public Service Newsletter) or sets of good
practices in human resource management. Particularly worth recommending are the
descriptions of coaching (Coaching in public administration – a guide) and of ways of
creating and animating communities of practice (Communities of practice), owing to
their very practical tips for improving the use of these solutions in public administration.
Below we describe two selected practices for organizational learning: Trajectories:
Public Service Newsletter and Coaching in public administration – a guide.
Table 8. Trajectories: Public Service Newsletter
This Newsletter is issued monthly by the Ministry of Public Service.
It contains brief notes describing the latest studies and reports, new websites (or parts of existing
sites), important events, etc. (usually, it consists of three or four pages).
The Newsletter can be downloaded from an open-access website; it can also be subscribed to –
in this case, new issues are sent to the subscriber’s e-mail address.
The pithy and readable form allows a very quick overview of the most important events concerning
the operation of public administration in France and easy access to detailed information.
Essentially, the Newsletter serves as a guide, a list of issues or a point of access to independent
texts (reports, documents, websites, articles, etc.).
Source: based on Płoszaj (2013a).
Table 9. Coaching in public administration – a guide
This Guide was prepared by the French Ministry of Public Service.
The Guide presents in a comprehensive, yet very concise way, various aspects of the use of
coaching in public organizations.
It includes a review of basic issues related to the objectives, organization and delivery of coaching.
Good practices, supported by valuable and communicative examples, are also presented there.
Particularly useful are the fragments of the guide devoted to procedural and technical issues,
such as conducting a public procurement procedure for services, models of contracts, performance
indicators, as well as issues related to the ethical aspects of coaching in public administration.
The Guide provides many practical solutions. It offers advice and gives examples of public
organizations which made successful use of the coaching method.
The usefulness of this guide is determined by three features: it is adapted to specific needs; it
is supported by the authority of important public institutions; it is concise, with an attractive and
clear graphic design.
Source: based on Płoszaj (2013a).
The practices of organizational learning used in Japan provide a great deal of inspiration.
The majority of practices are related to the management of human resources (Job
rotation of civil servants, Planned development of human resources, Common rooms,
Benkyo-kai discussion groups). One of the unique features of the Japanese administration
is also a highly regulated and transparent exchange of personnel between the public
and private sectors. The Japanese administration has developed a number of mechanisms
connected with reflection on the effects of undertaken measures (Evaluation,
First-hand experience of gemba, Database of hiyari-hatto incidents). Practices such as
the nemawashi and ringi decision-making process and the Public Comment Procedure serve
to gain knowledge from a wide range of internal and external stakeholders in order to
make the best possible decisions.
Below we describe two particularly worthwhile practices selected from many in
the field of organizational learning: the Database of hiyari-hatto incidents and the
nemawashi and ringi decision-making process.
Table 10. Database of hiyari-hatto incidents
The essence of the Database of hiyari-hatto incidents or – in the literal translation – “close-call
incidents” is the regular recording and accumulation of descriptions of incidents, the consequence
of which could be serious accidents or problems arising from such incidents/precedents. All
employees, departments, divisions, as well as offices need to be engaged in these activities.
The practice originated in the field of occupational health and safety, and can be traced back
to Heinrich’s Law, which argues that in a workplace, for every accident that causes a major
injury, there are 29 accidents causing minor injuries and 300 incidents that cause no
injuries and tend to be completely ignored.
Information about an incident is sent in the form of a report to the department responsible for the
planning and implementation of public policy.
Table 10 – continued
Its task is, first of all, to register the incident in a database together with all the available
information about it.
Secondly, it has to provide feedback in the form of guidelines for the management of the risk
associated with the incident, or to provide appropriate training.
The final step of the procedure is a supplementary report covering other cases of the incident
and describing the preventive measures taken.
Collecting large amounts of data and a uniform method of registration, facilitating analysis and
comparison, is crucial for the success of this practice.
Source: based on Olejniczak (2013c).

Table 11. Nemawashi and ringi decision-making process
The ringi decision-making process is a formal representation of the nemawashi practice (“laying
the groundwork” or – in the literal translation – “digging around the roots of a tree to prepare it
for a transplant”).
This practice refers to the process of multilateral consultations and bottom-up consensus-
building which precedes important decisions.
Its characteristic feature is empowering the lowest-ranking and usually the youngest officials
to conduct the process of consultation and intensive communication between all concerned
departments, using a document called ringi-sho.
Ringi-sho is a formal proposal containing information in relation to which each interested party
may submit comments and suggestions for amendments.
Acceptance of the proposal is expressed in the form of a signature or stamp on the first page of
the document.
The key to the success of the practice is involving the lowest-ranking officials in the process of
consultation, consensus-building and document circulation.
This practice also requires the development of a transparent documentation format, and super-
vision over the process of consultations with the use of this document.
The main advantage of this practice is that it supports a wide-ranging communication and
consensus-building process. As a result, changes are implemented more efficiently and
lower-ranking employees are more engaged in the decision-making and management of the
organization.
Source: based on Olejniczak (2013c).

Within the Spanish administration, we regarded three practices in the field of
organizational learning as particularly valuable. The first is related to the system
of sourcing, collecting and using information through the use of information tech-
nology (Knowledge Management 2.0). The essence of the second of these practices
is the creation of an open repository of software, available to public entities interested in
its use (Andalusian software repository). The third of these practices is related to the
institutionalization of a solution involving the creation, within a large public organ-
ization, of a unit for quality and knowledge management (Knowledge management
practice in the Andalusian Employment Office). Below we describe the Knowledge
Management 2.0 practice.
Table 12. Knowledge Management 2.0
The Knowledge Management 2.0 practice is an IT system, similar to the moderation of
a discussion forum, thanks to which officials and the organizations cooperating with them can
directly share acquired knowledge.
For this purpose, a special application – a form – is used, by which employees can share
information or ask questions, in other words, they can be informed and keep others informed
about their work, problems encountered and their solutions.
To encourage officials to share knowledge, a system of motivation in the form of a small premium
granted to those who are particularly active in this regard was created. Moreover, they win the
respect of their superiors, which builds their position and prestige.
The practice begins with an employee reporting the need to obtain information or the desire
to share own knowledge on a specific case. This signal goes to the department of knowledge
management, which assesses its importance and determines a further course of action (for
example, specifies the information needed to solve the problem reported by the employee).
In the first step, checks are made to see if a similar issue has already been reported, which is
facilitated by an extensive database of past queries and initiatives. If the answer is not in the
database, an attempt is made to find a solution by:
• the department of knowledge management;
• employees of organizations identified as having had a similar experience;
• external experts;
• the agency's management (especially if it is an improvement proposal).
The solution to the problem is entered into the database.
In a situation where a new problem reported by an employee cannot be solved through this
procedure, the interested party is informed of the failure of the search and is asked to attempt to
solve it on his or her own, and then to submit the information obtained in the process for entry in
the organizational database.
Source: based on Możdżeń (2013).
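The escalation logic of the Knowledge Management 2.0 practice – check the database of past queries first, then route the question through successive tiers, and store any new solution for reuse – can be sketched in code. This is only an illustrative sketch, not the actual Andalusian system; all names (`KnowledgeBase`, `route_query`, the tier labels) are hypothetical.

```python
# Hypothetical sketch of the Knowledge Management 2.0 escalation flow
# described above; none of these identifiers come from the real system.

from dataclasses import dataclass, field

@dataclass
class KnowledgeBase:
    """Stores past queries and their solutions for later reuse."""
    entries: dict = field(default_factory=dict)

    def lookup(self, question: str):
        return self.entries.get(question)

    def store(self, question: str, answer: str):
        self.entries[question] = answer

# The tiers that may be asked to solve a new problem, in order.
ESCALATION_TIERS = [
    "knowledge management department",
    "employees with similar experience",
    "external experts",
    "agency management",
]

def route_query(kb: KnowledgeBase, question: str, solvers: dict) -> str:
    """Answer from the database if possible; otherwise escalate tier
    by tier, storing any new solution so it can be reused later."""
    cached = kb.lookup(question)
    if cached is not None:
        return cached
    for tier in ESCALATION_TIERS:
        # Each solver returns an answer string, or None if it cannot help.
        answer = solvers.get(tier, lambda q: None)(question)
        if answer is not None:
            kb.store(question, answer)  # remember for the next employee
            return answer
    # No tier could help: the employee is asked to investigate alone
    # and feed the result back into the database afterwards.
    return "unresolved: please investigate and report back"
```

The point of the sketch is the design choice the practice embodies: every answered query enriches the shared database, so later look-ups succeed without escalation.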
Public administration in Switzerland is an example of a skillful synthesis of
mechanisms specific to the Weberian model, the rules of the new public management,
and solutions characteristic of co-management. We found four practices particularly
useful. The first relates to a method of project management known as Hermes.
The second practice concerns the modeling of public services, the simplification of
administrative procedures and the development of e-government (SimpA). The third
of these practices is the e-Government Development Strategy. The fourth practice is
a tool for electronic voting, serving civic activation. Below we describe the SimpA
practice.
Table 13. SimpA practice
Modeling of administrative procedures is a practical manifestation of the use of modeling of
e-government services.
This tool is part of the legislative program in the Canton of Vaud, whose objective is to simplify
administrative procedures.
Table 13 – continued
Modeling of administrative procedures is a method of self-evaluation of the participation level of
a particular organizational unit in the field of public services. This tool is primarily focused on the
relationship with the environment, that is, with citizens.
However, it also refers to the institutional dimension of the administration’s co-operation with
other public, private and non-governmental organizations.
The application of the model is supported by the thesis that, while the strategic aspects of
designing services are commonly known, there are not many instruments which define both the
vision and the methodology of their implementation in terms of the transfer of activeness and
competence towards greater civic interactivity.
For this purpose, a matrix was created. It not only allows monitoring of the process and measuring
of the degree of availability of public services – and therefore their benchmarking – but is also
a tool for strategic management, improving future relations with service users (citizens, business
institutions, other public organizations and NGOs).
Implementation of the SimpA program started in March 2010, and its achievements so far include
180 simplification proposals gathered in so-called idea boxes, more than 50 proposals
developed by an internal working group, and the creation of three consultation groups with
external partners – business, citizens and communes.
Source: based on Chrabąszcz (2013).

2.2 Examples from countries with a Neo-Weberian Model

In the course of its evolutionary development, the public administration in Norway
has developed many original practices of organizational learning (Culture of consensual
management, Flexible working conditions). It is worth noting the practice associated
with the establishment of objectives for the administration and the determination of indi-
cators for measuring them (Missions and objectives of the administration). Another
practice worth promoting is related to the mechanisms of forming the
institutional memory of public offices (Mentoring – program of “patrons”). The
practice of a partner forum, which is a mechanism for the exchange of knowledge
and experience between officials and the academic community, is particularly inspiring
(Partnerforum). The practice described as a Program of acquiring specialists for
administration is equally interesting. An example of the institutionalization of expert
potential for the modernization of the public sector is the Specialized Agency for Public
Management.
Below we describe two practices in the field of organizational learning which
should be given special attention: Partnerforum and the Program of acquiring specialists
for administration.
Searching for inspiration. Practices from twelve countries
Table 14. Partnerforum
Partnerforum is an initiative of two higher education institutions – the University of Oslo and the
Norwegian School of Business – launched in 1993, aimed at sharing knowledge and experience,
and the integration of officials of ministries and central offices and academic researchers.
Initially, it involved the participation of 12 partner institutions; currently, there are already 21 partners from public administration and the academic sector. The project is prestigious, and participation in it requires a fee.
As part of the Partnerforum, regular meetings are held. Their subject matter is determined by officials and revolves around issues such as democracy, efficiency, ethics, government policy, human resource management, competence development, innovation, international affairs, justice, climate policy, communication, state and local government, and governance.
Meetings of the Partnerforum take the form of “breakfast meetings” (9.00–11.00 a.m., usually four times a year) and all-day conferences (held half-yearly, in spring and autumn).
The success of the initiative is based on the way it functions – it is a voluntary program and the
officials participating in it have a major impact on shaping its character, the subjects of meetings
and their course.
The academics conducting the meetings focus on practical issues and a workshop format. Each quarter, contact persons from each of the partner institutions assemble to discuss issues relevant to them.
The Partnerforum program unites different elements of the learning process. In addition, it helps
to break the vertical nature of the central administration through the creation of groups of people
working in different departments and ministries.
It also affects the processes of learning within the organization, because the officials participating
in the meetings of the program later share the acquired knowledge and materials with office
colleagues through the Intranet.
Source: based on Jakubek-Lalik (2013).
Table 15. Program of acquiring specialists for public administration
The Program of acquiring specialists for public administration functions, among others, in the
Norwegian Agency for Public Roads.
While grappling with the problem of finding highly skilled professionals – in this case, engineers
constructing roads and bridges – the said Agency decided to start a special long-term program
encouraging people to connect their professional career with the public administration.
This program is based on a partnership with higher education institutions and involves, among
others, financing scholarships for the most talented students, organizing paid internships with
the possibility of subsequent employment, financing doctoral studies and research for already
employed officials, and the incentives to undertake an academic career.
This program makes it possible to combine work in public administration with a scientific career.
The aim of this action is also to prevent the emigration of highly skilled workers for financial reasons.
Moreover, the Agency operates a Centre for Competence Development, an institution which
oversees the professional development of employees.
The advantages of this practice are that high-class specialists are acquired and kept in the public administration.
In addition, it strengthens the scientific and didactic potential of the partner higher education institutions, which cooperate with the administration in training highly qualified personnel for its needs.
Source: based on Jakubek-Lalik (2013).
56 Stanisław Mazur, Adam Płoszaj, Karol Olejniczak
Some of the practices we analyzed, drawn from the public administration in Sweden, concern extensive knowledge production mechanisms for implementing public policies (Collective decision-making within the government, Research Committees, Substantive assessment of reports). Others are associated with the development of administrative staff for the effective and efficient execution of public tasks (Open access to the civil service, Individualization of employment conditions, Training focused mainly on the development of ‘soft’ skills and qualifications).
Below we describe two of these practices, i.e. Research Committees and Training focused mainly on the development of ‘soft’ skills and qualifications.
Table 16. Research committees
Research Committees within the Swedish government are established for the analysis of a par-
ticular problem requiring legislative intervention and the presentation of proposed solutions.
The Swedish government institutions use standard methods of acquiring knowledge from outside, commissioning expertise from both scientific institutions and commercial entities (e.g., advisory companies).
Research Committees are an intermediate solution between commissioning external expertise and creating public policies based solely on the government’s own resources, i.e. experts employed in the administration.
The model of Research Committees makes a significant contribution to improving the quality of legislation.
The factors crucial for their success include considerable independence, adequate funding and
the right composition of personnel. The essence of expert committees is the combination of
different points of view.
They are usually composed of representatives of three groups: the political division of ministries (minister, secretaries of state, political advisers); the substantive departments of ministries; and academic communities – renowned experts in the field of issues dealt with by the committee.
The committees work on the basis of a mandate (terms of reference) granted by the government,
which determines mainly: the public policy area covered by the interest of the committee; specific
problems which should be solved by the committee; and the deadline for completion of its work.
The result is a report describing specific problems identified by the committee and presenting
a proposed solution, especially through legislative intervention.
Reports by the Swedish expert committees are usually the starting point for legislative changes.
Source: based on Sześciło (2013c).
Table 17. Training focused mainly on the development of ‘soft’ skills and qualifications
In the Swedish government administration, recruitment is, in principle, open and based on an assessment of qualifications and skills. In turn, a flexible remuneration system enables the recruitment of high-class specialists from the private sector or the academic community.
People entering the Swedish government administration are already suitably qualified in the field
of public policy they will be dealing with. Their preparation for work in a particular area may result
from experience stemming from activities in the private sector or from scientific work. They do
not need education or training in the field which they will be dealing with in the administration.
Table 17 – continued
As a result, the training policy in Swedish government institutions is focused on areas other than raising officials’ qualifications in the fields of public policy which they deal with every day.
The following priority areas can be distinguished in the training policy of the Swedish government
administration: leadership; ethical attitudes and behavior; procedures within the collective
decision-making process in force in the Swedish government; operation procedures and
decision-making mechanisms within the European Union; training propagating a customer-
oriented model of activities; improvement of the widely understood managerial skills related to
the management of teams and projects.
The principal advantage of the Swedish model of training in the civil service is the increase in the
managerial potential of personnel and the facilitation of the creation of leaders in administration.
Thus, officials are specialists not only in the areas they deal with; they also acquire skills which enable them to manage their work better, understand its importance in the political and institutional context, and understand the values and norms specific to the public service.
Source: based on Sześciło (2013c).
Table 18. Talent management
The key role in the implementation of this program is played by the Strategic Centre for Leadership, Learning and Development, functioning within the structure of the Australian Public Service Commission (APSC), directly subordinate to the Prime Minister and the Cabinet.
2.3 Examples from countries with New Public Management
We observed many inspiring examples in the field of organizational learning in Australia. We found particularly valuable the solutions which involve building the capacity of officials in the creation of law (Legislative preparation program) and shaping the leaders of public programmes (Leader preparation program). Our attention was also drawn to practices relating to the creation of conditions for tapping the potential of particularly talented people who take up work in the public sector (Talent management).
The solution aimed at gathering stakeholders’ views on the quality of functioning of public administration (Service cards) and the techniques of strengthening the mechanisms of organizational learning by identifying the objectives, intentions, attitudes and interests of the implementers and stakeholders of a specific public policy (Mental models) are also worth noting.
Below we describe two practices of organizational learning – from among many
deserving promotion – drawn from the Australian administration, i.e. Talent mana-
gement program and Mental models.
Table 19. Mental models
Mental models were key elements of efforts to strengthen the processes of organizational
learning undertaken in the Ministry of Health in Australia.
These models are designed to focus the administration on the customer by mapping the objectives, intentions, attitudes and aspirations of the individuals and groups for whom the public administration is operating.
The primary value of this practice is that it provides an instrument to identify the objectives and motivation of the main recipients of the ministry's actions.
The use of this approach facilitates consultations and negotiations with the ministerial partners, whose needs, goals and expectations are better recognized thanks to these models. It also facilitates understanding and communication both between different groups of employees within the same ministry, and between its employees and external stakeholders.
The concept of Mental models uses a wide range of research methods, particularly surveys,
interviews and focus groups.
Source: based on Sześciło (2013a).
New Zealand
The public management system in this country is rich in solutions for building the capacity for organizational learning. A significant number of these practices are related to integrated strategic management (Multiannual plans of action; Performance improvement framework). Many of the practices developed in the administration of this country refer to the management of human resources and the improvement of internal communication mechanisms (Monthly evaluation of the implementation of individual plans of professional development, Action Learning Sets, Intranet directory
Table 18 – continued
This program consists of a year-long training course, workshops and individual training
sessions addressed to mid-level officials who have the potential and aspirations for promotion to
managerial positions in the administration.
The basis for qualifying an official to participate in the program is meeting the following criteria:
• very good work results;
• abilities and skills necessary to perform managerial functions (e.g., the ability to think critically, adapt to new situations, and keep ethical standards);
• involvement in the creation of public policies which bring benefit to citizens;
• ability and willingness to share visions and ideas with others;
• aspirations – the desire to take up high-ranking positions in the administration.
The program includes several different tools addressed to its participants, including:
• a few days’ session consisting of group work;
• group coaching;
• individual coaching;
• mentoring;
• implementation of a joint project by a group of participants;
• so-called job shadowing, that is, a simulation of situations and events which can occur in a specific position of work.
Source: based on Sześciło (2013a).
Table 21. Growing leaders
The essence of Growing leaders, implemented in the Ministry of Health, is a comprehensive
preparation of mid-level officials to perform managerial functions in public administration.
The program is addressed to officials who have particular predispositions to perform managerial
functions in the future.
They are covered by a system of training, workshops and ongoing guidance from the human resources management unit in the ministry.
The aim of the program is to strengthen skills related to team management and project management. It should be emphasized that the program does not include training in the area of public policy which a given official deals with.
The main value of this practice is that it strives to ensure high-quality management personnel,
which has a crucial impact on the learning mechanisms in public institutions.
of ministry employees, Internal communication tools). The practice associated with
the construction of leadership in public administration is particularly interesting
(Growing leaders).
Below we describe two out of many inspiring practices in the area of organizational
learning: Performance improvement framework and Growing leaders.
Table 20. Performance improvement framework
Performance improvement framework is an important part of the New Zealand experiment in directing the activities of the public administration towards achieving measurable results (outputs, outcomes).
With regard to the ministries, the system of management by results is based on a specific
contract concluded annually between the ministry and its political superior (i.e. the minister).
This contract specifies the results (outputs, outcomes) which the ministry is expected to generate
in the sphere of public policy it is responsible for. The second element of this “transaction” is the
commitment of the minister to ensure funds for the ministry in the budget, in the amount allowing
the achievement of planned results.
Performance improvement framework is a tool designed for the comprehensive improvement of the performance of public administration institutions in New Zealand. Its objective is to help ministries, agencies and other institutions assess their performance.
The subject of assessment are all the areas of the organization, including the fulfillment of
its main functions towards citizens, leadership, external relations, personal development of
employees, resource management and financial issues. The result of applying this method is
a list of problems and areas which require improvement in a given organization.
Performance improvement framework is a tool which can be used for self-assessment by a ministry or other government agency, or for the so-called formal assessment conducted by external reviewers.
This tool is based on a relatively simple methodology of rating, showing the effectiveness of
a given organization in each of the critical areas of its operation.
The list of specific “critical” areas subject to assessment includes, among others, the following
elements: organizational culture, values and norms in the organization; leadership; vision,
strategy and objectives; control and audit; organizational structure, division of roles and tasks;
interaction with the minister; cooperation and partnership with external stakeholders; capacity
building of the Ministry's personnel; involvement of employees; financial management.
Source: based on Sześciło (2013b).
Table 21 – continued
Growing leaders aims to strengthen the competence of the officials who achieve high results
in their current work and have a predisposition for promotion to managerial positions within the
organization, but who do not necessarily have the relevant skills in the area of team management
and project management.
A potential weakness of the program can be difficulties with selecting participants from within the
organization – to what extent the units of human resource management in the ministries are able
to identify the employees who have predispositions to participate in such a project.
Source: based on Sześciło (2013b).
United Kingdom
The administration of the United Kingdom provides many interesting solutions.
Some of them are based on the use of feedback (Action Learning Sets; 360 feedback).
Others are directed at the development of leadership skills. The aim of some of these
solutions is sharing knowledge in the organization and supporting the process of col-
lective reflection (Developmental peer-review; 360 feedback; Intra-ministerial seminars
– DECC School). Further practices drawn from the British experience relate to
evidence-based policy (Database containing analyses of strategic challenges; Database of instruments supporting strategic thinking; Regulatory Impact Assessment – quality
assurance mechanisms). It is also worth noting the practice focused on strengthening
operational knowledge, serving the effective organization of daily work (Social net-
work – Yammer).
Below we describe two practices in the field of organizational learning, which are
worth promoting: Developmental peer-review and Regulatory impact assessment –
quality assurance mechanisms.
Table 22. Developmental peer-review
This practice involves direct provision of feedback on the operation of a given organization and
recommendations for improvement of its functioning.
The described practice is a tool for organizational change, because it is focused on the identification of areas for improvement and the development of guidelines for achieving the desired change.
Details of the Developmental peer-review implementation process may vary depending, inter
alia, on the specifics of administration in a given country or institutional environment. The
following description is based on the example taken from local governments in England and
Wales, where such a review system has been operating since 1999.
The procedural model was developed and implemented by the Improvement and Development
Agency (IDeA). It met with great interest from local governments – annually, around 70 reviews
are carried out, and almost all English local governments have undergone the review at least
once (Nicolini et al., 2011).
Table 22 – continued
The process of Developmental peer-review should satisfy four main conditions: undergoing the
review is voluntary; the reviewed organization is the owner of the process – it has a decisive
impact, among others, on the selection of partners, research methods and the dissemination of
results; reviewers are chosen based on experience and are properly trained – thus, they have
the authority and skills to effectively provide support; research and feedback are subordinate to
the priority of constructiveness and support for organizational change.
The review process consists of four main stages: process initiation and planning of the review;
preparation for the review; conducting an “on-the-spot” check; and feedback and report.
Interested organizations apply to the institution managing the process, which selects a coordinator. Then, this person visits the reviewed organization in order to discuss the goals and
challenges of such a process and the terms of participation.
Once the organization takes a formal decision to participate (which is associated with a fee
covering the costs of the process), the coordinator meets the team of organization members
responsible for the review. The aim of the meeting is to adjust the review process to the needs of
the organization, and therefore, priority issues are selected, criteria for the selection of reviewers
and detailed terms of the future review are established.
In the second stage, the coordinator selects members of the reviewing group (five persons and
a coordinator) from the list of trained partners. Preparations for the review include, among others,
informing all members of the reviewed organization about the planned review and sending
documentation allowing them to prepare for the review. The review lasts a week and is carried
out directly in the analyzed organizations, in relation to the diagnostic model which reflects the
“ideal organization”. The review methods used are participatory observation, interviews with
internal and external stakeholders, and a review of documents.
Conclusions are discussed within the group, and then presented to representatives of the
organization being reviewed. On the basis of the review, a written report containing conclusions
and recommendations is created, the quality of which is also verified by the employees of IDeA.
The reviewed organization is required to develop a document which contains a description
of measures dealing with the problems identified in the report, along with a plan for their
implementation. The entire process – from the application to receipt of the final report – lasts
approximately four months (Jones 2005).
Source: based on Rok (2013).
Table 23. Regulatory impact assessment – quality assurance mechanisms
The essence of this practice is to support the process of performing Regulatory Impact Assessment (RIA) – ensuring comprehensiveness, efficiency and high-quality results.
The core of the practice is a diagram describing the consecutive steps in the process of verifying
RIA quality, and the extension – a template of the final document of the Regulatory Impact
Assessment and an MS Excel form, developed by experts, which facilitates calculation of the
long-term effects of regulations as regards energy consumption and greenhouse gas emissions.
The described practice is associated with evidence-based policy, since it uses a detailed cost-
benefit analysis.
The need to ensure high quality RIA acquired special importance along with the advent of the
global economic crisis and the return of the Conservative Party to power.
Under the banner of making savings and deregulation, Prime Minister Cameron's government
decided to reduce the number of regulations and reduce their impact on the private sector (HM
Government 2011).
2.4 Examples from countries with a Governance model
Organizational learning in the Canadian public administration has been institutionalized through the creation of a special knowledge management strategy and organizational units (Chief Knowledge Officer). In addition, the administration of this country uses many practices in the area of human resource management which aim to enhance the capacity for organizational learning (Assessment of employees and Retention of knowledge of leaving employees), the evaluation of implemented projects and processes (Summaries of completed actions), as well as the exchange of knowledge between different parts of the organization and between organizations (Communities of practice). A particularly interesting and inspiring idea, successfully implemented in Canada, is GCpedia – a Wikipedia created by and for officials.
Below we describe two of these practices in the area of organizational learning: GCpedia and Summaries of completed actions.
Table 23 – continued
To this end, the following mechanisms and institutions were implemented or strengthened (HM Government 2011; OECD 2011; RPC 2011):
• One in, one out: the principle conditioning the enactment of new legislation (connected with burdens for business or non-governmental organizations) on the withdrawal of an existing regulation which imposed costs of the same size or larger;
• Sunsetting: each newly implemented regulation (connected with burdens for business or NGOs) is automatically withdrawn after seven years, unless a decision to maintain it is made within the first five years on the basis of an obligatory evaluation of the effects of its functioning;
• Better Regulation Executive: an institution developing and supervising regulatory management policy, safeguarding deregulation and the quality of enacted legislation; Better Regulation Units (BRU) operate in each ministry;
• Regulatory Policy Committee (RPC): an independent institution set up to verify the quality of RIA;
• Reducing Regulation Committee: a government sub-committee responsible for strategic oversight of the implementation of the deregulation priority; it accepts draft regulations on the basis of the RIA issued by the RPC.
The process of creating and verifying the quality of RIA is consistent with the ROAMEF cycle, i.e. it fits into the following logical sequence: rationale, objectives, appraisal, monitoring, evaluation and feedback.
The RIA is created in parallel with the draft of each new regulation.
The author of the assessment is the team preparing a given draft, taking care of consultation with
stakeholders and conducting analysis using tools supporting research procedures.
The advantage of the described practice is that it creates conditions which allow the process to
be based on solid research foundations. The development of tools simplifying the performance
of in-depth analyses on the impact of proposed regulations allows the use of advanced research
methods by individuals without extensive expertise.
Source: based on Rok (2013).
Table 24. GCpedia
GCpedia (Government of Canada Encyclopedia) is a web portal of the wiki type (a type of website where content can be created and modified simply and quickly, through a web browser, using a simple, intuitive markup language).
The creation of GCpedia is an initiative of the secretariat of the Treasury Board, which is one of
the key institutions of central public administration in Canada.
GCpedia is created by and for the employees of the Canadian public administration; it is a solution akin to so-called Web 2.0, since its contents are – as mentioned – created and modified by users and in cooperation with them, regardless of location.
This portal is an example of the use of a new approach in the public administration aimed at
openness and cooperation, using technologies which are modern, but at the same time easy
to use.
GCpedia has an internal nature – only employees of the Canadian public administration have
access to it, both passive (viewing content) and active (modifying content).
GCpedia is used not only to create thematic entries, but also to create and share common docu-
ments, projects, reports, notes from meetings, summaries of evaluations and other publications,
as well as for discussion. Therefore, it also functions as a thematic Internet forum managed by its users.
The aim of the initiators of GCpedia was to create a platform for the easy exchange of information between employees of the Canadian administration, regardless of where they work (a particularly important aspect in such a big country). The portal also aims to overcome the organizational division of public administration into silos (Bostelaar, 2010).
GCpedia uses the MediaWiki software – free and open, distributed free of charge by the Wiki-
media Foundation.
The portal was launched in 2008 and relatively quickly gained popularity among officials. As of
May 2012, GCpedia had over 32 thousand registered users and over 18 thousand pages, which
were visited nearly 15 million times.
The biggest challenges faced by GCpedia are ensuring the spontaneous activity of many people (in order to fulfill its function, it must contain a critical mass of articles, which, moreover, must be updated regularly) and establishing a knowledge-sharing culture.
Source: based on Płoszaj (2013b).
Table 25. Summaries of completed actions
The Summary of completed actions is a simple and widespread practice of knowledge management. Its aim is to draw conclusions from implemented actions, projects and processes.
The essence of this approach is the analysis of completed actions in order to use the experience
gained in the future.
Such initiatives form a fairly broad group of organizational practices known under various names:
after-action review, learning histories, case studies, lessons learned, project postmortem, post-
project reviews.
Individual approaches differ in terms of organization and emphasis, but their essence is always
similar. The key objective is that a given project or process, when it is finished, is always subjected
to analysis, in order to address questions such as: What was done successfully and why? What
was done unsuccessfully and why? What could be done better? What should be paid attention
to when implementing similar actions in the future?
Analysis of the implemented project or process is generally carried out during a meeting with the
people involved in a given action.
Exchange of experience, different perspectives and discussion are essential for gaining a proper
understanding of what happened, what worked and what did not.
The Netherlands
In the case of organizational practices used in the administration of the Netherlands,
the specific mechanisms for the creation of organizational innovation (Laboratory of
innovation), the methods for the synthesis of sources and critical reflection (Argument
maps), and the cause and effect description of the activities of public sector entities
(Logic models) are particularly worth noting.
Below we describe two practices, in our opinion particularly valuable, in the field
of organizational learning, i.e. Laboratory of innovation and Knowledge brokers.
Table 26. Laboratory of innovation
The Laboratory of innovation (LI) is a practice aimed at testing innovative solutions without
running the risks of core actions performed by the organization.
The LI is a small interdepartmental team, consisting of people with experience in research and management. Its members are delegated to it part-time and are subordinate to the board of the organization.
This practice is used in two offices in the Netherlands – the National Audit Office (NAO) and the
Netherlands Statistics Office (NSO).
An employee who has an idea for improving the office's work through the use of new management
techniques may report it to the LI. The LI team along with the originator tries to specify the idea,
outline a plan for its implementation, and then implement it jointly.
If an idea proves worthwhile, it is usually incorporated into the regular actions of the organization.
What motivates employees to submit ideas is self-development and recognition among their co-workers.
This practice reaches out directly to the inventiveness and knowledge of employees, and creates
a safe space to take risks and test potentially useful solutions.
Source: based on Olejniczak (2013a).
Table 25 – continued
Reliable diagnosis is the basis for formulating reliable conclusions and recommendations, usually taking the form of a short document (note), which can then be used as a knowledge base in the implementation of similar actions in the future (also by other members of the organization).
The main advantage of using summaries of completed actions is that they create the opportunity both to learn from mistakes and to identify good practices.
When applying this practice, a certain formalization of the process is important. Firstly, procedures
must be in place ensuring that every important action is completed with the relevant summary.
Secondly, the process of evaluating and drawing conclusions should have a defined course.
Thirdly, organization of the process should ensure real commitment from the appropriate people.
Source: based on Płoszaj (2013b).
Searching for inspiration. Practices from twelve countries
Table 27. Knowledge brokers
Knowledge brokers are public institutions that act as intermediaries between the scientific sphere and the world of public policy.
Examples of such “knowledge brokers” are: the Knowledge Institute for Mobility Policy (KIM) and
the Crime and Justice Research Centre (CJRC). Both of these institutions are associated with
ministries (the KIM with the Ministry of Transport, the CJRC with the Ministry of Justice), but at
the same time have the status of independent units.
Brokers prepare syntheses and translate research results into the pragmatic language of policy makers and government administration. They provide the information underpinning decision-making and outline the available options, but do not take part in the decision-making process itself, remaining impartial.
Due to its size (20 employees), the KIM focuses on preparing so-called meta-analyses, mainly in the form of short reports, or even brief notes, describing the current state of knowledge on a given topic in the field of transport.
These notes (prepared within a few days) are called "Knowledge at the Table"; they are characterized by precision, unambiguity and simplicity, making them a valuable and fast source of information for decision-makers.
The functions of the CJRC are more complex. In addition to producing meta-analyses and synthesis notes, it also conducts its own research, takes up innovative topics, and maintains a special meta-database linking statistics and data sources in the field of crime and the judiciary.
The employees of both institutions are mainly analysts with university degrees.
Each year, both the KIM and the CJRC survey the information needs of their ministries, which results in a framework research plan. This plan gives these organizations responsiveness and flexibility: they are open to the current needs of the ministries and able to anticipate their expectations by identifying and analyzing issues potentially important for future public debate.
Source: based on Olejniczak (2013a).
The United States of America
The practices of organizational learning used in the American administrative system are an example of a well-judged mix of market management mechanisms, a pragmatic approach to the performance of administrative tasks, and the ethos of public service. Examples of practices derived from the business sector are solutions for measuring the objectives and results of action (Mission, goals, performance indicators; Data-driven performance reviews; Dashboards; Ranking of agencies). In turn, examples of solutions rooted in the pragmatism of the American administration are concepts relating to the exploration of good practices (Sessions for sharing good practices, Contest of project ideas, Employees' suggestion program). The intention of building the potential of organizational learning by strengthening the public service ethos can be seen in the mechanisms associated with forming communities of practitioners interested in sharing their knowledge for the better performance of public tasks (Communities of practice), or in creating the conditions for changes in administration through a specific system of recruitment (Cohort recruitment).
Below we describe two particularly interesting practices used in the American administration in the field of organizational learning: Data-driven performance reviews and Communities of practice.
Table 28. Data-driven performance reviews
The Data-driven performance review is a strategic tool for managing an organization. It consists of regular, structured meetings focused on reviewing key data about the organization's progress in achieving results. This practice is an element of a broader, result-oriented management trend (performance measurement). It is essential for the implementation of evidence-based public policies, as it gives order to the discussion and places it on a substantive footing.
The central element of such discussions is quantitative data, but qualitative data is also used to improve the work of the American administration.
The meetings differ from typical working meetings – they have a regular form, with an ordered structure and discussion procedure.
Participants of the meeting include management staff and employee representatives.
The analyzed indicators cover all the elements of the logic model that forms the basis of operations in a given department, but the emphasis is placed on products and results.
Employees preparing the meeting are expected to: collect the required data and summarize
them in a transparent manner; identify, in cooperation with the management, the main issues
and questions for discussion; inform participants of the meeting about the program.
The meetings are based on several key principles: participants should be aware that the data set is neither perfect nor complete – data is only the basis for discussion; an open atmosphere of discussion should be maintained; personal references should be avoided, even in the case of unsatisfactory results; and the emphasis should be placed on common discussion and problem solving.
After the meeting, the employees responsible for the Performance Reviews maintain continuity in
the process of improving actions. In practice, this means consistent implementation of decisions
taken at the meeting and recording the degree of implementation of the adopted findings.
The effectiveness of this practice is based on the continuous commitment of the management
(employees must see that the collected data is actually the axis of discussion) and the efficiency
of the process of identifying, collecting and preparing data (it should be important and have
a transparent form of presentation).
Source: based on Olejniczak (2013b) and Hatry & Davies (2011).
Table 29. Communities of practice
Communities of practice are informal social networks of people with similar goals and professional interests.
The participants of these networks discuss challenges, and share knowledge, best practices, successful solutions and ideas on how to solve the problems that are the subject of a given community's meetings.
Several communities of practice operate in the U.S. Government Accountability Office (GAO) (e.g., for contacts with the media, or for new research methods).
All Communities of practice are horizontal and connect people from different parts and levels of
the organization.
Meetings of these communities are held during working hours, and their form and frequency
depend on the participants. For example, the “HR” Community of practice meets every month
during a lunch break.
The group includes between 20 and 30 people working in different departments and at different levels of the GAO. These people are either employed in human resource departments, or the issue of human capital is one of the fields of research and control they pursue in other departments.
2.5 Conclusions from the international comparison
The diverse solutions in the field of organizational learning used by public administrations in the twelve countries covered by our study prompt us to draw comparisons, formulate general regularities, and point to emerging trends.
Organizational learning is becoming increasingly important
Organizational learning has always been present in public administration. In
recent decades, however, it has acquired special importance and its nature has changed
significantly. The reason is the growing complexity of public affairs and the related need to seek more effective ways to manage them. This search is accompanied
by two phenomena occurring in parallel. The first one is associated with attempts
to limit public spending. The second phenomenon is associated with the increase
in social expectations towards the administration, in particular with regard to the
quality of the public services it provides. The way to reconcile what is economically possible with what is socially expected is seen, among other things, in strengthening the capacity for organizational learning in public administration. This trend, focusing
on the relationship between the potential learning capacity of public organizations
and the quality of public policy, is becoming increasingly evident in the field of
organizational learning. Many researchers raise questions about the sources and
mechanisms of organizational learning and how the acquired knowledge can improve
the quality of decision-making processes, and contribute to solving public problems
more effectively.
Table 29 – continued
Participants of a session share information on what they do at work; from time to time, they invite
external guests (experts, academics specializing in human resources).