The Australian Journal of Public Administration, vol. 67, no. 1, pp. 1–11 doi:10.1111/j.1467-8500.2007.00564.x
RESEARCH AND EVALUATION
Three Lenses of Evidence-Based Policy
Brian W. Head1
University of Queensland
This article discusses recent trends to incorporate the results of systematic research (or ‘evi-
dence’) into policy development, program evaluation and program improvement. This process
is consistent with the New Public Management (NPM) emphasis on efficiency and effectiveness. Analysis of evidence helps to answer the questions 'what works?' and 'what happens
if we change these settings?’ Secondly, some of the well known challenges and limitations
for ‘evidence-based’ policy are outlined. Policy decisions emerge from politics, judgement
and debate, rather than being deduced from empirical analysis. Policy debate and analysis
involves an interplay between facts, norms and desired actions, in which ‘evidence’ is diverse
and contestable. Thirdly, the article outlines a distinction between technical and negotiated
approaches to problem-solving. The latter is a prominent feature of policy domains rich in
‘network’ approaches, partnering and community engagement. Networks and partnerships
bring to the negotiation table a diversity of stakeholder ‘evidence’, ie, relevant information,
interpretations and priorities. Finally, it is suggested that three types of evidence/perspective
are especially relevant in the modern era – systematic (‘scientific’) research, program man-
agement experience (‘practice’), and political judgement. What works for program clients is
intrinsically connected to what works for managers and for political leaders. Thus, the prac-
tical craft of policy development and adjustment involves ‘weaving’ strands of information
and values as seen through the lens of these three key stakeholder groups. There is not one
evidence-base but several bases. These disparate bodies of knowledge become multiple sets
of evidence that inform and influence policy rather than determine it.
Key words: evidence-based policy, policy development, performance, program management
In many of the mature democracies, the recent
ground-swell of interest in ‘evidence-based
policy’, on the part of both government offi-
cials and social researchers, represents both an
opportunity and a challenge. For public man-
agers and political leaders, the opportunity is
apparent for continuous improvement in policy
settings and program performance, on the basis
of rational evaluation and well-informed debate
of options. The prospect of mutual benefits for
managers, researchers and citizens is alluring.
This is the modern promise of evidence-
based policy improvement, albeit the attempt to
link the social sciences and public policy has a
much older lineage in the history of progressive
reform movements.2 The social sciences and
public decision-makers have not always had
close and cordial relations; indeed, there has
been a history of mutual distrust between these
sectors during the last two centuries. However,
scientific and technical knowledge has been
greatly prized in the evolution of the modern
state, initially because of its links to economic
growth and national defence, and later to ad-
dress the aspirations for social improvement by
the citizens. Social sciences have been valued
for their contribution to understanding and in-
fluencing social development and well-being.
Democratic decision-makers have increasingly
aspired to anchor many of their social reform
programs in the ‘relevant’ and ‘usable’ knowl-
edge provided by the social sciences.
© 2008 The Author. Journal compilation © 2008 National Council of the Institute of Public Administration Australia
2 Three Lenses of Evidence-Based Policy March 2008
The evidence-based movement in modern
public policy is thus the latest version of the
search for usable and relevant knowledge to
help address and resolve problems. My argu-
ment is that this is linked to the modern em-
phasis on rational problem-solving, with its fo-
cus on accurate diagnosis and knowledge of
causal linkages. It is also congruent with impor-
tant modern strategic concerns with risk anal-
ysis and appropriate mitigation responses. In
the more technocratic version of the evidence-
based approach, the aspiration is to produce the
knowledge required for fine-tuning programs
and constructing guidelines and ‘tool-kits’ for
dealing with known problems. Hence, the cur-
rently famous phrase that defines much of the
movement – ‘what works?’ (Roberts 2005).
In the context of public policy, governments
remain the major investors and users of ap-
plied social sciences. They do not simply re-
ceive, scan and utilise research; they engage on
many levels to influence the processes and the
products. Their direct methods of shaping the
applied sciences include:

- investment in government-funded research units on specific problems;
- managing the policy-research functions inside many government agencies; and
- commissioning external consultants to undertake specific contract research (Saunders and Walter 2005).
Governments also exercise strong indirect influence through:

- determining national priority areas (eg, for allocation of competitive public funding of research);
- providing rewards and recognition for commercially-focussed knowledge and technical forms of scientific excellence;
- encouraging contestability in some policy arenas by diversifying their sources of advice, including think-tanks and contractors (Stone and Denham 2004); and
- encouraging a wider range of instruments to deal with policy challenges, like market-based mechanisms and de-regulatory options.
Major investment in the applied social sciences
is part of a cycle of producing, analysing, man-
aging and reinvesting in the ‘bank’ of use-
ful knowledge. Large data sets are system-
atically collected. Evidence-based approaches
claim to fill important gaps in the value chain
as data is transformed into information and us-
able knowledge. Large organisations have in-
troduced ‘knowledge management’ strategies
to address the complex tasks of collection,
analysis and dissemination. The management
information (decision-support) systems on the
desktop of senior managers are hungry for
information to underpin performance indica-
tors and monitor program or business trends.
Some disciplines are seen as more valuable than
others for these purposes, eg, the quantitative
precision of financial accounting, cost/benefit
analysis, risk auditing, and health eco-
nomics may be more credible than the
hermeneutic approaches of history and cultural
sociology.
In short, the rise and promotion of ‘evidence-
based’ orientations within government agen-
cies is consistent with the public sector’s
increased interest in efficiency and effective-
ness. Evidence-based policy is believed to pro-
vide great assistance in answering some of
the key questions of New Public Management
(NPM):
- what options will 'deliver the goods'?
- how can programs be improved to get greater 'value for money'?
- how can innovation and competition be expanded to drive productivity?
- how can program managers achieve specific 'outcomes' for clients and stakeholders (rather than just 'manage programs')? and
- in summary, 'what works?' (Davies, Nutley and Smith 2000; 6 2002; Reid 2003)
Context: Evidence for Addressing Complex
Policy Problems
In the 1970s and 1980s the exponents of
NPM managerialism used innovative analytical
frameworks to tackle traditional problems and
to improve program performance information
in each portfolio area. During this period, the
de-regulation of many policy domains and out-
sourcing of services by the state was linked to
the key focus on program performance issues,
although this strategy posed some risk to the
steering capabilities in some public sector agen-
cies (Pollitt and Bouckaert 2000; McLaugh-
lin, Osborne and Ferlie 2002). However, by
the 1990s there was a significant political shift
towards tackling difficult interlinked issues,
exemplified by the UK government’s champi-
oning of evidence-based policy as a foundation
for joined-up government (UK Cabinet Office
1999; Parsons 2002, 2004). This period saw an
increased investment in central units for policy
analysis and commissioning evidence-based
consultancy reports. Dealing more directly with
major complex issues required taking a more
comprehensive approach towards policy design
and service delivery (Schorr 1988).
The 1990s saw the rise of policy processes
that were potentially less technocratic and more
open to ‘network’ approaches. In essence, this
meant that the new managerialist approaches
were often supplemented by new mechanisms
and process loops, variously described as com-
munity engagement, multi-stakeholder consul-
tation, and partnering across stakeholder sec-
tors (Kernaghan 1993; Kooiman 2000; Osborne
2000; Edwards 2003; Casey 2004; Head 2007).
Greater levels of cooperation and partnership
among governments, non-governmental organ-
isations (NGOs) and business, became asso-
ciated with a widespread rhetoric promoting
‘collaboration’, ‘joined-up’ services, and mul-
tiple ‘networks’ linking stakeholders and sec-
tors (Bruner, Kunesh and Knuth 1992; Bardach
1998; Head 1999; Mandell 2001; Bakvis
2002; Sullivan and Skelcher 2002; Reddel and
Woolcock 2004).
Political leaders and public managers in-
creasingly moved to tackle complex unresolved
problems, in response to the demands and pres-
sures of citizens for whom services remained
inadequate, piecemeal or inappropriate. The
frustration felt by managers in public agencies
regarding the poor rate of return on major social
program investments led them to search more
widely for new approaches. These new direc-
tions were facilitated by the willingness of ma-
jor NGOs, professions and other stakeholders
to engage in partnerships or collaborative ap-
proaches to addressing major issues. Endemic
social problems seemed to persist regardless of
the massive funding directed towards their alle-
viation. Performance information showed that
results were not being achieved. What could be
done to deal better with domestic poverty; poor
educational attainment; juvenile crime and re-
cidivism; drug and alcohol abuse; preventable
diseases; the appalling conditions facing many
indigenous communities; and the systemic dis-
advantages suffered by peoples in developing
countries?
This awakening of political interest created
opportunities for the behavioural and applied
social sciences to offer solutions, in the form
of new approaches to gaining greater control
over fuzzy and messy realities. Incremental
adjustment around ‘business as usual’ would
no longer suffice. Solutions based on old-
fashioned ideological recipes became much
less persuasive. New more integrated ap-
proaches to policy interventions were warmly
welcomed in some agencies.
From the viewpoint of performance-based
programs and evidence-based policy, the ques-
tion arose as to what kind of investment in
data/information would be needed to generate
the necessary knowledge, both to understand
complex problems and then to create viable so-
lutions. This period saw a major increase in use
of new technologies for data-gathering and data
analysis in order to measure the nature and ex-
tent of problems, assess the current impacts of
service systems, and provide benchmarks for
judging future performance.
However, some social scientists and policy
analysts began to question whether the per-
sistence of complex social problems was re-
ally attributable largely to a lack of informa-
tion, ie, ‘gaps’ in the database (Schon 1983;
Schon and Rein 1994). They suggested that
obtaining more data to fill the known gaps
would not necessarily get us onto the highway
toward good policy solutions, because much
of the policy puzzle is about reconciling dif-
ferent value perspectives. Continuing to in-
vest in building information banks for social
scientists and decision-makers would still re-
main important. But what kinds of information
would be of most value for stakeholders and
decision-makers dealing with the challenges
of complex inter-related problems with many
stakeholders?
‘Evidence’ Revisited: Types of Problems,
Types of Knowledge
The problems addressed by policy-makers are
many and varied. At one end of the spectrum,
problems may be seen as discrete, bounded,
and linked to particular sets of information and
actors. In such policy arenas, a ‘technical’ ap-
proach to problem-solving by relatively narrow
circles of actors may be dominant. This tech-
nocratic approach may be seen as sufficient to
meet the requirements of efficiency and effec-
tiveness by those involved. Technical expertise
is important in most policy areas, but its dom-
inant position in many areas is increasingly
contested. For example, many policy areas in
which scientific and engineering expertise had
achieved substantial control (eg, transportation,
energy, water supply) have now become subject
to intense debate and uncertainty.
At the other end of the spectrum, problems
may be seen as complex, inter-linked and cross-
cutting. Simple technical solutions by experts
are unavailable or unworkable. In these cir-
cumstances, a ‘negotiated’ and ‘relational’ ap-
proach to problem-solving may emerge (Innes
and Booher 1999; Hemmati 2002; Lewicki,
Gray and Elliott 2003; Lovan, Murray and Shaf-
fer 2004). The latter approach is a prominent
feature of policy domains that are rich in ‘net-
work’ approaches, partnering and community
engagement. It is argued here that a techni-
cal problem-solving approach to knowledge in
each discrete policy area is increasingly inad-
equate. Policy development arrangements are
experimenting with broader relational and sys-
temic approaches. Networks and partnerships
bring to the negotiation table a diversity of
stakeholder ‘evidence’, ie, relevant informa-
tion, interpretations and priorities. The argu-
ment is that addressing complex inter-linked
problems requires a strong emphasis on the so-
cial relations and stakeholder perceptions in-
herent in policy direction and program systems.
This has implications for how we think about
problems, relevant knowledge, policy and
program design, implementation, and evalua-
tion. In short, our ideas about ‘evidence-based’
policy may change character as we move from
a technical approach towards a more relational
approach.
Traditionally, the ‘evidence’ base seen as
the foundation for evidence-based policy is
the knowledge generated by applied research,
whether undertaken inside or outside of gov-
ernment agencies. This includes the general ev-
idence about broad trends and explanations of
social and organisational phenomena, as well
as specific evidence generated through per-
formance indicators and program evaluations
(Nutley, Davies and Walter 2002; Oakley et al.
2005). However, my argument is that the effec-
tiveness (success) of policies and programs is
not just a matter for applied social-science re-
search (including program evaluation reports).
As we come to a fuller appreciation of the com-
plexities of modern inter-dependent problems,
with a corresponding broadening in the focus of
policy attention, it becomes clear that there are
multiple forms of policy-relevant knowledge,
that are vital to understanding the issues and
the prospects for the success of policy inter-
ventions.
In this broader view, there is not one
evidence-base but several bases (Pawson et al.
2003; Schorr 2003; Davies 2004). These dis-
parate bodies of knowledge become multiple
sets of evidence that inform and influence pol-
icy rather than determine it. In this broader
understanding of policy-relevant knowledge,
the prestige and utility of ‘scientific evidence’,
validated by the standards of scientific method-
ology, remains a very significant input to pol-
icy development. Thus, rigorous and system-
atic research has great value, but needs to be
placed in a wider context. Hence, it is argued
that effective policy – its design, implementa-
tion, and evaluation – depends on several evi-
dentiary bases. These are all involved, directly
or indirectly, in the development and assess-
ment of ‘good programs’ and help us to un-
derstand ‘effectiveness’ in a more holistic and
networked policy environment.
The general milieu in which government
policies and programs operate is of course
the public sphere – of public debate, public
opinion, civic awareness and popular culture.
This milieu both informs and responds to pub-
lic policy, and colours the ways in which po-
sitions are argued and knowledge-claims are
advanced. However, evidence-based policy is
not primarily concerned with how the general
political culture shapes policy directions. In a
more particular sense, there are three impor-
tant kinds of knowledge (and corresponding
views of ‘evidence’) that are especially salient
for policy. These forms of knowledge arise
from:
- political know-how;
- rigorous scientific and technical analysis; and
- practical and professional field experience.
They provide three ‘lenses’ for policy anal-
ysis and three lenses for understanding
the evidence-base(s) of policy debate (see
Figure 1). They all work in their different ways
with particular interpretations of the constraints
and limitations of public opinion.
Three Lenses
1. Political knowledge for this purpose is the
know-how, analysis and judgement of politi-
cal actors. These analysing and judging activi-
ties include several vital elements relevant to
evidence-based policy – such as considering
and adjusting strategies or tactics; undertaking
agenda-setting; determining priorities; under-
taking persuasion and advocacy; communicat-
ing key messages and ideological spin; shap-
ing and responding to issues of accountability;
building coalitions of support; and of course
negotiating trade-offs and compromises. Mak-
ing contextual judgements about the possible
and the desirable is inherent in this form of
knowledge.
This ‘political’ form of knowledge inheres
primarily in politicians, parties, organised
groups, and the public affairs media. But al-
though some of the knowledge is private and
esoteric, most of it is also widely dispersed
in popular forms among the public and espe-
cially by and through the mass media. This
knowledge is diffuse, highly fluid, and heavily
contested owing to its partisan and adversar-
ial context. Policy, seen through the political
lens, is about persuasion and support rather than
about objective veracity.
Partisanship, and bias in knowledge, are not
solely confined to the political sphere – with
its characteristic polemics, debates, ideologi-
cal assertions and counter-claims. However the
implications of ‘political’ know-how for the
use of ‘evidence’ are very significant. Most
simply, a selection of convenient ‘facts’ may
be harnessed to an argument; and large areas
of other information are then either ignored,
dismissed as tainted, or otherwise deemed ir-
relevant. This partisan usage of evidence is of-
ten regarded as ‘typical’ political behaviour and
part of the ‘game’ of political argument. In
the political game, it is widely understood that
special pleading and deception are normalised.
Sometimes the partisan use of evidence is tac-
tical, casual or opportunistic; but sometimes it
is more systematically linked to a cohesive ide-
ological outlook, characterised by some com-
mentators as faith-based politics.
Importantly for my argument, there are some
areas of policy that become the subject-matter
of clear government commitments. These com-
mitments are no longer (if they ever were) open
to further debates about the nature of the prob-
lem, the best policy solution, and the range
of evidence relevant to assessing policy effec-
tiveness. This means some policy positions are
‘data-proof’ or ‘evidence-proof’, in the sense
that their evidence ‘base’ has been narrowed
and buttressed by political commitments, per-
haps closely linked to the values and ideological
positions of political leaders or parties. Some
policy preferences allow only certain kinds of
‘evidence’ to be noticed. Critical commentary
under these circumstances is unwelcome. Some
contentious problems may become defined in
‘official’ terms, in ways that tend to privilege
some evidence as relevant and to rule out other
evidence as irrelevant or merely ideological. In
this context, the ‘official’ framing of a prob-
lem is also crucial in regard to what research is
commissioned and its terms of reference. Rela-
tively few research and consultancy projects are
commissioned without some expectation that
the reports may assist in upholding a certain
viewpoint.
Figure 1. Three Lenses of Knowledge and Evidence
2. Scientific (research-based) knowledge is
the product of systematic analysis of current
and past conditions and trends, and analysis
of the causal inter-relationships that explain
conditions and trends. In relation to policy re-
view and program assessment, there is a range
of disciplinary and cross-disciplinary knowl-
edges (economics, law, sociology, public ad-
ministration, evaluation, etc) that make highly
useful contributions to policy and program un-
derstanding and improvement. There is seldom
any consensus among social scientists on the
nature of problems, the causes of trends or re-
lationships, and the best approach for solutions.
Various scientific disciplines may have differ-
ent methodological approaches, and may offer
complementary or sometimes competing per-
spectives on complex issues. It is perhaps not
surprising that inter-disciplinary approaches
have come to the fore in recent decades for ad-
dressing multi-layered social problems.
Scientific (research-based) forms of knowl-
edge primarily comprise the work of profes-
sionals trained in systematic approaches to
gathering and analysing information. A con-
cern with the quality and consistency of data is
fundamental to a scientific approach to analy-
sis. Nevertheless, methodological choices have
to be made. At one end of the spectrum in the
behavioural and applied social sciences, ‘sys-
tematic reviews’ apply rigorous standards to ex-
amine the state of current knowledge, giving
recognition only to those studies which clearly
focus on assessing the causal effects of specific
interventions. The so-called ‘gold standard’ for
a rigorous experimental approach – adopted in
the medical, biological and healthcare sciences
– entails the use of randomised controlled trials
(RCTs) to test the efficacy of specific interven-
tions (Cochrane Collaboration website). This
approach, often championed as the most rigor-
ous strategy for assessing ‘what works’ in a spe-
cific policy field, has also been promoted and
applied in some social program areas, includ-
ing criminology (Davies 2004; Petticrew and
Roberts 2005; Campbell Collaboration web-
site). Rigour is thus sometimes associated with
a preference for quantitative behavioural data,
although qualitative (attitudinal) data are in-
creasingly seen as central in helping to explain
the conditions and nature of behavioural change
(Davies 2004; Percy-Smith 2005). At the other
end of the spectrum, some methodologies asso-
ciated with a hermeneutic approach, including
a large proportion of ‘action-research’ projects,
tend to regard policy and program assessment
as more akin to iterative social learning projects
than to the experimental sciences.
3. Practical implementation knowledge, in
the present context of policy and program
effectiveness, is the ‘practical wisdom’ of
professionals in their ‘communities of prac-
tice’ (Wenger 1998) and the organisational
knowledge associated with managing program
implementation. These professional and man-
agerial communities are often segmented rather
than well connected. They operate within and
across the public sector, the private sector, and
the not-for-profit NGO sector. Relevant occu-
pational groupings include program delivery
managers, contract managers, enterprise man-
agers, and the diverse range of professionals
and para-professionals who are engaged in di-
rect service provision (Pawson et al. 2003) or
who provide support services linked into the
policy programs of government.
These managers and professionals wrestle
with everyday problems of program imple-
mentation and client service. Their roles may
require managing upwards, downwards, and
outwards to external stakeholders (O’Toole,
Meier and Nicholson-Crotty 2005). Their prac-
tical experience in delivery often tends to be
under-valued by the political and scientific sec-
tors. The sphere of ‘practice’ operates with
evolving bodies of knowledge that tend to be
specific to each professional niche. The train-
ing regimes for managers and professionals
are often linked to concepts of ‘best practice’
and to relevant research bases for assessing
‘effective’ practices. Their formal bodies of
knowledge evolve, and are subject to debate in
‘communities of learning’ (Wenger 1998). But
they also tend to become systematised and cod-
ified, and linked to standards and guidelines.
In large organisations, best-practice guidelines
may become overlaid with bureaucratic rules
and protocols.
While the pressure towards systematisation
is significant, and the search for technical solu-
tions (eg, build another IT system) is endemic,
some areas of practice are less than impressed
by the business and engineering models. The
professional ethos in human services makes
room for unique cases and for the meaning-
systems of clients (Schon 1983). This provides
the ‘mental space’ for creating organisational
climates that are more favourable to case-based
learning and more broadly the adoption of ‘or-
ganisational learning’ approaches. Social real-
ities are not seen as cut-and-dried and control-
lable, but as evolving challenges with unique
characteristics. Time is seen as a necessary in-
gredient of social and personal improvement.
However, the influence of this more open-ended
approach is severely tested whenever the politi-
cal system shifts into crisis response mode, with
a heightened demand for rapid responses, rig-
orous risk-management and standardisation.
Some Implications
It has been suggested above that there are three
broad types of knowledge and evidence that are
central to the design, implementation and eval-
uation of policies and programs. There is not
one evidence-base but several bases. These dis-
parate bodies of knowledge become multiple
sets of evidence that inform and influence pol-
icy rather than determine it. The three lenses
of policy-relevant knowledge comprise three
perspectives on useful and usable information.
Each of these types has its distinctive proto-
cols of knowledge, of expertise, of strategy, and
what counts as ‘evidence’, albeit it is also clear
that there are ongoing internal debates on such
matters within each knowledge area.
How do these three lenses and forms of
knowledge fit together? There is a considerable
case-study literature on ‘policy communities’
and ‘policy networks’ that may include partici-
pants from more than one institutional or indus-
try sector. There is also a considerable literature
in public administration concerning manage-
rial challenges and processes in various portfo-
lio areas. There is also a literature on specific
managerial knowledges and other professional
knowledges; but rather less on how the sphere
of management practice interacts with the other
spheres of policy choices by politicians and the
findings of systematic research. In fact, there
has been surprisingly little research directly
on the question of how the three clusters of
political, research and professional/managerial
knowledge interact. By contrast there has been
increasing attention to the bilateral relations
between public policy and the social sciences
(Davies, Nutley and Smith 2000; Stone 2001;
UKCSS 2003; Edwards 2004; Saunders and
Walter 2005).
Some useful research questions about the
three lenses might therefore include the follow-
ing:
- is there a low or high level of mutual awareness, recognition and understanding of each other's approaches?
- is there a substantial 'cultural divide' between these forms of knowledge (Shonkoff 2000), and if so, are there any useful mechanisms/incentives to promote working together more closely?
- in the jostling for salience or in the competition between these sets of ideas, does one form of knowledge typically 'trump' the others – eg, in the policy domain, does politics typically predominate over science and practice?
Answers to all these questions are likely to vary
greatly in different situations. From the view-
point of government, there is an expectation that
research findings will assist but not determine
policy directions and adjustments (UK Cabinet
Office 1999). There is a robust debate about
where and how the research contributions can
‘add most value’ to the policy process. For ex-
ample, is research most useful to government
processes at the commencement of the ‘policy
cycle’ (identifying and scoping a problem), or
in a later stage of options analysis (examining
different costs and impacts), or in the program
evaluation phase when effectiveness is being re-
considered? Researchers are themselves sometimes naïve about what kind of policy analysis will be seen as relevant, and about how to
communicate and package their research out-
comes most effectively for government officials
(Edwards 2004; UKCSS 2003).
Governments tend to have a strong expec-
tation that the managers and professionals
who deliver services and implement programs
will do so with technical skill and efficiency.
For those employed within the public sector,
effective implementation is the key consider-
ation, and the government’s control over pri-
orities and program design is very clear. For
those employed at arm’s-length from govern-
ment, efficient implementation is reinforced by
contractual relations and accountabilities for
service delivery. With a few powerful excep-
tions, communities of practice may feel dis-
enfranchised, especially when those who are
practical experts in delivery are not centrally in-
volved in early discussions about how programs
are designed and delivered. The political exec-
utives of government are not especially adept
at hearing and seeking out the voices of imple-
menters, especially those who suggest that the
program goals cannot be delivered because ei-
ther the framework is flawed or resources are
clearly inadequate. Nevertheless, some shar-
ing of perspectives occurs. For example, both
public sector managers and political leaders
succeed only by making astute practical
judgements about priorities, garnering support
for taking action, and persuading stakeholders
about trade-offs and preferred options.
Researchers may believe that problem-
definition and analysis should be closely
linked to the data assembled by systematic
research. However, in the realm of public pol-
icy development, governments are in the busi-
ness of framing issues and agendas. Strategic
policy work is conducted in the context of
debates about issues and agendas. Thus, pol-
icy decisions are not deduced in a neutral
and objective manner from empirical-analytical
work, but from politics, judgement and debate
(Majone 1989). Policy debate and analysis in-
volves interplay between facts, norms and de-
sired actions, in which ‘evidence’ is diverse
and contestable. Different stakeholders within
the business, NGO and government sectors are
likely to have divergent views on what is the
key problem. For governments, problems and issues become seen as worthy of investigation owing to a confluence of circumstances, such as:

• a perception of crisis or urgency;
• the role of political mandates and priorities;
• the role of expert judgement and advice (consultants, inquiries, etc);
• organisational and issue histories; or
• the changing context of social values and public opinion.
The definition and focus of problems are also
likely to be viewed differently through the three
lenses of research, professional practice and
government policy-making (and of course there
will be some debates within each of these).
Some policy arenas are more divergent and
strongly contested than others. Over time, as
Mulgan (2005) suggests, some fields may be-
come relatively stable and consensual; others
may be subject to ‘profound disagreements’;
and others again may be lacking a solid infor-
mation base or a track record of on-ground ex-
perience. The implications of these differences
in the nature of policy arenas are potentially
significant. Contentious (ie, unsettled and tur-
bulent) policy areas may tend to generate more
heat than light, as the terms of evidentiary de-
bate may be overwhelmed by partisan voices,
despite the best efforts of those who wish
to retain an ‘objective’ stance. Current policy
examples might include strategies to address
major emergent issues such as climate-change
responses; value-based issues at the intersec-
tion of bio-ethics and bio-technologies; and de-
bates about procedural and substantive ‘fair-
ness’ for workers and employers in industrial
relations reform. Researchers who seek to make
an objective contribution in such areas may risk
being harnessed to positions proposed by strong
advocates on one side of the debate and accord-
ingly disparaged by others.
Conclusions
The demands for efficient and effective govern-
ment have fostered the need for performance
information. This has provided leverage for
applied social research, concerned with pro-
gram evaluation, implementation effectiveness,
and new models for tackling complex issues
using new policy instruments and processes.
This has served to legitimate the general concept of evidence-based policy. However, there is a
large difference between a technical problem-
solving approach to knowledge, and a broader
relational and systemic approach to knowledge
that is located in multi-stakeholder networks.
This article has suggested there are three
main kinds of challenge to the rational mission
of ‘evidence-based’ policy. One arises from the
inherently political and value-based nature of
policy debate and decision-making. Policy de-
cisions are not deduced primarily from facts
and empirical models, but from politics, judge-
ment and debate. Policy domains are inherently
marked by the interplay of facts, norms and
desired actions. Some policy settings are data-
resistant owing to governmental commitments.
Secondly, information is perceived and used
in different ways, by actors looking through
different ‘lenses’. From this perspective, there
is more than one type of relevant ‘evidence’.
I have drawn attention to ‘three lenses’ that
are especially important, centred on political
know-how, systematic research, and profes-
sional practice. These perspectives all provide
important contributions to policy development,
but defensiveness and negativity are as com-
mon as cooperation. Although the context of
decision-making is dynamic and negotiated,
these key actors are anchored in institutional
settings that make shared perspectives difficult
to attain.
The third challenge to a rationalist concept of
evidence-based policy is that the complex mod-
ern arrangements of networks, partnerships and
collaborative governance are difficult to har-
ness to the traditional forms of knowledge
management, policy development and program
evaluation in the public sector (Agranoff and
McGuire 2003). Networks bring to the table a
diversity of lived experience and therefore a di-
versity of ‘evidence’ (relevant information, in-
terpretations, priorities, and perspectives), not
only about what works but also about what is
worthwhile and meaningful. The evidence-base
for understanding success factors for complex
policy design and implementation may need to
address the conditions under which innovation,
new thinking and new solutions may emerge in
a dynamic environment (Osborne and Brown
2005). The three-lenses approach suggests that
there may be importantly divergent perspec-
tives on whether and how to increase mutual
understanding and shared objectives.
Endnotes
1. The author is grateful for comments on these
ideas from a number of colleagues over sev-
eral years, including John Alford, Meredith Ed-
wards, Geoff Gallop, Richard Mulgan, John
Wanna, and the anonymous referees for AJPA,
none of whom bear any responsibility for re-
maining errors of argument and interpretation.
2. The historical roots of such an alliance be-
tween political power and systematic knowl-
edge reach back to the late Enlightenment,
whose leading thinkers sought to undermine
the traditional capacity of governments to rely
on appeals to precedent, authority and religious
values as the basis of government legitimacy
(Staum 1996; Head 1982; Berry 1997). Sys-
tematic social science later fortified the pro-
gressive impulses for social improvement un-
derlying New Liberalism, early social democ-
racy, and the foundation of institutions such as
the London School of Economics and Political
Science in 1895 (Lichtheim 2000: chapter 4).
References
6, P. 2002. ‘Can Policy Making be Evidence-Based?’
MCC: Building Knowledge for Integrated Care
10(1):3–8.
Agranoff, R. and M. McGuire. 2003. Collaborative
Public Management: New Strategies for Local
Governments. Washington DC: Georgetown Uni-
versity Press.
Bakvis, H. 2002. ‘Pulling Against Gravity? Horizon-
tal Management in the Canadian Federal Govern-
ment.’ In Knowledge, Networks and Joined-Up
Government, ed. M. Considine. IPSA Research
Committee Proceedings, Centre for Public Pol-
icy, Melbourne, 57–75.
Bardach, E. 1998. Getting Government Agencies to
Work Together. Washington DC: Brookings Insti-
tution Press.
Berry, C.J. 1997. The Social Theory of the Scottish Enlightenment. Edinburgh: Edinburgh University Press.
Booher, D.E. 2005. ‘Collaborative Governance Practices and Democracy.’ National Civic Review 93(4):32–46.
Bruner, C., L.G. Kunesh and R.A. Knuth. 1992. What
Does Research Say about Interagency Collabora-
tion? URL: <http://www.ncrel.org>.
Campbell Collaboration. About the Campbell Collaboration. URL: <http://www.campbellcollaboration.org/>.
Casey, J. 2004. ‘Third Sector Participation in the
Policy Process.’ Policy and Politics 32(2):241–
257.
Cochrane Collaboration. The Cochrane Collabora-
tion: The Reliable Source of Evidence in Health
Care. URL: <http://www.cochrane.org/>.
Commonwealth Foundation. 2004. Tri-Sector Dia-
logues: Synthesis of Dialogues on Partnership
Approaches to Governance. London: Common-
wealth Foundation, Citizens and Governance Pro-
gram.
Davies, P. 2004. Is Evidence-Based Policy Possible?
The Jerry Lee Lecture, Campbell Collaboration
Colloquium, Washington.
Davies, H.T., S.M. Nutley and P.C. Smith, eds. 2000.
What Works? Evidence-Based Policy and Prac-
tice in Public Services. Bristol: Policy Press.
Edwards, M. 2003. ‘Participatory Governance.’
Canberra Bulletin of Public Administration
107:1–6.
Edwards, M. 2004. Social Science Research and
Public Policy: Narrowing the Divide. Policy Pa-
per 2. Canberra: Academy of Social Sciences in
Australia.
Head, B.W. 1982. ‘The Origins of “La Science So-
ciale” in France 1770–1800.’ Australian Journal
of French Studies 19(2):115–132.
Head, B.W. 1999. ‘The Changing Role of the Public
Service: Improving Service Delivery.’ Canberra
Bulletin of Public Administration 94:1–3.
Head, B.W. 2005. ‘Governance.’ In Ideas and Influ-
ence: Social Science and Public Policy in Aus-
tralia, eds P. Saunders and J. Walter. Sydney:
UNSW Press, 44–63.
Head, B.W. 2007. ‘Community Engagement – Par-
ticipation on Whose Terms?’ Australian Journal
of Political Science 42(3):441–454.
Hemmati, M. 2002. Multi-Stakeholder Processes for
Governance and Sustainability. London: Earth-
scan.
Innes, J.E. and D.E. Booher. 1999. ‘Consensus
Building and Complex Adaptive Systems: A
Framework for Evaluating Collaborative Plan-
ning.’ Journal of the American Planning Asso-
ciation 65(4):412–423.
Innes, J.E. and D. Booher. 2003. The Impact of
Collaborative Planning on Governance Capac-
ity. Working Paper 2003/03, Institute of Urban
and Regional Development, University of Cali-
fornia, Berkeley.
Kernaghan, K. 1993. ‘Partnership and Public Ad-
ministration.’ Canadian Public Administration
36(1):57–76.
Kooiman, J. 2000. ‘Societal Governance.’ In Debat-
ing Governance: Authority, Steering and Democ-
racy, ed. J. Pierre. Oxford: Oxford University
Press, 138–164.
Lewicki, R.J., B. Gray and M. Elliott, eds. 2003.
Making Sense of Intractable Environmental Con-
flicts. Washington: Island Press.
Lichtheim, G. 2000. Europe in the Twentieth Cen-
tury. London: Phoenix Press.
Lovan, W.R., M. Murray and R. Shaffer, eds.
2004. Participatory Governance: Planning, Con-
flict Mediation and Public Decision-Making in
Civil Society. Aldershot: Ashgate.
Lowndes, V. and C. Skelcher. 1998. ‘The Dynam-
ics of Multi-Organisational Partnerships.’ Public
Administration 76(3):313–333.
Majone, G. 1989. Evidence, Argument and Persua-
sion in the Policy Process. New Haven: Yale Uni-
versity Press.
Mandell, M.P., ed. 2001. Getting Results through
Collaboration: Networks and Network Structures
for Public Policy and Management. Westport:
Quorum Books.
McLaughlin, K., S.P. Osborne and E. Ferlie, eds.
2002. New Public Management: Current Trends
and Future Prospects. London: Routledge.
Mulgan, G. 2005. The Academic and the Policy-
Maker. Presentation to Public Policy Unit, Oxford
University, 18 November.
Nutley, S., H. Davies and I. Walter. 2002.
Evidence-Based Policy and Practice: Cross Sec-
tor Lessons from the UK. Working Paper 9,
Research Unit for Research Utilisation, Univer-
sity of St Andrews.
Oakley, A., D. Gough, S. Oliver and J. Thomas.
2005. ‘The Politics of Evidence and Methodol-
ogy: Lessons from the EPPI-Centre.’ Evidence
and Policy 1(1):5–31.
Osborne, S.P., ed. 2000. Public-Private Partner-
ships. London: Routledge.
Osborne, S.P. and K. Brown. 2005. Managing
Change and Innovation in Public Service Organ-
isations. London: Routledge.
O’Toole, L.J., K.J. Meier and S. Nicholson-
Crotty. 2005. ‘Managing Upward, Downward and
Outward: Networks, Hierarchical Relationships
and Performance.’ Public Management Review
7(1):45–68.
Parsons, W. 2002. ‘From Muddling Through to Mud-
dling Up – Evidence Based Policy-Making and
the Modernisation of British Government.’ Pub-
lic Policy and Administration 17(3):43–60.
Parsons, W. 2004. ‘Not just Steering but Weaving:
Relevant Knowledge and the Craft of Building
Policy Capacity and Coherence.’ Australian Jour-
nal of Public Administration 63(1):43–57.
Pawson, R., A. Boaz, L. Grayson, A. Long and C.
Barnes. 2003. Types and Quality of Knowledge
in Social Care. Knowledge Review No. 3, So-
cial Care Institute for Excellence. Bristol: Policy
Press.
Percy-Smith, J. 2005. What Works in Strategic Part-
nerships for Children? Ilford: Barnardo’s.
Petticrew, M. and H. Roberts. 2005. Systematic Reviews in the Social Sciences. Oxford: Blackwell.
Pollitt, C. and G. Bouckaert. 2000. Public Manage-
ment Reform: A Comparative Analysis. Oxford:
Oxford University Press.
Reddel, T. and G. Woolcock. 2004. ‘From Consul-
tation to Participatory Governance?’ Australian
Journal of Public Administration 63(3):75–87.
Reid, F. 2003. Evidence-Based Policy: Where is the
Evidence for it? Working Paper 3, School for Pol-
icy Studies, University of Bristol.
Roberts, H. 2005. ‘What Works?’ Social Policy Journal of New Zealand 24:34–54.
Saunders, P. and J. Walter, eds. 2005. Ideas and In-
fluence: Social Science and Public Policy in Aus-
tralia. Sydney: UNSW Press.
Schon, D.A. 1983. The Reflective Practitioner: How
Professionals Think in Action. New York: Basic
Books.
Schon, D.A. and M. Rein. 1994. Frame Reflection:
Toward the Resolution of Intractable Policy Con-
troversies. New York: Basic Books.
Schorr, L.B. 1988. Within Our Reach: Breaking the
Cycle of Disadvantage. New York: Anchor Dou-
bleday.
Schorr, L.B. 2003. Determining ‘What Works’ in
Social Programs and Social Policies: Towards
a More Inclusive Knowledge Base. Washington, DC: Brookings Institution.
Shonkoff, J.P. 2000. ‘Science, Policy and Practice:
Three Cultures in Search of a Shared Mission.’
Child Development 71(1):181–187.
Staum, M.S. 1996. Minerva’s Message: Stabilizing
the French Revolution. Montreal and Kingston: McGill-Queen’s University Press.
Stone, D. 2001. Getting Research into Policy? Pa-
per for Global Development Network Conference,
Rio de Janeiro, December.
Stone, D. and A. Denham, eds. 2004. Think Tank Tra-
ditions: Policy Research and the Politics of Ideas.
Manchester: Manchester University Press.
Sullivan, H. and C. Skelcher. 2002. Working across
Boundaries: Collaboration in Public Services.
Houndmills UK: Palgrave Macmillan.
UK Cabinet Office. 1999. Professional Policy Mak-
ing for the Twenty First Century. London: Cabinet
Office.
UKCSS [UK Commission on the Social Sciences].
2003. Great Expectations: The Social Sciences in
Britain. London: Commission on Social Sciences.
Wenger, E. 1998. Communities of Practice: Learn-
ing, Meaning and Identity. Cambridge: Cam-
bridge University Press.
... As knowledge is produced and disseminated at a constantly accelerating speed, choosing the knowledge to be used as evidence is one key task of experts today (Holst and Molander, 2019;Maasen and Weingart, 2005;Moore, 2017;Sanderson, 2006;Stanziola, 2012). Head (2008) categorizes knowledge in policymaking into three types: political, scientific, and practical implementation knowledge. Political knowledge involves the expertise and judgement of political actors, while scientific knowledge encompasses academic research and systematic information gathering. ...
... Practical implementation knowledge, also known as tacit knowledge, is also emphasized by Holst and Molander (2019). Each type offers a distinct perspective on evidence and policymaking (Head, 2008). This suggests that policymaking cannot strictly be based on evidence, which would involve a systematic process from identifying the policy problem to implementing evidence into practice (Sackett, 2000). ...
... Several researchers have pointed out that policymaking is a complex process influenced by factual information, societal norms, and existing commitments (Cairney, 2016;Head, 2008;Parkhurst, 2017). No matter what evidence is provided and whether the decision-making process is claimed to be evidence-based orinformed, governments often have made commitments (Head, 2008), in which case "evidence" is used to legitimize these existing commitments. ...
Thesis
Full-text available
This dissertation focuses on the use of evidence and expertise in the recent school reforms in the Nordic countries, and the co-creation of global policy space and national policy contexts through the social interactions and networks of knowledge and experts. The study positions itself at the intersection of several disciplines, including education, sociology, geography, and political sciences. Drawing inspiration from previous research in education policy transfer, globalization studies, spatiotemporal theories, and assemblage thinking, this research sheds light on the interwoven and assemblage nature of the national, Nordic, and international policy contexts. A novel approach to data analysis (both quantitative bibliometric data and qualitative interview data) is employed to trace moments when the global dimension surfaces within the national context. With this reading this dissertation explores instances where, in processes of globalization, global evidence produced especially by the international organizations influences national policy formation or where national actors engage in shaping global policy within the realms of national, regional, and global policy creation spaces. In essence, the findings of this dissertation underscore that globalization is not merely an overarching force superseding national autonomy and actors but, rather, a process that transforms national actors into accomplices of global forces through the amalgamation of evidence, expertise, and educational reforms.
... Pragmatism can arguably fit the bill. Although it has been described in different ways, it is perhaps most widely known by its layman notion highlighting its concern with results, commonly regarded as "what works" (Head 2008). Philosophically, however, it is a rich tradition, particularly influential academically as well as practically in the first half of the 20th century (Menand 1997). ...
Article
Full-text available
Qualitative Comparative Analysis (QCA) is a set‐theoretic approach that enables accounting for causal complexity, incorporating features of qualitative and quantitative methods. Pioneered in sociology, it is increasingly used in comparative research to explain social phenomena in various disciplines. Despite its growing adoption, and because of its eclectic nature, some methodological themes have proven elusive. Its philosophy of science is one such issue. Philosophical pragmatism, which focuses on action, rather than the mind as rationalism or things as empiricism, is proposed to fill this void. Moving beyond the positivist‐interpretivist dichotomy, this paper argues that pragmatism's philosophical wagers support QCA's assumptions and practice. Moreover, pragmatism's implications in terms of practical consequences, provisionality of insights, plurality of method, and public participation provide fertile soil on which QCA‐based inquiry can sow, producing pragmatic comparative research.
... The phenomenon of data-centric comparisons and representations have caused intricate ways in which universities and administrators respond to quality assurance and, in the meantime, for improved performance data . This logic is further strengthened in evidence-informed policymaking (Head, 2008), where numbers provide objective evidence to legitimise policy decisions. This active governing approach constrains administrators' work, focusing it predominantly on achieving specified, typically numeric, goals, which are primarily driven by a desire for increased productive data. ...
Article
Full-text available
The rapid expansion of transnational higher education (TNHE) has become a prominent feature of the global higher education field. However, research investigating the quantity-driven approach in TNHE remains limited. This study investigates dual dynamics of quantitative growth and qualitative enhancement within Chinese TNHE, focusing on the role of data governance. Drawing on 15 semi-structured interviews with administrators from the international departments of four Chinese universities, this research illuminates how data-driven policies shape decisions at the intersection of quantitative expansion and qualitative enhancement. The findings indicate that administrators’ efforts are primarily driven by the pursuit of numerical targets established by data-centric policies. Such governance emphasises the instructive significance of data, leading administrators to recognise and utilise data as a key tool in decision-making processes and prioritise over quality. This study critiques the misrepresentation of the quality and effectiveness of TNHE programs, leading to a misrecognition where data metrics dictate educational value.
... These National Large-Scale Standardized Assessments are crafted to generate comprehensive data on educational systems, encompassing student performance, teacher effectiveness, and other relevant metrics (Chudowsky & Pellegrino, 2003). Their primary purpose is to inform and shape policy formulation and decision-making processes (Head, 2008). Over the years, large-scale assessments have evolved into pivotal tools for evidencebased policymaking. ...
Conference Paper
Full-text available
Abstract Large-scale standardised assessments can exclude certain groups of students at various stages, from the test’s design to the analysis of its results. Students with disabilities, for example, may find it challenging to access the test due to the design or technological barriers, leading to their exclusion (Hickey, 2015). Additionally, biased or stereotyped analyses, such as the belief that girls are inherently worse at math, can exclude certain students (Villani & Carbone, 2020). The impact of these exclusion mechanisms can have significant implications for students’ learning paths and the entire school system. Over the past twenty years, there have been efforts to make large-scale standardised assessments more inclusive (Cawthon & Shyyan, 2022; Hickey, 2015). According to Cawthon and Shyyan (2022), strategies for addressing assessment accessibility aim to balance equity in testing experiences while maintaining standardised test designs. However, this balance raises questions about what is measured and the various methods available to demonstrate an understanding of that construct.
Chapter
Why did evidence-informed policy reform of public schools lead to policy conflict and failure? This chapter first presents a government politics perspective on evidence-informed policy. It is argued that ambitions of more evidence in public school policy promoted the priority of evidence in the policy area. Policy capacity was enhanced in the 2000s and early 2010s inspired by international trends and in response to government ambitions of addressing performance problems through the increased reliance on evidence and performance information. In a 2013 public school reform in Denmark, however, barriers to collecting evidence in combination with a reform of teachers’ working conditions triggered conflict and polarisation. The chapter illustrates the difficulties which can arise in political knowledge management of evidence-informed policies, the powerful dynamics of policy conflict and the risk of taking bold political action in a policy area with organised opposition from key stakeholders. The chapter ends by considering how the Ministry of Children and Education has attempted to limit the damage caused by policy conflict.
Chapter
How and why did the evidence-based management of active labour market policies shape the practical knowledge of public professionals and policy outcomes? The chapter first considers the tension between a central wish to increase policy performance by imposing evidence and the reactions of public service professionals responsible for implementing policies in practice. On this basis, it highlights the role of Ministry of Employment in accumulating evidence and in performing an evidence-based management of active labour market policy. Unlike the case of public school policy, the level of policy conflict remained low, and reactions from public professionals were limited. The chapter shows the complex relationship between central ministries and local public professionals as well as the challenges involved in achieving desired policy outcomes.
Chapter
How and why does the use of evidence in different government ministries affect policy success and failure? The concluding chapter summarises lessons from the three preceding empirical chapters to address the main research question of the book. The chapter argues that each type of evidence in policy relates to scientific, political, practical and symbolic knowledge and thereby elaborates the typology of evidence utilisation in government ministries presented in Chap. 1. On this basis, the chapter considers the prospects and challenges of government ministries to use evidence for taming policy problems in light of the prospects of using artificial intelligence. Understanding the interaction with other types of knowledge and the consequences of high-speed information allows a nuanced discussion of the gains and challenges associated with using evidence in public policy in the future.
Chapter
This book discusses how and why different ways of utilising evidence in government ministries affect policy success and failure. Chapter 1 focuses on the key roles that government ministries play in formulating and implementing policy and influencing the use of evidence in policy-making. The chapter first presents the research gaps and questions explored in the book. Then, arguments for why evidence-based policy remains an appealing ideal for political decision-makers are presented. It is then explained why hopes of evidence-based policy might be disappointed lending to moderators of evidence uptake in policy. Based on distinctions between selective and comprehensive evidence input and variation in formal policy decisions and informal policy management, a typology of evidence in policy is presented. The typology allows the book to define different types of evidence utilisation, which guide the focus of the following empirical chapters. The chapter concludes by considering the Danish context and methods applied to study the use and effects of evidence in government ministries based on three case studies focused on public budgeting policy, public school policy and active labour market policy.
Article
Full-text available
Background Low-quality care for low back pain (LBP) is pervasive in Australia. Drivers of low-quality care have been identified elsewhere and include misconceptions about LBP, vested interests and limited funding for evidence-based interventions. Yet, the literature that identified such drivers is not specific to the Australian context, and therefore, it is likely to represent only part of the local problem. This study aimed to determine where the most influential drivers of LBP care are in the Australian healthcare system and what could be done to address them. Methods Clinical leaders from various disciplines, academics, hospital managers, policy-makers, consumers involved in LBP advocacy, board members of relevant health profession boards and private insurers were invited to participate in one-on-one interviews. Interviews were transcribed verbatim. Interview data were analysed using content analysis. Results We interviewed 37 stakeholders. Challenges that hinder LBP care in Australia included variability in care and inconsistent messages, funding models that are not supportive of appropriate care for LBP, the community’s understanding of LBP, vested interests and commercial forces, difficulties in accessing timely and affordable conservative care, neglect of social determinants and health inequities, short consultations, siloed practices, uncertainties that stem from gaps in evidence and the experience of having LBP, individual and contextual variability, the mismatch between evidence and practice, the Australian healthcare system itself, the lack of political will and acknowledgement of LBP as a public health issue, stigma, the need to improve human aspects and the compensation system. 
When discussing factors that could improve LBP care, participants raised collaboration, changes in funding, improvement of access to – and affordability of – models of care and care pathways, public health campaigns targeting LBP, enhancement of policy and governance, increasing and better training the workforce, consideration of inequities, making improvements in information sharing and reforming the worker’s compensation sector. Conclusions LBP is a wicked problem, influenced by several systemic factors. An agenda for system change in the LBP landscape should be guided by a collaborative, coherent and integrated approach across sectors to enhance quality of care and system efficiency for those who seek and provide care.
Article
Full-text available
Political, scientific-administrative, as well as practical knowledge, are important for evidence-based policies in the public service. However, empirically, these forms of knowledge have mainly been studied independently, highlighting the need to better understand their interaction in welfare policy. On this basis, the article focuses on understanding the role of practical knowledge in evidence-based welfare policies, using two case studies. Empirically, active employment and public school policy in Denmark are studied as examples of welfare policy during the period from 2010 to 2022, based on documents and interviews with key policy actors. Based on the case studies, a three-stage model of the role of practical knowledge in evidence-based welfare reform is developed. The model illustrates that the role of practical knowledge changes at different stages of evidence-based policy. Public professionals may both rely on practical knowledge when implementing policy in response to evidence-based management and use it when acting as policy actors to re-politicise evidence-based policies.
Chapter
Leading scholars in the field of governance examine the effectiveness of the different non-institutional strategies at the disposal of modern governments in tackling issues of urban decline, public administrations, governmental regionalization, budget deficits and global economics. The governance approach to political science yields a new perspective on the role of the state, domestically as well as in the international arena. Globalization, internationalization, and the growing influence of networks in domestic politics means that the notions of state strength and the role of the state in society must be re-examined.
Chapter
The Cambridge Companion to the Scottish Enlightenment, edited by Alexander Broadie, September 2019
Article
Collaboration between governments, business, and the voluntary and community sectors is now central to the way public policy is made, managed and delivered. This book provides the first comprehensive and authoritative account of the theory, policy and practice of collaboration. Written by two leading authorities in the field, the book explores the experience of collaboration in regeneration, health and other policy sectors, and assesses the consequences of the emergence of public-private partnerships, contrasting the UK experience with that elsewhere in the world.
Book
A leading MIT social scientist and consultant examines five professions (engineering, architecture, management, psychotherapy, and town planning) to show how professionals really go about solving problems.
Chapter
For a few years now, governance as a concept has been a catchword in many corners of the social science disciplines. Apparently there is a need for such a concept, although a bandwagon effect cannot be denied either. In this chapter I advance ideas presented in Modern Governance: government-society interactions (Kooiman 1993). In that book, attention was drawn to recent developments in those interactions with a 'co-', public-private character, set against a 'go-it-alone' government perspective. In the past few years the literature on 'governance' has almost 'exploded' in many different areas and disciplines, and my own explorations in this field have also continued (Kooiman 1999). The present chapter presents the main outlines of some aspects of this continued effort: the question of different modes and orders of governance as patterns of societal governance. While Modern Governance was still strongly 'government-oriented', this chapter broadens the perspective in the sense that it looks at governance as societal, with public as well as private 'governors' participating. While their roles may differ between societal levels and from sector to sector, the essence of the argument is that the governance of modern societies is a mix of all kinds of governing activities and structures, conceptualized in this chapter as modes and orders. These mixes can be seen as 'answers' of those societies to changing governing demands. While this point of view is partly in line with other recent theorizing on governance, it also differs in important aspects (Rhodes 1997). The approach pursued here may be phrased in the form of a working definition, the elements of which will be made clear in the chapter itself.
Social-political or interactive governing will be considered to be arrangements in which public as well as private actors aim at solving societal problems or creating societal opportunities, at caring for the societal institutions within which these governing activities take place, and at phrasing the principles according to which these activities are carried out. The term governance denotes conceptual or theoretical ideas about such governing activities.
Chapter
This chapter introduces evidence-based policy and practice in public services and discusses the themes covered by the book. It presumes that there are two broad users of evidence: policy makers and practitioners. The book is organised first by exploring the role of evidence in different models of the policy process, and then by focusing on the role of evidence in specific public policy areas. The second half of the book then picks up some of the recurrent themes to explore cross-sectoral issues of evidence generation, before moving on to consider evidence implementation. The turn of the century has seen evidence embedded in the political and policy rhetoric of the day, and infused in the newly transformed professional ethic of many service professionals. Bringing such diverse accounts from across the public sector together under a coherent framework, the chapter suggests, will offer new insights and scope for cross-sectoral learning.
Book
Such diverse thinkers as Lao-Tze, Confucius, and U.S. Defense Secretary Donald Rumsfeld have all pointed out that we need to be able to tell the difference between real and assumed knowledge. The systematic review is a scientific tool that can help with this difficult task. It can help, for example, with appraising, summarising, and communicating the results and implications of otherwise unmanageable quantities of data. This book, written by two highly respected social scientists, provides an overview of systematic literature review methods: it outlines the rationale and methods of systematic reviews, gives worked examples from social science and other fields, and applies the practice to all social science disciplines. It requires no previous knowledge, taking the reader through the process stage by stage and drawing on examples from such diverse fields as psychology, criminology, education, transport, social welfare, public health, and housing and urban policy. It includes detailed sections on assessing the quality of both quantitative and qualitative research; searching for evidence in the social sciences; meta-analytic and other methods of evidence synthesis; publication bias; heterogeneity; and approaches to dissemination.
Article
This book challenges the common assumption that policy analysts engage in a purely objective technical assessment of policy alternatives. It argues that what analysts really do is produce policy arguments that are based on value judgements and are used by policymakers in the course of public debate.