The Australian Journal of Public Administration, vol. 67, no. 1, pp. 1–11 doi:10.1111/j.1467-8500.2007.00564.x
RESEARCH AND EVALUATION
Three Lenses of Evidence-Based Policy
Brian W. Head1
University of Queensland
This article discusses recent trends to incorporate the results of systematic research (or ‘evi-
dence’) into policy development, program evaluation and program improvement. This process
is consistent with the New Public Management (NPM) emphasis on efficiency and effectiveness. Analysis of evidence helps to answer the questions 'what works?' and 'what happens
if we change these settings?’ Secondly, some of the well known challenges and limitations
for ‘evidence-based’ policy are outlined. Policy decisions emerge from politics, judgement
and debate, rather than being deduced from empirical analysis. Policy debate and analysis
involves an interplay between facts, norms and desired actions, in which ‘evidence’ is diverse
and contestable. Thirdly, the article outlines a distinction between technical and negotiated
approaches to problem-solving. The latter is a prominent feature of policy domains rich in
‘network’ approaches, partnering and community engagement. Networks and partnerships
bring to the negotiation table a diversity of stakeholder ‘evidence’, ie, relevant information,
interpretations and priorities. Finally, it is suggested that three types of evidence/perspective
are especially relevant in the modern era – systematic (‘scientific’) research, program man-
agement experience (‘practice’), and political judgement. What works for program clients is
intrinsically connected to what works for managers and for political leaders. Thus, the prac-
tical craft of policy development and adjustment involves ‘weaving’ strands of information
and values as seen through the lens of these three key stakeholder groups. There is not one
evidence-base but several bases. These disparate bodies of knowledge become multiple sets
of evidence that inform and influence policy rather than determine it.
Key words: evidence-based policy, policy development, performance, program management
In many of the mature democracies, the recent
ground-swell of interest in ‘evidence-based
policy’, on the part of both government offi-
cials and social researchers, represents both an
opportunity and a challenge. For public man-
agers and political leaders, the opportunity is
apparent for continuous improvement in policy
settings and program performance, on the basis
of rational evaluation and well-informed debate
of options. The prospect of mutual benefits for
managers, researchers and citizens is alluring.
This is the modern promise of evidence-based policy improvement, although the attempt to
link the social sciences and public policy has a
much older lineage in the history of progressive
reform movements.2 The social sciences and
public decision-makers have not always had
close and cordial relations; indeed, there has
been a history of mutual distrust between these
sectors during the last two centuries. However,
scientific and technical knowledge has been
greatly prized in the evolution of the modern
state, initially because of its links to economic
growth and national defence, and later to ad-
dress the aspirations for social improvement by
the citizens. Social sciences have been valued
for their contribution to understanding and in-
fluencing social development and well-being.
Democratic decision-makers have increasingly
aspired to anchor many of their social reform
programs in the ‘relevant’ and ‘usable’ knowl-
edge provided by the social sciences.
© 2008 The Author. Journal compilation © 2008 National Council of the Institute of Public Administration Australia. Australian Journal of Public Administration, March 2008.
The evidence-based movement in modern
public policy is thus the latest version of the
search for usable and relevant knowledge to
help address and resolve problems. My argu-
ment is that this is linked to the modern em-
phasis on rational problem-solving, with its fo-
cus on accurate diagnosis and knowledge of
causal linkages. It is also congruent with impor-
tant modern strategic concerns with risk anal-
ysis and appropriate mitigation responses. In
the more technocratic version of the evidence-
based approach, the aspiration is to produce the
knowledge required for fine-tuning programs
and constructing guidelines and ‘tool-kits’ for
dealing with known problems. Hence, the cur-
rently famous phrase that defines much of the
movement – ‘what works?’ (Roberts 2005).
In the context of public policy, governments
remain the major investors and users of ap-
plied social sciences. They do not simply re-
ceive, scan and utilise research; they engage on
many levels to influence the processes and the
products. Their direct methods of shaping the
applied sciences include:
• investment in government-funded research units on specific problems;
• managing the policy-research functions inside many government agencies; and
• commissioning external consultants to undertake specific contract research (Saunders and Walter 2005).
Governments also exercise strong indirect in-
fluence through:
• determining national priority areas (eg, for allocation of competitive public funding of research);
• providing rewards and recognition for commercially-focussed knowledge and technical forms of scientific excellence; and
• encouraging contestability in some policy arenas by diversifying their sources of advice, including think-tanks and contractors (Stone and Denham 2004); and encouraging a wider range of instruments to deal with policy challenges, like market-based mechanisms and de-regulatory options.
Major investment in the applied social sciences
is part of a cycle of producing, analysing, man-
aging and reinvesting in the ‘bank’ of use-
ful knowledge. Large data sets are system-
atically collected. Evidence-based approaches
claim to fill important gaps in the value chain
as data is transformed into information and us-
able knowledge. Large organisations have in-
troduced ‘knowledge management’ strategies
to address the complex tasks of collection,
analysis and dissemination. The management
information (decision-support) systems on the
desktop of senior managers are hungry for
information to underpin performance indica-
tors and monitor program or business trends.
Some disciplines are seen as more valuable than
others for these purposes, eg, the quantitative
precision of financial accounting, cost/benefit
analysis, risk auditing, and health eco-
nomics may be more credible than the
hermeneutic approaches of history and cultural
sociology.
In short, the rise and promotion of ‘evidence-
based’ orientations within government agen-
cies is consistent with the public sector’s
increased interest in efficiency and effective-
ness. Evidence-based policy is believed to pro-
vide great assistance in answering some of
the key questions of New Public Management
(NPM):
• what options will 'deliver the goods'?
• how can programs be improved to get greater 'value for money'?
• how can innovation and competition be expanded to drive productivity?
• how can program managers achieve specific 'outcomes' for clients and stakeholders (rather than just 'manage programs')? and
• in summary, 'what works?' (Davies, Nutley and Smith 2000; 6 2002; Reid 2003)
Context: Evidence for Addressing Complex
Policy Problems
In the 1970s and 1980s the exponents of
NPM managerialism used innovative analytical
frameworks to tackle traditional problems and
to improve program performance information
in each portfolio area. During this period, the
de-regulation of many policy domains and out-
sourcing of services by the state was linked to
the key focus on program performance issues,
although this strategy posed some risk to the
steering capabilities in some public sector agen-
cies (Pollitt and Bouckaert 2000; McLaugh-
lin, Osborne and Ferlie 2002). However, by
the 1990s there was a significant political shift
towards tackling difficult interlinked issues,
exemplified by the UK government’s champi-
oning of evidence-based policy as a foundation
for joined-up government (UK Cabinet Office
1999; Parsons 2002, 2004). This period saw an
increased investment in central units for policy
analysis and commissioning evidence-based
consultancy reports. Dealing more directly with
major complex issues required taking a more
comprehensive approach towards policy design
and service delivery (Schorr 1988).
The 1990s saw the rise of policy processes
that were potentially less technocratic and more
open to ‘network’ approaches. In essence, this
meant that the new managerialist approaches
were often supplemented by new mechanisms
and process loops, variously described as com-
munity engagement, multi-stakeholder consul-
tation, and partnering across stakeholder sec-
tors (Kernaghan 1993; Kooiman 2000; Osborne
2000; Edwards 2003; Casey 2004; Head 2007).
Greater levels of cooperation and partnership
among governments, non-governmental organ-
isations (NGOs) and business, became asso-
ciated with a widespread rhetoric promoting
‘collaboration’, ‘joined-up’ services, and mul-
tiple ‘networks’ linking stakeholders and sec-
tors (Bruner, Kunesh and Knuth 1992; Bardach
1998; Head 1999; Mandell 2001; Bakvis
2002; Sullivan and Skelcher 2002; Reddel and
Woolcock 2004).
Political leaders and public managers in-
creasingly moved to tackle complex unresolved
problems, in response to the demands and pres-
sures of citizens for whom services remained
inadequate, piecemeal or inappropriate. The
frustration felt by managers in public agencies
regarding the poor rate of return on major social
program investments led them to search more
widely for new approaches. These new direc-
tions were facilitated by the willingness of ma-
jor NGOs, professions and other stakeholders
to engage in partnerships or collaborative ap-
proaches to addressing major issues. Endemic
social problems seemed to persist regardless of
the massive funding directed towards their alle-
viation. Performance information showed that
results were not being achieved. What could be
done to deal better with domestic poverty; poor
educational attainment; juvenile crime and re-
cidivism; drug and alcohol abuse; preventable
diseases; the appalling conditions facing many
indigenous communities; and the systemic dis-
advantages suffered by peoples in developing
countries?
This awakening of political interest created
opportunities for the behavioural and applied
social sciences to offer solutions, in the form
of new approaches to gaining greater control
over fuzzy and messy realities. Incremental
adjustment around ‘business as usual’ would
no longer suffice. Solutions based on old-
fashioned ideological recipes became much
less persuasive. New more integrated ap-
proaches to policy interventions were warmly
welcomed in some agencies.
From the viewpoint of performance-based
programs and evidence-based policy, the ques-
tion arose as to what kind of investment in
data/information would be needed to generate
the necessary knowledge, both to understand
complex problems and then to create viable so-
lutions. This period saw a major increase in use
of new technologies for data-gathering and data
analysis in order to measure the nature and ex-
tent of problems, assess the current impacts of
service systems, and provide benchmarks for
judging future performance.
However, some social scientists and policy
analysts began to question whether the per-
sistence of complex social problems was re-
ally attributable largely to a lack of informa-
tion, ie, ‘gaps’ in the database (Schon 1983;
Schon and Rein 1994). They suggested that
obtaining more data to fill the known gaps
would not necessarily get us onto the highway
toward good policy solutions, because much
of the policy puzzle is about reconciling dif-
ferent value perspectives. Continuing to in-
vest in building information banks for social
scientists and decision-makers would still re-
main important. But what kinds of information
would be of most value for stakeholders and
decision-makers dealing with the challenges
of complex inter-related problems with many
stakeholders?
‘Evidence’ Revisited: Types of Problems,
Types of Knowledge
The problems addressed by policy-makers are
many and varied. At one end of the spectrum,
problems may be seen as discrete, bounded,
and linked to particular sets of information and
actors. In such policy arenas, a ‘technical’ ap-
proach to problem-solving by relatively narrow
circles of actors may be dominant. This tech-
nocratic approach may be seen as sufficient to
meet the requirements of efficiency and effec-
tiveness by those involved. Technical expertise
is important in most policy areas, but its dom-
inant position in many areas is increasingly
contested. For example, many policy areas in
which scientific and engineering expertise had
achieved substantial control (eg, transportation,
energy, water supply) have now become subject
to intense debate and uncertainty.
At the other end of the spectrum, problems
may be seen as complex, inter-linked and cross-
cutting. Simple technical solutions by experts
are unavailable or unworkable. In these cir-
cumstances, a ‘negotiated’ and ‘relational’ ap-
proach to problem-solving may emerge (Innes
and Booher 1999; Hemmati 2002; Lewicki,
Gray and Elliott 2003; Lovan, Murray and Shaf-
fer 2004). The latter approach is a prominent
feature of policy domains that are rich in ‘net-
work’ approaches, partnering and community
engagement. It is argued here that a techni-
cal problem-solving approach to knowledge in
each discrete policy area is increasingly inad-
equate. Policy development arrangements are
experimenting with broader relational and sys-
temic approaches. Networks and partnerships
bring to the negotiation table a diversity of
stakeholder ‘evidence’, ie, relevant informa-
tion, interpretations and priorities. The argu-
ment is that addressing complex inter-linked
problems requires a strong emphasis on the so-
cial relations and stakeholder perceptions in-
herent in policy direction and program systems.
This has implications for how we think about
problems, relevant knowledge, policy and
program design, implementation, and evalua-
tion. In short, our ideas about ‘evidence-based’
policy may change character as we move from
a technical approach towards a more relational
approach.
Traditionally, the ‘evidence’ base seen as
the foundation for evidence-based policy is
the knowledge generated by applied research,
whether undertaken inside or outside of gov-
ernment agencies. This includes the general ev-
idence about broad trends and explanations of
social and organisational phenomena, as well
as specific evidence generated through per-
formance indicators and program evaluations
(Nutley, Davies and Walter 2002; Oakley et al.
2005). However, my argument is that the effec-
tiveness (success) of policies and programs is
not just a matter for applied social-science re-
search (including program evaluation reports).
As we come to a fuller appreciation of the com-
plexities of modern inter-dependent problems,
with a corresponding broadening in the focus of
policy attention, it becomes clear that there are
multiple forms of policy-relevant knowledge
that are vital to understanding the issues and
the prospects for the success of policy inter-
ventions.
In this broader view, there is not one
evidence-base but several bases (Pawson et al.
2003; Schorr 2003; Davies 2004). These dis-
parate bodies of knowledge become multiple
sets of evidence that inform and influence pol-
icy rather than determine it. In this broader
understanding of policy-relevant knowledge,
the prestige and utility of ‘scientific evidence’,
validated by the standards of scientific method-
ology, remains a very significant input to pol-
icy development. Thus, rigorous and system-
atic research has great value, but needs to be
placed in a wider context. Hence, it is argued
that effective policy – its design, implementa-
tion, and evaluation – depends on several evi-
dentiary bases. These are all involved, directly
or indirectly, in the development and assess-
ment of ‘good programs’ and help us to un-
derstand ‘effectiveness’ in a more holistic and
networked policy environment.
The general milieu in which government
policies and programs operate is of course
the public sphere – of public debate, public
opinion, civic awareness and popular culture.
This milieu both informs and responds to pub-
lic policy, and colours the ways in which po-
sitions are argued and knowledge-claims are
advanced. However, evidence-based policy is
not primarily concerned with how the general
political culture shapes policy directions. In a
more particular sense, there are three impor-
tant kinds of knowledge (and corresponding
views of ‘evidence’) that are especially salient
for policy. These forms of knowledge arise
from:
• political know-how;
• rigorous scientific and technical analysis; and
• practical and professional field experience.
They provide three ‘lenses’ for policy anal-
ysis and three lenses for understanding
the evidence-base(s) of policy debate (see
Figure 1). They all work in their different ways
with particular interpretations of the constraints
and limitations of public opinion.
Three Lenses
1. Political knowledge for this purpose is the
know-how, analysis and judgement of politi-
cal actors. These analysing and judging activi-
ties include several vital elements relevant to
evidence-based policy – such as considering
and adjusting strategies or tactics; undertaking
agenda-setting; determining priorities; under-
taking persuasion and advocacy; communicat-
ing key messages and ideological spin; shap-
ing and responding to issues of accountability;
building coalitions of support; and of course
negotiating trade-offs and compromises. Making contextual judgements about the possible and the desirable is inherent in this form of knowledge.
This ‘political’ form of knowledge inheres
primarily in politicians, parties, organised
groups, and the public affairs media. But al-
though some of the knowledge is private and
esoteric, most of it is also widely dispersed
in popular forms among the public and espe-
cially by and through the mass media. This
knowledge is diffuse, highly fluid, and heavily
contested owing to its partisan and adversar-
ial context. Policy, seen through the political
lens, is about persuasion and support rather than
about objective veracity.
Partisanship, and bias in knowledge, are not
solely confined to the political sphere – with
its characteristic polemics, debates, ideological assertions and counter-claims. However, the
implications of ‘political’ know-how for the
use of ‘evidence’ are very significant. Most
simply, a selection of convenient ‘facts’ may
be harnessed to an argument; and large areas
of other information are then either ignored,
dismissed as tainted, or otherwise deemed ir-
relevant. This partisan usage of evidence is of-
ten regarded as ‘typical’ political behaviour and
part of the ‘game’ of political argument. In
the political game, it is widely understood that
special pleading and deception are normalised.
Sometimes the partisan use of evidence is tac-
tical, casual or opportunistic; but sometimes it
is more systematically linked to a cohesive ide-
ological outlook, characterised by some com-
mentators as faith-based politics.
Importantly for my argument, there are some
areas of policy that become the subject-matter
of clear government commitments. These com-
mitments are no longer (if they ever were) open
to further debates about the nature of the prob-
lem, the best policy solution, and the range
of evidence relevant to assessing policy effec-
tiveness. This means some policy positions are
‘data-proof’ or ‘evidence-proof’, in the sense
that their evidence ‘base’ has been narrowed
and buttressed by political commitments, per-
haps closely linked to the values and ideological
positions of political leaders or parties. Some
policy preferences allow only certain kinds of
‘evidence’ to be noticed. Critical commentary
under these circumstances is unwelcome. Some
contentious problems may become defined in
‘official’ terms, in ways that tend to privilege
some evidence as relevant and to rule out other
evidence as irrelevant or merely ideological. In
this context, the ‘official’ framing of a prob-
lem is also crucial in regard to what research is
commissioned and its terms of reference. Rela-
tively few research and consultancy projects are
commissioned without some expectation that
the reports may assist in upholding a certain
viewpoint.
Figure 1. Three Lenses of Knowledge and Evidence
2. Scientific (research-based) knowledge is the product of systematic analysis of current
and past conditions and trends, and analysis
of the causal inter-relationships that explain
conditions and trends. In relation to policy re-
view and program assessment, there is a range
of disciplinary and cross-disciplinary knowl-
edges (economics, law, sociology, public ad-
ministration, evaluation, etc) that make highly
useful contributions to policy and program un-
derstanding and improvement. There is seldom
any consensus among social scientists on the
nature of problems, the causes of trends or re-
lationships, and the best approach for solutions.
Various scientific disciplines may have differ-
ent methodological approaches, and may offer
complementary or sometimes competing per-
spectives on complex issues. It is perhaps not
surprising that inter-disciplinary approaches
have come to the fore in recent decades for ad-
dressing multi-layered social problems.
Scientific (research-based) forms of knowl-
edge primarily comprise the work of profes-
sionals trained in systematic approaches to
gathering and analysing information. A con-
cern with the quality and consistency of data is
fundamental to a scientific approach to analy-
sis. Nevertheless, methodological choices have
to be made. At one end of the spectrum in the
behavioural and applied social sciences, ‘sys-
tematic reviews’ apply rigorous standards to ex-
amine the state of current knowledge, giving
recognition only to those studies which clearly
focus on assessing the causal effects of specific
interventions. The so-called ‘gold standard’ for
a rigorous experimental approach – adopted in
the medical, biological and healthcare sciences
– entails the use of randomised controlled trials
(RCTs) to test the efficacy of specific interven-
tions (Cochrane Collaboration website). This
approach, often championed as the most rigor-
ous strategy for assessing ‘what works’ in a spe-
cific policy field, has also been promoted and
applied in some social program areas, includ-
ing criminology (Davies 2004; Petticrew and
Roberts 2005; Campbell Collaboration web-
site). Rigour is thus sometimes associated with
a preference for quantitative behavioural data,
although qualitative (attitudinal) data are in-
creasingly seen as central in helping to explain
the conditions and nature of behavioural change
(Davies 2004; Percy-Smith 2005). At the other
end of the spectrum, some methodologies asso-
ciated with a hermeneutic approach, including
a large proportion of ‘action-research’ projects,
tend to regard policy and program assessment
as more akin to iterative social learning projects
than to the experimental sciences.
3. Practical implementation knowledge, in
the present context of policy and program
effectiveness, is the ‘practical wisdom’ of
professionals in their ‘communities of prac-
tice’ (Wenger 1998) and the organisational
knowledge associated with managing program
implementation. These professional and man-
agerial communities are often segmented rather
than well connected. They operate within and
across the public sector, the private sector, and
the not-for-profit NGO sector. Relevant occu-
pational groupings include program delivery
managers, contract managers, enterprise man-
agers, and the diverse range of professionals
and para-professionals who are engaged in di-
rect service provision (Pawson et al. 2003) or
who provide support services linked into the
policy programs of government.
These managers and professionals wrestle
with everyday problems of program imple-
mentation and client service. Their roles may
require managing upwards, downwards, and
outwards to external stakeholders (O’Toole,
Meier and Nicholson-Crotty 2005). Their prac-
tical experience in delivery often tends to be
under-valued by the political and scientific sec-
tors. The sphere of ‘practice’ operates with
evolving bodies of knowledge that tend to be
specific to each professional niche. The train-
ing regimes for managers and professionals
are often linked to concepts of ‘best practice’
and to relevant research bases for assessing
‘effective’ practices. Their formal bodies of
knowledge evolve, and are subject to debate in
‘communities of learning’ (Wenger 1998). But
they also tend to become systematised and cod-
ified, and linked to standards and guidelines.
In large organisations, best-practice guidelines
may become overlaid with bureaucratic rules
and protocols.
While the pressure towards systematisation
is significant, and the search for technical solu-
tions (eg, build another IT system) is endemic,
some areas of practice are less than impressed
by the business and engineering models. The
professional ethos in human services makes
room for unique cases and for the meaning-
systems of clients (Schon 1983). This provides
the ‘mental space’ for creating organisational
climates that are more favourable to case-based
learning and more broadly the adoption of ‘or-
ganisational learning’ approaches. Social real-
ities are not seen as cut-and-dried and control-
lable, but as evolving challenges with unique
characteristics. Time is seen as a necessary in-
gredient of social and personal improvement.
However, the influence of this more open-ended
approach is severely tested whenever the politi-
cal system shifts into crisis response mode, with
a heightened demand for rapid responses, rig-
orous risk-management and standardisation.
Some Implications
It has been suggested above that there are three
broad types of knowledge and evidence that are
central to the design, implementation and eval-
uation of policies and programs. There is not
one evidence-base but several bases. These dis-
parate bodies of knowledge become multiple
sets of evidence that inform and influence pol-
icy rather than determine it. The three lenses
of policy-relevant knowledge comprise three
perspectives on useful and usable information.
Each of these types has its distinctive proto-
cols of knowledge, of expertise, of strategy, and
what counts as 'evidence', although it is also clear
that there are ongoing internal debates on such
matters within each knowledge area.
How do these three lenses and forms of
knowledge fit together? There is a considerable
case-study literature on ‘policy communities’
and ‘policy networks’ that may include partici-
pants from more than one institutional or indus-
try sector. There is also a considerable literature
in public administration concerning manage-
rial challenges and processes in various portfo-
lio areas. There is also a literature on specific
managerial knowledges and other professional
knowledges; but rather less on how the sphere
of management practice interacts with the other
spheres of policy choices by politicians and the
findings of systematic research. In fact, there
has been surprisingly little research directly
on the question of how the three clusters of
political, research and professional/managerial
knowledge interact. By contrast there has been
increasing attention to the bilateral relations
between public policy and the social sciences
(Davies, Nutley and Smith 2000; Stone 2001;
UKCSS 2003; Edwards 2004; Saunders and
Walter 2005).
Some useful research questions about the
three lenses might therefore include the follow-
ing:
• is there a low or high level of mutual awareness, recognition and understanding of each other's approaches?
• is there a substantial 'cultural divide' between these forms of knowledge (Shonkoff 2000), and if so, are there any useful mechanisms/incentives to promote working together more closely?
• in the jostling for salience or in the competition between these sets of ideas, does one form of knowledge typically 'trump' the others – eg, in the policy domain, does politics typically predominate over science and practice?
Answers to all these questions are likely to vary
greatly in different situations. From the view-
point of government, there is an expectation that
research findings will assist but not determine
policy directions and adjustments (UK Cabinet
Office 1999). There is a robust debate about
where and how the research contributions can
‘add most value’ to the policy process. For ex-
ample, is research most useful to government
processes at the commencement of the ‘policy
cycle’ (identifying and scoping a problem), or
in a later stage of options analysis (examining
different costs and impacts), or in the program
evaluation phase when effectiveness is being reconsidered? Researchers are themselves sometimes naïve about what kind of policy analysis will be seen as relevant, and about how to
communicate and package their research out-
comes most effectively for government officials
(Edwards 2004; UKCSS 2003).
Governments tend to have a strong expec-
tation that the managers and professionals
who deliver services and implement programs
will do so with technical skill and efficiency.
For those employed within the public sector,
effective implementation is the key consider-
ation, and the government’s control over pri-
orities and program design is very clear. For
those employed at arm’s-length from govern-
ment, efficient implementation is reinforced by
contractual relations and accountabilities for
service delivery. With a few powerful excep-
tions, communities of practice may feel dis-
enfranchised, especially when those who are
practical experts in delivery are not centrally in-
volved in early discussions about how programs
are designed and delivered. The political exec-
utives of government are not especially adept
at hearing and seeking out the voices of imple-
menters, especially those who suggest that the
program goals cannot be delivered because ei-
ther the framework is flawed or resources are
clearly inadequate. Nevertheless, some shar-
ing of perspectives occurs. For example, both
public sector managers and political leaders
are successful only by making astute practical
judgements about priorities, garnering support
for taking action, and persuading stakeholders
about trade-offs and preferred options.
Researchers may believe that problem-
definition and analysis should be closely
linked to the data assembled by systematic
research. However, in the realm of public pol-
icy development, governments are in the busi-
ness of framing issues and agendas. Strategic
policy work is conducted in the context of
debates about issues and agendas. Thus, pol-
icy decisions are not deduced in a neutral
and objective manner from empirical-analytical
work, but from politics, judgement and debate
(Majone 1989). Policy debate and analysis in-
volves interplay between facts, norms and de-
sired actions, in which ‘evidence’ is diverse
and contestable. Different stakeholders within
the business, NGO and government sectors are
likely to have divergent views on what is the
key problem. For governments, problems and
issues become seen as worthy of investigation
owing to a confluence of circumstances, such
as:
• a perception of crisis or urgency;
• the role of political mandates and priorities;
• the role of expert judgement and advice (consultants, inquiries, etc);
• organisational and issue histories; or
• the changing context of social values and public opinion.
The definition and focus of problems are also
likely to be viewed differently through the three
lenses of research, professional practice and
government policy-making (and of course there
will be some debates within each of these).
Some policy arenas are more divergent and
strongly contested than others. Over time, as
Mulgan (2005) suggests, some fields may be-
come relatively stable and consensual; others
may be subject to ‘profound disagreements’;
and others again may be lacking a solid infor-
mation base or a track record of on-ground ex-
perience. The implications of these differences
in the nature of policy arenas are potentially
significant. Contentious (ie, unsettled and tur-
bulent) policy areas may tend to generate more
heat than light, as the terms of evidentiary de-
bate may be overwhelmed by partisan voices,
despite the best efforts of those who wish
to retain an ‘objective’ stance. Current policy
examples might include strategies to address
major emergent issues such as climate-change
responses; value-based issues at the intersec-
tion of bio-ethics and bio-technologies; and de-
bates about procedural and substantive ‘fair-
ness’ for workers and employers in industrial
relations reform. Researchers who seek to make
an objective contribution in such areas may risk
being harnessed to positions proposed by strong
© 2008 The Author. Journal compilation © 2008 National Council of the Institute of Public Administration Australia
advocates on one side of the debate and accord-
ingly disparaged by others.
Conclusions
The demands for efficient and effective govern-
ment have fostered the need for performance
information. This has provided leverage for
applied social research, concerned with pro-
gram evaluation, implementation effectiveness,
and new models for tackling complex issues
using new policy instruments and processes.
This served to legitimate the general concept
of evidence-based policy. However, there is a
large difference between a technical problem-
solving approach to knowledge, and a broader
relational and systemic approach to knowledge
that is located in multi-stakeholder networks.
This article has suggested there are three
main kinds of challenge to the rational mission
of ‘evidence-based’ policy. One arises from the
inherently political and value-based nature of
policy debate and decision-making. Policy de-
cisions are not deduced primarily from facts
and empirical models, but from politics, judge-
ment and debate. Policy domains are inherently
marked by the interplay of facts, norms and
desired actions. Some policy settings are data-
resistant owing to governmental commitments.
Secondly, information is perceived and used
in different ways, by actors looking through
different ‘lenses’. From this perspective, there
is more than one type of relevant ‘evidence’.
I have drawn attention to ‘three lenses’ that
are especially important, centred on political
know-how, systematic research, and profes-
sional practice. These perspectives all provide
important contributions to policy development,
but defensiveness and negativity are as com-
mon as cooperation. Although the context of
decision-making is dynamic and negotiated,
these key actors are anchored in institutional
settings that make shared perspectives difficult
to attain.
The third challenge to a rationalist concept of
evidence-based policy is that the complex mod-
ern arrangements of networks, partnerships and
collaborative governance are difficult to har-
ness to the traditional forms of knowledge
management, policy development and program
evaluation in the public sector (Agranoff and
McGuire 2003). Networks bring to the table a
diversity of lived experience and therefore a di-
versity of ‘evidence’ (relevant information, in-
terpretations, priorities, and perspectives), not
only about what works but also about what is
worthwhile and meaningful. The evidence-base
for understanding success factors for complex
policy design and implementation may need to
address the conditions under which innovation,
new thinking and new solutions may emerge in
a dynamic environment (Osborne and Brown
2005). The three-lenses approach suggests that
there may be importantly divergent perspec-
tives on whether and how to increase mutual
understanding and shared objectives.
Endnotes
1. The author is grateful for comments on these
ideas from a number of colleagues over sev-
eral years, including John Alford, Meredith Ed-
wards, Geoff Gallop, Richard Mulgan, John
Wanna, and the anonymous referees for AJPA,
none of whom bear any responsibility for re-
maining errors of argument and interpretation.
2. The historical roots of such an alliance be-
tween political power and systematic knowl-
edge reach back to the late Enlightenment,
whose leading thinkers sought to undermine
the traditional capacity of governments to rely
on appeals to precedent, authority and religious
values as the basis of government legitimacy
(Staum 1996; Head 1982; Berry 1997). Sys-
tematic social science later fortified the pro-
gressive impulses for social improvement un-
derlying New Liberalism, early social democ-
racy, and the foundation of institutions such as
the London School of Economics and Political
Science in 1895 (Lichtheim 2000: chapter 4).
References
6, P. 2002. ‘Can Policy Making be Evidence-Based?’
MCC: Building Knowledge for Integrated Care
10(1):3–8.
Agranoff, R. and M. McGuire. 2003. Collaborative
Public Management: New Strategies for Local
Governments. Washington DC: Georgetown Uni-
versity Press.
Bakvis, H. 2002. ‘Pulling Against Gravity? Horizon-
tal Management in the Canadian Federal Govern-
ment.’ In Knowledge, Networks and Joined-Up
Government, ed. M. Considine. IPSA Research
Committee Proceedings, Centre for Public Pol-
icy, Melbourne, 57–75.
Bardach, E. 1998. Getting Government Agencies to
Work Together. Washington DC: Brookings Insti-
tution Press.
Berry, C.J. 1997. The Social Theory of the Scottish
Enlightenment. Edinburgh: University of Edin-
burgh Press.
Booher, D.E. 2005. ‘Collaborative Governance Prac-
tices and Democracy.’ National Civic Review
93(4):32–46.
Bruner, C., L.G. Kunesh and R.A. Knuth. 1992. What
Does Research Say about Interagency Collabora-
tion? URL: <http://www.ncrel.org>.
Campbell Collaboration. About the Campbell
Collaboration. URL: <http://www.campbellcollaboration.org/>.
Casey, J. 2004. ‘Third Sector Participation in the
Policy Process.’ Policy and Politics 32(2):241–
257.
Cochrane Collaboration. The Cochrane Collabora-
tion: The Reliable Source of Evidence in Health
Care. URL: <http://www.cochrane.org/>.
Commonwealth Foundation. 2004. Tri-Sector Dia-
logues: Synthesis of Dialogues on Partnership
Approaches to Governance. London: Common-
wealth Foundation, Citizens and Governance Pro-
gram.
Davies, P. 2004. Is Evidence-Based Policy Possible?
The Jerry Lee Lecture, Campbell Collaboration
Colloquium, Washington.
Davies, H.T., S.M. Nutley and P.C. Smith, eds. 2000.
What Works? Evidence-Based Policy and Prac-
tice in Public Services. Bristol: Policy Press.
Edwards, M. 2003. ‘Participatory Governance.’
Canberra Bulletin of Public Administration
107:1–6.
Edwards, M. 2004. Social Science Research and
Public Policy: Narrowing the Divide. Policy Pa-
per 2. Canberra: Academy of Social Sciences in
Australia.
Head, B.W. 1982. ‘The Origins of “La Science So-
ciale” in France 1770–1800.’ Australian Journal
of French Studies 19(2):115–132.
Head, B.W. 1999. ‘The Changing Role of the Public
Service: Improving Service Delivery.’ Canberra
Bulletin of Public Administration 94:1–3.
Head, B.W. 2005. ‘Governance.’ In Ideas and Influ-
ence: Social Science and Public Policy in Aus-
tralia, eds P. Saunders and J. Walter. Sydney:
UNSW Press, 44–63.
Head, B.W. 2007. ‘Community Engagement – Par-
ticipation on Whose Terms?’ Australian Journal
of Political Science 42(3):441–454.
Hemmati, M. 2002. Multi-Stakeholder Processes for
Governance and Sustainability. London: Earth-
scan.
Innes, J.E. and D.E. Booher. 1999. ‘Consensus
Building and Complex Adaptive Systems: A
Framework for Evaluating Collaborative Plan-
ning.’ Journal of the American Planning Asso-
ciation 65(4):412–423.
Innes, J.E. and D. Booher. 2003. The Impact of
Collaborative Planning on Governance Capac-
ity. Working Paper 2003/03, Institute of Urban
and Regional Development, University of Cali-
fornia, Berkeley.
Kernaghan, K. 1993. ‘Partnership and Public Ad-
ministration.’ Canadian Public Administration
36(1):57–76.
Kooiman, J. 2000. ‘Societal Governance.’ In Debat-
ing Governance: Authority, Steering and Democ-
racy, ed. J. Pierre. Oxford: Oxford University
Press, 138–164.
Lewicki, R.J., B. Gray and M. Elliott, eds. 2003.
Making Sense of Intractable Environmental Con-
flicts. Washington: Island Press.
Lichtheim, G. 2000. Europe in the Twentieth Cen-
tury. London: Phoenix Press.
Lovan, W.R., M. Murray and R. Shaffer, eds.
2004. Participatory Governance: Planning, Con-
flict Mediation and Public Decision-Making in
Civil Society. Aldershot: Ashgate.
Lowndes, V. and C. Skelcher. 1998. ‘The Dynam-
ics of Multi-Organisational Partnerships.’ Public
Administration 76(3):313–333.
Majone, G. 1989. Evidence, Argument and Persua-
sion in the Policy Process. New Haven: Yale Uni-
versity Press.
Mandell, M.P., ed. 2001. Getting Results through
Collaboration: Networks and Network Structures
for Public Policy and Management. Westport:
Quorum Books.
McLaughlin, K., S.P. Osborne and E. Ferlie, eds.
2002. New Public Management: Current Trends
and Future Prospects. London: Routledge.
Mulgan, G. 2005. The Academic and the Policy-
Maker. Presentation to Public Policy Unit, Oxford
University, 18 November.
Nutley, S., H. Davies and I. Walter. 2002.
Evidence-Based Policy and Practice: Cross Sec-
tor Lessons from the UK. Working Paper 9,
Research Unit for Research Utilisation, Univer-
sity of St Andrews.
Oakley, A., D. Gough, S. Oliver and J. Thomas.
2005. ‘The Politics of Evidence and Methodol-
ogy: Lessons from the EPPI-Centre.’ Evidence
and Policy 1(1):5–31.
Osborne, S.P., ed. 2000. Public-Private Partner-
ships. London: Routledge.
Osborne, S.P. and K. Brown. 2005. Managing
Change and Innovation in Public Service Organ-
isations. London: Routledge.
O’Toole, L.J., K.J. Meier and S. Nicholson-
Crotty. 2005. ‘Managing Upward, Downward and
Outward: Networks, Hierarchical Relationships
and Performance.’ Public Management Review
7(1):45–68.
Parsons, W. 2002. ‘From Muddling Through to Mud-
dling Up – Evidence Based Policy-Making and
the Modernisation of British Government.’ Pub-
lic Policy and Administration 17(3):43–60.
Parsons, W. 2004. ‘Not just Steering but Weaving:
Relevant Knowledge and the Craft of Building
Policy Capacity and Coherence.’ Australian Jour-
nal of Public Administration 63(1):43–57.
Pawson, R., A. Boaz, L. Grayson, A. Long and C.
Barnes. 2003. Types and Quality of Knowledge
in Social Care. Knowledge Review No. 3, So-
cial Care Institute for Excellence. Bristol: Policy
Press.
Percy-Smith, J. 2005. What Works in Strategic Part-
nerships for Children? Ilford: Barnardo’s.
Petticrew, M. and H. Roberts. 2005. Systematic Re-
views in the Social Sciences. Oxford: Blackwell.
Pollitt, C. and G. Bouckaert. 2000. Public Manage-
ment Reform: A Comparative Analysis. Oxford:
Oxford University Press.
Reddel, T. and G. Woolcock. 2004. ‘From Consul-
tation to Participatory Governance?’ Australian
Journal of Public Administration 63(3):75–87.
Reid, F. 2003. Evidence-Based Policy: Where is the
Evidence for it? Working Paper 3, School for Pol-
icy Studies, University of Bristol.
Roberts, H. 2005. ‘What Works?’ Social Policy Jour-
nal of New Zealand 24:34–54.
Saunders, P. and J. Walter, eds. 2005. Ideas and In-
fluence: Social Science and Public Policy in Aus-
tralia. Sydney: UNSW Press.
Schon, D.A. 1983. The Reflective Practitioner: How
Professionals Think in Action. New York: Basic
Books.
Schon, D.A. and M. Rein. 1994. Frame Reflection:
Toward the Resolution of Intractable Policy Con-
troversies. New York: Basic Books.
Schorr, L.B. 1988. Within Our Reach: Breaking the
Cycle of Disadvantage. New York: Anchor Dou-
bleday.
Schorr, L.B. 2003. Determining ‘What Works’ in
Social Programs and Social Policies: Towards
a More Inclusive Knowledge Base. Washington,
DC: Brookings Institution.
Shonkoff, J.P. 2000. ‘Science, Policy and Practice:
Three Cultures in Search of a Shared Mission.’
Child Development 71(1):181–187.
Staum, M.S. 1996. Minerva’s Message: Stabilizing
the French Revolution. Montreal and Kingston:
McGill-Queens University Press.
Stone, D. 2001. Getting Research into Policy? Pa-
per for Global Development Network Conference,
Rio de Janeiro, December.
Stone, D. and A. Denham, eds. 2004. Think Tank Tra-
ditions: Policy Research and the Politics of Ideas.
Manchester: Manchester University Press.
Sullivan, H. and C. Skelcher. 2002. Working across
Boundaries: Collaboration in Public Services.
Houndmills UK: Palgrave Macmillan.
UK Cabinet Office. 1999. Professional Policy Mak-
ing for the Twenty First Century. London: Cabinet
Office.
UKCSS [UK Commission on the Social Sciences].
2003. Great Expectations: The Social Sciences in
Britain. London: Commission on Social Sciences.
Wenger, E. 1998. Communities of Practice: Learn-
ing, Meaning and Identity. Cambridge: Cam-
bridge University Press.