debate
Key words: evidence • quality • standards • public services
Evidence & Policy • vol 3 • no 2 • 2007 • 231-51
© The Policy Press • 2007 • ISSN 1744 2648
The production and management of evidence
for public service reform
Jonathan Shepherd
Public services depend on the production, distribution and application of evidence of
effectiveness. Existing strengths of this process in the UK include many university schools closely
associated with services, a strategic partnership of Research Councils, some service-specific excellence
institutes and successful models for evidence management, particularly in health. However, there
is little recognition that this research is fundamental to society. There are few formal connections
between the national science base, the Research Councils and public services designed to focus
research effort or to increase evidence quality. Components of evidence management systems need
to be recognised as such, integrated and connected. At each stage, practical reforms are identified,
designed to ensure that services continuously develop in response to reliable new knowledge.
Public services, defined here as national, publicly funded services, are important for
everyone: citizens, families, children, those in or seeking employment, older people and
international visitors. Reflecting this, public service standards and delivery are major
priorities for every political party, every government and all government departments,
assisted by cross-government bodies including the UK Cabinet Office (Cabinet
Office, Strategic Policy Making Team, 1999) and the Prime Minister's Delivery Unit.
Although scrutiny and reform of the delivery of public services have been intense in
recent years, service effectiveness has not, perhaps, received as much attention. The
central questions discussed in this article are what can public services learn from each
other about the generation and implementation of reliable knowledge – evidence
of what works – and which reforms in this context are required? The article is based
on the fundamental principles that public services, including healthcare, education
and criminal justice services, should be effective and cost-effective, and that effective
interventions and the methodology to evaluate them reliably are products of science
theory and application.
More integrated government in the UK is exemplified by many new national and
local partnerships. These bring together for the first time agencies between which
there has been little or no formal collaboration, and provide new opportunities for
cross-public service comparisons not only by policy makers but also by practitioners,
front-line managers and academics in university schools whose work relates to
public services (Oakley, 1998; Shepherd, 2001). This article sets out the findings
of such a comparison. Practical reforms are identified, designed to provide generic
and service-specific processes to ensure that evidence standards in all services are
brought up to the levels of the best. The production and management of evidence
are considered here with reference to UK criminal justice, health and education
services particularly. Since the delivery of every intervention needs resources,
and policy makers therefore need information about the costs of alternatives, this
article is also concerned with evidence of resource use.
The article is structured as follows. First, the public policy context is reviewed,
with particular reference to national science strategy, research infrastructure and
how these support public services. Second, the principles of effective and efficient
evidence production and management are considered. Third, ways in which
evidence is currently produced, distributed and applied from the perspectives
of healthcare, higher education and criminal justice are compared. Fourth,
observations are synthesised and reduced to two principal conclusions from which
a series of practical reforms are derived.
Public policy context
The UK government has made public services and science – the source of innovation
and reliable evaluation methodology – key priorities. Three recent initiatives are
summarised here. First, a new framework has been proposed for science, technology
and engineering research and innovation to contribute to economic development
and public services (HM Treasury et al, 2004). Although the invitation to contribute
assumes that Research Councils and other funders commission research according
to the needs of public services, there are few mechanisms to ensure that this is the
case or that science contributes to raising standards of evidence of effectiveness and
cost-effectiveness. New effective interventions stem, of course, from fundamental
and applied research, the products of which cannot always be foreseen. However, the
strategic roles of the Research Councils in relation to public services should, perhaps,
extend beyond these functions, to support public services specifically and to set quality
standards for the evidence on which public services are based.
The eight UK Research Councils, members of the new Research Councils UK
(RCUK)1 strategic partnership which champions research, training and innovation,
are the main UK public investors in ‘fundamental’ research, although ‘fundamental’
is not defined. Surprisingly, research relevant to public services is not included
explicitly in its core list of eight main activities or seven joint services although
research in this category is considered a priority by some individual Research
Councils. Activity in the context of ‘science in society’ is concerned with raising
public awareness and engagement in sciences and innovation. RCUK Evaluation
Guidelines (People, Science and Policy Ltd, 2005) are to do with evaluating projects
designed to communicate science and not evaluation more widely, for example of
public service interventions.
The relevant work of four UK Research Councils is summarised here. The
Biotechnology and Biological Sciences Research Council (BBSRC), while
promoting opportunities for knowledge transfer for the benefit of the UK economy
and society, usually assigns responsibility for intellectual property management and
exploitation to the universities and institutes that undertake BBSRC research
and not to public service bodies. The Engineering and Physical Sciences Research
Council (EPSRC) aims to encourage partnership between researchers and industry,
commerce, government agencies, local authorities, other public bodies, charities
and the service sector (including National Health Service [NHS] Trusts), where
this can help the ‘take-up of research’. However, there is no explicit provision
for the service sector to inform EPSRC research priorities. The work of the
Economic and Social Research Council (ESRC) acknowledges that ESRC findings
provide invaluable insights for groups as varied as parents, local councillors, police
officers and business executives. Importantly, the 1982 Rothschild Review of the
precursor Social Science Research Council (SSRC) recommended a greater focus
on empirical research and research related to public concerns, leading to the SSRC’s
restructuring as the ESRC, with committees addressing economic affairs, education
and human development, environment and planning, government and law, industry
and employment and social affairs. However, while ESRC guidance on ‘what social
scientists do’ includes work with government departments, it does not mention work
with public service practitioners, or organisations such as the Campbell Collaboration,
which distils evidence relevant to social and economic services. The evolution of
the ESRC since its inception makes the development of an ESRC Field Trials Unit,
proposed in 2003 (Shepherd, 2003c), to inform policy and practice in education and
the criminal justice system, a very logical next step (see below).
The Medical Research Council (MRC) encourages and supports high-quality
research to improve human health, and produces skilled researchers to improve
quality of life. There is much more emphasis in this Research Council on translational
approaches at the basic research/practice interface, informed by an MRC-Health
Departments Delivery Group to ensure that its work fully reflects national health
priorities. Practice-orientated research is assisted by the MRC Clinical Trials Unit
and has included the development of magnetic resonance imaging, the discovery
that folic acid supplements reduce the risk of spina bifida and that combination drug
therapy benefits AIDS sufferers. Systems to ensure that practice prompts research, that
practitioners are trained in research techniques and that research is translated into
practice are much more developed than in the other Research Councils. Uniquely, the
MRC website includes a point of contact specifically for practitioners that welcomes
feedback on clinical research news, and the organisation of the MRC recognises that
practice-related research is fundamental. More recently, the implementation of the
recommendations of the Cooksey Review has integrated basic and applied health
research even further, for example by bringing together MRC and NHS Research
and Development funding (Cooksey, 2006).
The second of the three recent research policy initiatives is the Cross-cutting review
of science and research (HM Treasury et al, 2002). This concluded that while additional
funding from government should be an essential component of the overall science
and research strategy, changes to the frameworks and structures that govern funding
streams were also essential to creating the right incentives. Although the focus of the
Review is ‘... how to maximise the benefits provided by public spending on science
and research to the UK’s economy and quality of life’ (para 7, p 12), there is much more
emphasis on the building blocks of science and engineering than on the needs of public
services and how these should be met. Recommendations on knowledge transfer
begin not with the essential first step of producing evidence, but on ‘transferring the
outputs of research rapidly and effectively into the economy and society’ (para 177,
p 63). This assumes, without justication, that the means of evidence production are
in place and that research is driven by the needs of public services. It is stated that ‘the
most important ways of transferring knowledge, technology and know-how from
the science and engineering base are through the supply of highly skilled people,
and through the publication of the results of research’ (para 181, p 64). However,
publication of the results of research is only a small part of evidence distribution and
application. Just as important are structures dedicated to the overall management of
evidence, such as the medical school model, which integrates research, public service
practitioner education and training and evidence application in the same institution,
and the Excellence Institutes (see below), which distil evidence into guidelines for
practitioners against which practice reform can be audited. Interestingly, the section
in the Cross-cutting Review on ‘Science in government departments’ focuses on the
use of high-quality science and the credibility of departmental policy making, but
does not consider the management of evidence – the role of government scientists in
making judgements about evidence standards, for example – and public services per
se. It is concerned primarily with research inside government rather than the overall
management of research: the emphasis is on the national science and engineering
research base, not on the purposes of research. There is a section on medical research
but no sections on research relevant to other public services.
The third recent UK science policy initiative – the Roberts Review (Roberts,
2002) – focuses on the supply of people with science skills. It acknowledges that
continuous innovation is key to business success, but there is little or no emphasis
on public service innovation and evaluation. Although the Roberts Review does
not differentiate between scientists and practitioner-scientists (see below), there is a
focus on engineering: many engineers are practitioner-scientists. There is a welcome
commitment to a new National Centre for Excellence in Science Teaching, an area
that has been in crisis. However, an Excellence Institute serving the whole of primary
and secondary education, drawing on expertise in the National Institute for Health
and Clinical Excellence (NICE) and the other Excellence Institutes, could contribute
substantially to raising service standards more widely.
In summary, although there have been several government proposals and practical
steps to strengthen the science foundations of society, there have been few attempts
to integrate these foundations with public services. The UK government’s approach
to public service reform (Cabinet Office, Prime Minister’s Strategy Unit, 2006),
refers to ‘innovation’ but only by service professionals in response to competition
and contestability, not by the research community or by organised research and
development. There have been no formal cross-cutting government reviews,
and no legislation on the production, distribution or application of evidence of
effectiveness across services (Jackson, 2006).
This is not the place for an exhaustive review of international policy development
in this area, but the issues discussed here are, of course, not unique to the UK. In
the US, for example, there have been repeated calls for federal efforts to set rigorous
research standards to inform education policy and practice (Coalition for Evidence-Based
Policy, 2003) and these have borne fruit both in the education field and more
widely, with the subsequent adoption by the Office of Management and Budget of
evaluation guidelines on ‘What constitutes strong evidence of a program’s effectiveness’
(Coalition for Evidence-Based Policy, 2004).
The Excellence Institutes
The term Excellence Institute is used here to describe national bodies that distil
guidelines from international evidence for professional practice in public services.
Three are described below.
NICE was formed in 1999 and joined with the Health Development Agency in
2005. NICE is an independent UK organisation responsible for providing national
guidance on the promotion of good health and the prevention and treatment of
ill-health. It has published, at the time of writing, 132 sets of clinical guidelines
and technology appraisals, all of which are rooted in high-quality evidence. NHS
Primary Care Trusts have a statutory obligation to fund the recommendations of
NICE technology appraisals within three months of publication. Health services are
expected to deliver care and public health services according to NICE guidelines,
but there is no requirement as yet, for other services, such as the police and social
services, to comply with guidance published by Excellence Institutes in their fields
of practice.
The National Centre for Policing Excellence (NCPE)2, established in 2002 and
now part of the new National Police Improvement Agency (NPIA), publishes codes,
practice advice and guidance. The first such guidance, on domestic violence, was
published in 2004 and 14 further codes and sets of advice were published in 2005,
all, thus far, commissioned by the Association of Chief Police Officers. In contrast
to NICE guidance and technology appraisals, NCPE guidance has depended almost
exclusively on previous guidance collated from various sources, and to a much lesser
extent on research evidence: the NCPE guidelines on domestic violence, for example,
do not include any references to published research, such as findings that arresting
suspects (Maxwell et al, 2001) and imposing protection orders (Holt et al, 2002)
reduce repeat victimisation. However, NPIA provides a means to increase the extent to
which policing is informed by reliable evidence of effectiveness, for example through
collaboration with the Campbell Collaboration Crime and Justice Group.
The Social Care Institute for Excellence (SCIE)3, founded in 2001, aims to improve
the experience of people who use social care by developing and promoting knowledge
about good practice in the sector. SCIE develops resources from ‘knowledge’ gathered
from diverse sources and a broad range of people and organisations, and publishes a
range of outputs: research briefings (19 to date), reports (10 to date), practice guides
(five to date), resources guides (three to date), position papers (four to date), knowledge
reviews (nine to date) and resource packs. Publications are fewer in number than
those issued by NICE and more diverse in nature, with some based on high-quality
evidence. Systematic reviews are the subject of one SCIE report (Macdonald, 2003).
Others have drawn attention to the increasing importance of systematic reviews to
policy making (for example, Boaz et al, 2002) and the use of evidence from diverse
study designs (Popay and Roen, 2003).
In summary, these exemplar Excellence Institutes vary in the extent to which their
published guidelines are based on high-quality evidence (as defined by the NICE
model) or even any evidence at all. Some published guidance is limited to a summary
of non-evidence-based guidance whereas other guidance is based on high-quality
systematic reviews of randomised experiments. This disparity strongly suggests that
the functions and structure of these institutes should be reviewed, and that dialogue
between them and a generic approach to standard setting for the management of
evidence would do much to focus their work. This could be one of the functions of
the Evidence Standards Board proposed below.
Principles
An important principle is that the production, distribution and application of evidence
of effectiveness of public services should be effective and cost-effective. Evidence
production, distribution and application processes should be fit for purpose in the
same way that services themselves should be fit for purpose. Reforms must start at
different levels according to the different standards prevalent in the various services,
and there is much to be gained from sharing expertise across services.
Higher public service standards depend on the development of more effective
and more cost-effective practice, and on the identification of and disinvestment
in ineffective practice. Discovery and continuous, organised evaluation have
revolutionised the effectiveness of some public services, for example the NHS.
This process has led to better outcomes, for example in the treatment of heart
disease, most infections and many cancers. On the basis of good-quality evidence,
there has been major disinvestment as well as targeted investment, for example
the switch from hospital admission to daycare for most surgery (Aylin et al, 2005),
and a 70% reduction in wisdom tooth removal since 1996, when it was the most
frequent surgical operation in the UK (Sheldon et al, 2004). All these reforms
have stemmed not from service reorganisation (which has followed, not preceded
the publication of reliable evidence) or from better delivery, but from first-class
evidence largely produced and distributed by medical and dental schools, the
exemplar university public service schools discussed later in this article.
Public service interventions should be known to be effective on the basis of
efficient, objective evaluation. Evaluation standards determine the extent to which
conclusions are misleading or ambiguous and resources are wasted. Evidence of the
effectiveness and cost-effectiveness of particular interventions should be considered
cumulatively, recognising that the results of pioneering trials can be contradicted
in subsequent experiments (Ioannidis, 2005). It should also be remembered that
the conclusions of many high-quality scientific evaluations can be counterintuitive.
The Scared Straight initiative (Petrosino et al, 2006) and boot camps without a
therapeutic element (Wilson and MacKenzie, 2006) have, for example, been found
to increase rather than reduce recidivism, while blanket counselling after disasters
has been found to make things worse for survivors (Rose et al, 2002).
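The cumulative reading of trial evidence urged above can be illustrated with a toy calculation. The sketch below (Python, with hypothetical effect estimates and standard errors) pools trials by inverse-variance weighting as each new result arrives, showing how a striking pioneering result can be overturned once later trials accumulate:

```python
# Minimal sketch of cumulative inverse-variance (fixed-effect) pooling.
# The trial labels, effect estimates and standard errors are hypothetical.
trials = [
    ("pioneer trial", -0.40, 0.20),      # (label, effect estimate, standard error)
    ("replication 1", 0.05, 0.10),
    ("replication 2", 0.10, 0.08),
]

def cumulative_pool(trials):
    """Pool each trial with all earlier ones, weighting by 1/SE^2."""
    results = []
    w_sum = wx_sum = 0.0
    for label, effect, se in trials:
        w = 1.0 / se ** 2          # inverse-variance weight
        w_sum += w
        wx_sum += w * effect
        results.append((label, wx_sum / w_sum))
    return results

for label, pooled in cumulative_pool(trials):
    print(f"after {label}: pooled effect = {pooled:+.3f}")
```

Here the pooled effect flips sign as evidence accumulates – the pattern of contradicted pioneering trials that Ioannidis (2005) describes.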
Where there are objective measures and experimentation is feasible and ethical,
controlled trials, particularly randomised controlled trials (RCTs), provide the most
authoritative conclusions. Hence the emphasis in this article on the contributions
of the national science base where RCT and other methodological expertise reside.
However, applied science and national science capacity have their limits (Weatherall,
1995; Mulgan, 2005). It is the nature of many spheres of practice – policing no less
than surgery – to be as much an art as a science, and this should be acknowledged in
the selection of evaluation/assessment methods. The values, refinement and objective
assessment of the art of practice – competencies – should not be neglected as the
boundaries of science expand.
Perspectives and observations
In this appraisal of the generation and implementation of evidence, the perspectives
of an NHS clinical director and consultant, chair of a Crime and Disorder Reduction
Partnership, and clinical professor in medical science and criminology are combined.
The objective is to generate practical proposals that, when implemented, will improve
the utility and increase the quantity of science-based interventions available to policy
makers and practitioners (Walter et al, 2003). It has been prompted, in part, by
previous policy development and implementation (Shepherd, 2003b), for example,
owing from the discovery that the NHS treats large numbers of victims of crime
not reported to the police, by the development of the prototype Crime and Disorder
Reduction Partnership in England and Wales to enable health and police data and
prevention efforts to be integrated (see guidance to the Crime and Disorder Act:
Home Office, 1998).
The effectiveness of all UK public services is being improved, but to highly
variable degrees and at very different rates, reflecting differences between services
in the production, distribution and application of new knowledge of central
importance to practice. With regard to evidence production, the numbers of field
trials in medicine have far outstripped numbers in the social sciences, education
and criminal justice combined (Figure 1). For example, in criminal justice only
85 trials of any size were carried out between 1982 and 2004 (Farrington and
Welsh, 2006). This may, in part, reect the lack of focus or consistency among the
Research Councils in relation to the production of research evidence relevant to
public services: the MRC is the only Research Council with a trials unit dedicated
to evaluating interventions.
There are unique challenges to performing RCTs in education and criminal justice
– units of randomisation are often different from those in medicine, for example
– but they can be overcome (Farrington and Welsh, 2005). To meet them, a field
trials unit should be developed by the ESRC or elsewhere in government, to drive
up standards and the quantity of robust evidence in criminal justice, social, education
and economics services. This should capitalise on the many methodological and
process lessons learned in the MRC Clinical Trials Unit and provide a similar
advisory service for social scientists. This is at least as important as, and more urgently
needed than, increasing the number or rigour of systematic reviews, since without
the production of high-quality evidence, reviews are of little use.
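The point about units of randomisation can be made concrete with a minimal sketch. In education and criminal justice trials, whole clusters – schools, classes or police beats – are typically assigned to arms rather than individual people. The cluster names below are hypothetical:

```python
import random

def cluster_randomise(clusters, seed=0):
    """Randomly assign whole clusters (e.g. schools or police beats)
    to intervention or control arms, rather than randomising individuals.
    Illustrative sketch only; a real trial would also stratify and
    account for within-cluster correlation in the analysis."""
    rng = random.Random(seed)          # fixed seed for a reproducible allocation
    shuffled = clusters[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

arms = cluster_randomise(["school_%02d" % i for i in range(1, 9)])
print(arms["intervention"], arms["control"])
```

Because outcomes within a cluster are correlated, analyses of such designs must account for clustering – one of the methodological lessons a field trials unit could transfer from the MRC Clinical Trials Unit.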
Figure 1: 20th century RCTs in health and in social welfare, education, crime and justice
[Figure: number of RCTs per decade, 1900-1999; health – Cochrane Library; social welfare, education, crime and justice – Campbell Collaboration]
With regard to evidence distribution, the NCPE, taking one example, produced
15 largely non-science-based guidelines in 2004/05, compared to 44 sets of
science-led guidelines produced in the same period by NICE. The delivery of
cost-eective public services depends on the availability to practitioners, managers,
service purchasers and regulators of concise, readable, prescriptive, evidence-based
guidelines on good practice. These are available in healthcare, but rarely in other
public services. Social workers and police officers do not, for example, benefit from
publications equivalent to the British National Formulary4 and Clinical Evidence5
published by British Medical Journal Publishing, or the clinical guidelines for
doctors and dentists published by NICE.
Although Excellence Institutes have been established to support some public
services by summarising and distributing evidence in the form of authoritative
practice guidelines, they are lacking in others, for example primary and secondary
education (apart from the newly established National Centre for Excellence in
Science Teaching). Reecting this perhaps, reliable evidence of the eectiveness of
synthetic phonics as a teaching method, for example, has been slow to emerge and
has rarely been considered cumulatively. The absence of an Excellence Institute in
the criminal justice system is also striking. Furthermore, education and criminal
justice (including police and victims) services, have, comparatively, very little high-
quality research evidence of effectiveness on which to draw and far fewer sources
of evidence than other public services.
There is currently little or no contact between the UK Excellence Institutes. Formal
dialogue between them and the formation of cross-service public policy institutes,
such as the Washington State Institute for Public Policy,6 which formulates policy
in criminal justice, education and social welfare, would prompt the implementation
of best evidence production and application practice across public services. In the
UK, an Evidence Standards Board comprising representatives of higher education
and private sector producers and reviewers of evidence, the Excellence Institutes,
public service regulators and national service providers could be one way forward.
Its functions would include the setting and maintenance of standards of evidence
across services, standards for guideline production and standards for Excellence
Institute products, taking account of the differences between services and without
duplicating existing arrangements. Importantly, common evidence quality and
synthesis standards could increase evidence production and, indirectly, the demand
for high-quality evidence.
Organisations that produce, distribute and apply evidence are rarely connected,
either in the same public service or across services. The Economics Methods Group
(EMG)7 is almost unique in this respect. The EMG promotes the production and
distribution of information to policy makers and others about the costs (resource
use) of interventions, facilitating decisions about alternatives, and operates as a
formal partnership within and between the Cochrane Collaboration,8 which
summarises evidence of the effectiveness of health interventions, and the Campbell
Collaboration9 (Davies and Boruch, 2001), which summarises evidence of the
effectiveness of education, crime and justice and social welfare interventions. Joint
approaches like this should be developed far more widely.
Efficient and effective public service delivery depends on practitioner-managers
such as clinical directors in the NHS and unit commanders in the police service
who, in response to guidelines and high-quality evidence, recognise and discard
ineffective practices and adopt proven effective practices promptly. The extent to
which practitioner-managers do this, however, varies enormously. No business case
for a new treatment in the NHS would succeed without convincing evidence of
both treatment effectiveness and cost-effectiveness, while (low-quality) evaluations
of police interventions frequently follow, rather than precede, implementation. Local,
formal opportunities are needed for sharing good practice in the application of
evidence across services to allow police managers, for example, to share with NHS
managers the rapid-response culture generally prevalent in the police service but
less common in the NHS. In the other direction, such opportunities would allow
NHS clinical directors to share their far greater reliance on high-quality evidence
with police colleagues.
The education and training of public service practitioners and their managers
must take account of the constantly enlarging frontiers of science. Crime control,
for example, is now a science, providing opportunities to bring scientific rigour to
evaluation. This makes statistical knowledge among practitioners and the training of
public service statisticians increasingly important. Greater reliance on crime analysts
is an opportunity to drive up crime prevention standards by developing a discipline
of crime epidemiology – there is no equivalent of public health or epidemiology
in police services in the UK or more widely.
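The applied statistics argued for here need not be elaborate. As a minimal, illustrative sketch (the counts below are invented, not trial data), the core task of a crime epidemiologist – deciding whether an intervention group really had fewer incidents than a comparison group, as in randomised trials such as Warburton and Shepherd (2000) – can be done with a standard two-proportion test:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Compare incident proportions between two groups.

    Returns the difference in proportions (group 1 minus group 2),
    the z statistic and a two-sided p-value, using the normal
    approximation with a pooled standard error.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1 - p2, z, p_value

# Hypothetical counts: 12 incidents among 300 intervention premises
# versus 30 among 300 comparison premises (illustrative figures only)
diff, z, p = two_proportion_ztest(12, 300, 30, 300)
```

An analyst trained in this kind of comparison can distinguish a genuinely effective intervention from noise before resources are committed to rolling it out.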
Practitioner-scientists (Shepherd, 2002) and clinician-specialist partnerships in
medical and dental schools apply research findings in their own clinical practice, a
process that is constantly scrutinised by trainee doctors and dentists, and audited.
This crucial interface for continuous reform throughout healthcare is characteristic
of few, if any, other public services (Shepherd, 2003c). For example, teachers in
primary and secondary education generally stop teaching when they become
lecturers in education departments and colleges, increasing the potential for losing
sight of the reality of the day-to-day, month-to-month challenges of teaching.
Although the selection, education and training of teachers are now carried out in
close collaboration with schools, the requirement that higher education personnel
who teach trainee teachers have ‘recent and relevant’ teaching experience has been
lost, weakening the connections between research, teaching and practice.
Concerns about erosion of clinical research capacity in the UK, set out in
the Walport Report (MMC and UKCRC, 2005) have recently prompted the
Department of Health and the Clinical Research Collaboration, together with
the UK heads of medical and dental schools and the postgraduate medical deans,
to develop an integrated training pathway for clinical scientists. Imperial College
London is currently taking this a step further still, by developing a Practitioner
(Clinical) Academic Training School to formalise the training of clinical scientists
who produce, distribute and apply evidence. How much more, then, are similar
initiatives required to drive up standards in other public services?
University public service schools, a concept introduced here, are exemplified
by medical schools led by practitioner-scientists and staffed by teams of clinicians
and scientists. They are communities that produce, distribute and apply evidence.
They also integrate practice, training and research, enabling all three functions to
prompt each other, instil lifelong practitioner objectivity and promote an evidence-based
culture. But some public services, such as offender management and police
services, are not underpinned by university schools at all (Shepherd, 2003a, 2004).
Furthermore, some that are, such as social services, need to adopt a more prescriptive,
scientific approach to ensure, for example, that effective early family support/
preschool education interventions developed elsewhere are replicated and refined. As
with drugs in medicine, the identification of effective social interventions depends
on strict replication, refinement and delivery of the active ingredients.
Police science and offender management science need to be recognised as distinct
bodies of knowledge needing investment in research capacity in their own right,
reflecting the importance of policing and offender management services and their
desperate need for effective interventions. Some research relevant to these disciplines
is already carried out in university schools of social science, law, city planning
and psychology, with which new service-perspective schools should collaborate.
However, continued reliance on these related disciplines is no longer enough. Doing
so is analogous to relying on schools of chemistry and physics for medical research
and the implementation of findings in medical practice. University public service
schools would provide communities of students, trainees, innovators, evaluators,
educators and practitioners in which evidence is constantly generated and applied.
The solution advocated here is service-specific (not cross-service) schools that are
homes to enthusiastic opinion leaders (known to facilitate research implementation:
Nutley et al, 2003), practitioner-academics and practitioner-scientist partnerships,
where there is robust research, education and practice governance. These schools
would also bridge the ‘knowing–doing’ gap, identified by Pfeffer and Sutton (2000)
in another (private sector) context.
The UK Clinical Research Collaboration,10 which brings together all the main
agencies responsible for directing, funding, regulating and participating in clinical
research, has been established to accelerate the translation of scientific advances
into patient care, to increase treatment and prevention effectiveness and to expand
the clinical research workforce. However, the even greater need to achieve these
objectives in criminal justice, education and social services has not yet been
recognised in this way. Similar research collaborations, which embed research into
other public services, are needed to sustain and accelerate the production, distribution
and application of evidence.
In themselves, codes of effective policy and practice – concise, readable, service-based
guidelines that are published and distributed to managers and practitioners
– do not guarantee that such policy and practice will be implemented. They are
essential, however, to commission and audit services rationally and in the context
of communication in and between public service schools, Excellence Institutes,
regulatory bodies, communities of practitioners and policy makers.
To maintain a sharp focus on reform in particular public services and to maximise
the relevance, volume and responsiveness of research from government departments,
universities and the private sector, service-specific research and development (R&D)
schemes are needed. The NHS R&D scheme is funded from the health service
budget but there is no equivalent in many public services, including education,
offender management and police services. Furthermore, Office for National Statistics
reports on UK research funding do not categorise spending by public service (see
Figure 2), and accountability for research spending in some public services is often
not clear. For example, although a proportion of police funding is hypothecated for
research, the published products of this are not defined or audited. Levels of public
service research spending should be published to focus attention on investment and
on the effectiveness of reform in each service. The establishment of the National Institute
for Health Research (NIHR) will sustain health service research, but other public
services are not committed to research in the same way.
Discussion
Public services comprise interventions delivered by practitioners. These
interventions, funded by the taxpayer, should be effective and efficient. Meeting
these requirements depends on reliable evidence, which must be produced,
distributed and applied. This appraisal of these processes has identified many
strengths, but also obvious disparities and opportunities for practical reform.
Strengths include some university schools that are closely associated with public
services, and a strategic partnership of publicly funded research councils (RCUK);
the Excellence Institutes; the Cochrane and Campbell Collaborations; and
successful models for evidence management, particularly in health. Two major
factors have been identified that help to explain disparities between services with
regard to the quality and quantity of evidence production, distribution and application.
The first is a lack of connection between existing components of what could become
first-class evidence systems (although few components are present in many services);
and the second is the limited integration of the very strong UK science base with
public services. Proposals for practical reforms designed to address these weaknesses
are set out in Box 1.

Figure 2: Research spending by sector. Business enterprise 48%; abroad 20%; government departments 11%; higher education funding councils 8%; research councils 7%; private non-profit 5%; higher education institutions 1%.
Source: Office for National Statistics (2004) Research and experimental development (R&D) statistics 2002
Box 1: Proposals
• A national Evidence Standards Board should be established to set standards across all
public services.
• Research Councils UK should recognise that the production and management of
evidence of public service effectiveness is one of the fundamental research needs of
society.
• An Economic and Social Research Council (ESRC) field trials unit should be developed
to increase the quality and volume of evidence production.
• Service-specific Research Collaborations and R&D schemes should be components of
evidence systems supporting all major public services.
• Service-specific schools in universities should be designated ‘public service schools’ and
their contributions to producing, distributing and applying evidence formalised.
• University public service schools, which integrate training, research and services, should
be a foundation of all public services.
• High-quality evaluation of the services of which they are a foundation should be a core
function of all public service schools.
• Concise, readable, science-based guidelines should be published and distributed to
managers and practitioners in all major public services.
• National Excellence Institutes responsible for the publication of science-based practice
guidelines should be a foundation of all public services, and formal dialogue should be
instituted between them.
• Local fora should be developed to promote sharing of best implementation practice by
practitioner-managers across services.
• Applied statistics training is an essential part of practitioner and management training,
and should be audited.
• Active practitioner-academics committed to the implementation of reliable evidence
are needed in all public service schools.
• Greater formalisation, transparency and accountability are needed with regard to public
service research spending.
It is clear from this appraisal that the language used in debates about research methods
and the use of research findings to inform public services differs between disciplines.
In medicine, ‘evidence-based’ practice is more to do with the implementation of
effective therapeutic and preventive interventions – particular drugs, operative
procedures or safety measures – than with wider policy questions. In contrast, in the
social sciences, there has been a greater emphasis on ‘knowledge’ than ‘evidence’ and
more emphasis on ‘policy’ than ‘practice’. Furthermore, in medicine the emphasis
is on clinical effectiveness, not surprisingly, since the right treatment can make the
difference between life and death, while interventions at the operational level have
attracted far less attention in social policy. This is exemplified by the low regard until
recently for situational crime prevention (Clarke, 1997), even though interventions
in this category are amenable to randomised experiments (Warburton and Shepherd,
2000; Shepherd, 2003c).
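Part of that amenability is that the design arithmetic for such experiments is straightforward. As an illustrative sketch (the target rates below are assumed for illustration, not drawn from any cited trial), the number of units needed per arm to detect a fall in an incident rate, at conventional significance and power, follows from the standard two-proportion formula:

```python
import math

def per_arm_sample_size(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate units needed per arm of a randomised trial to
    detect a change from incident rate p1 to rate p2.

    z_alpha = 1.96 corresponds to two-sided 5% significance and
    z_beta = 0.84 to 80% power (standard normal quantiles).
    """
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical: halving an incident rate from 10% to 5% needs
# roughly 430 sites, pupils or patients in each arm
n = per_arm_sample_size(0.10, 0.05)
```

Calculations of this kind make clear, before a trial begins, whether an evaluation is large enough to detect the effect it claims to test for – precisely the discipline that low-quality, after-the-fact evaluations lack.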
Although it is recognised that social conditions are just as important determinants
of health as health services themselves, the effectiveness of social interventions
– what housing, criminal justice and education practitioners actually do – has
received far less rigorous research attention than medical interventions. The reasons
for this seem to be structural. The integration of practice, training and research
is largely confined to medical and other clinical schools, and in medicine, theory
and practice have about equal status. For example, Florey and Chain, who did
not discover the anti-microbial properties of penicillin but developed it in its
therapeutic form, shared the Nobel Prize with Alexander Fleming.
There are different hierarchies of values in the social sciences, such that technical,
largely practice-orientated research is often considered low grade and its published
findings are often regarded as ‘footnotes’ rather than fundamental.
There is, therefore, much to be gained from sharing evidence standards, expertise,
processes, infrastructure templates and research training programmes across services.
Standards vary greatly, and worthwhile reform is likely to advance most rapidly
where standards are high, reflecting a reliance on controlled and randomised
experiments and on a sustainable cadre of practitioner-academics and applied
scientists in universities. Government proposals in Science and innovation (HM
Treasury et al, 2004) emphasise the need for corporate governance in universities
but should go further, specifically focusing the research and knowledge transfer
responsibilities of universities on public services; hence the proposal in this article
that university departments linked to public services are redesignated ‘public service
schools’, with contractual responsibility to focus research effort on the needs of their
services. Although arrangements for the production and management of evidence
in medical science are taken as a model for reform, it is also acknowledged that
the NHS may have a great deal to learn from other services, for example the rapid
responsiveness culture prevalent in the police service.
Public sector research establishments such as universities need not only to clarify
long-term approaches to sustaining research activity and supporting infrastructure
but also to ensure that there are strong links with the public services who need the
evidence that they can produce. While the drive to translate knowledge into innovation
in public services more effectively is very important, the reverse is also true: that the
research needs of public services need to be continually reviewed and met, including
by the steps proposed here. In the UK health services, research is a core function.
The same should be true of other public services. These are practical ways in which
government can create conduits for productive flows of evidence, as well as of ‘ideas
and people’, between research and practice. The management of evidence is too
important an issue to be left mainly to researchers – the principal focus of RCUK
– while a policy limited to creating conduits of ideas and people addresses only a small
part of the overall systems that need to be created. This is why this article focuses on
the bedrock of evidence: ideas constitute untested proposals and people are, in this
context, instruments for the production, distribution and application of evidence. Thus
the proposals for reform set out here address the need to ‘build a bridge between
what is and what ought to be in society’ (Gallagher, 1981, p 40).
An Evidence Standards Board would be one way to improve the production and
management of evidence across all public services. It is needed to define and create
evidence systems, connect existing system components, set evidence production and
review standards, share expertise across services and forge links with service regulators.
It would raise standards in the Excellence Institutes to ensure, for example, that practice
guidelines are based not on the regurgitation of previous guidance but on objective
summaries of primary evidence. Such a Board should be established by government,
perhaps under the aegis of the Office for Science and Innovation. No distinction is
made here between policy and practice, in part because of different definitions of these
two terms in different public services. Thus, the proposals set out here are as much to
do with informing choices between medical or surgical treatment options, different
teaching methods in primary education, or custodial or community sentences in the
criminal justice system as with how these interventions are formulated.
An Evidence Standards Board would define standards against which consistent
judgements could be made about the cost-effectiveness of policies and practices, ensure
that critical analysis and training methods are shared across services, and increase the
quality and volume of evidence. However, such an initiative is not without potential
difficulties. These include unnecessary bureaucracy, problems in reconciling the very
different approaches to evidence in different services, and the dangers of setting
standards in the absence of procedures to ensure that they are adopted. It is in this
context that the other steps proposed in this article, for example the development of
Excellence Institutes in each public service, will play their part.
The conclusion in the Cross-cutting review of science and research (HM Treasury
et al, 2002) – that government departments should have chief scientific advisers
– is welcome: their responsibilities should include the management of evidence
production, distribution and application, assisted by ‘senior officials made responsible
for delivery of knowledge transfer goals and targets’ (p 64). However, the management
of evidence needs to go beyond the work of government departments and should
be coordinated across services. Priorities also include tightening the remit of the
Excellence Institutes and auditing services against the guidelines they publish. The
new Innovation and Knowledge Transfer Steering Groups in the Department of Trade
and Industry are also in a position to contribute in these areas.
The training of public service statisticians is a priority as the frontiers of science
are extended and evaluation methodology becomes more prescriptive and organised.
A principle in this set of proposals for reform is the evidence audit trail (Figure 3):
high-quality evidence must be generated in practice settings by practitioner-academics
or by teams of practitioners and scientists working together; accumulated and
summarised objectively; abridged for practitioners and their managers; and applied by
practitioners trained in the art of their profession as well as its scientific principles. The
components of such an audit trail are university public service schools and research
collaborations; organisations such as the Cochrane and Campbell Collaborations,
which review, classify and interpret evidence from all sources; the Excellence Institutes,
which distil and abridge cumulative evidence of effectiveness and cost-effectiveness;
and the practitioners and managers who apply it. Arrangements and personnel that
combine these functions are likely to be particularly cost-effective, for example public
service schools that encompass research, education and services – the medical school
model – and practitioner-managers, like police commanders, who can ensure that
only practice based on reliable evidence is implemented (see Figure 3). Research and
practice responsiveness will be fastest where communication between practitioners
and researchers is efficient, particularly where they are the same people. However,
robust governance in the management of evidence is essential at all levels, whether
evidence generation, dissemination and implementation are combined in the same
institution by the same people, or in different institutions. Many system components
are in place in some services but, for evidence systems to work, these components
need to be connected.

Figure 3: The production, distribution and application of evidence: the evidence audit trail
• Evidence producers: universities; government departments; Research Councils and charities; private sector
• Evidence interpreters: government departments; universities; Cochrane/Campbell Collaborations; Excellence Institutes; private sector; media
• Evidence distributors: government departments; universities; Excellence Institutes; professional bodies; private sector; media
• Evidence implementers: service practitioners and managers
Note: This audit trail is represented here in sequential form. It should not be inferred from this that steps are not and should not be integrated. For example, in a university public service school some steps can be integrated. Ethical committee approval, and research, educational and service governance are essential, however evidence and practice are managed.
The second major conclusion is that the national science base is not sufficiently
connected with public services. Scrutiny of recent policy proposals shows that reform
of research capability and capacity in the UK has almost always been considered
in the context of the economy (wealth creation through innovation) and ‘quality
of life’. Rarely, except in healthcare, has it been considered in terms of informing
public services. Previous efforts to support UK science have concentrated largely on
the development of science in isolation. While this rightly remains a major priority,
much more integration is necessary for public services to benefit fully from the
world-leading capability of UK science. Formalising and systematising the scientific
underpinning of UK public services should have no negative resource implications
for science but could have far-reaching benefits, as the contributions of scientists
ensure that methodological rigour increasingly characterises the management of
evidence. Furthermore, there are potential benefits for the scientific community
as demands for high-quality evidence by Excellence Institutes and public service
research collaborations rise, resulting in greater public and private sector investment.
The increased and more systematic production and distribution of evidence, with
interpretation of scientific findings for practitioners and public service users, could
also reduce the isolation of the science community, with consequent benefits in
improved public awareness and understanding of science.
The principal argument presented here, that evidence needs to be managed in
order for users to get the best and the best value out of public services, has largely
been substantiated. Above all, perhaps, in the context of public services whose quality
determines survival and well-being, social justice demands the universal application
of reliable evidence of effectiveness.
Acknowledgments
I thank David Richards, Daphne Shepherd, Peter Durning and three anonymous referees
for their very helpful comments.
Notes
1 Research Councils UK: www.rcuk.ac.uk/
2 National Centre for Policing Excellence: www.centrex.police.uk/
3 Social Care Institute for Excellence: www.scie.org.uk/
4 British National Formulary: www.bnf.org/bnf/
5 Clinical Evidence: www.clinicalevidence.com/ceweb/index.jsp
6 Washington State Institute for Public Policy: www.wsipp.wa.gov/
7 Campbell & Cochrane Economics Methods Group: www.med.uea.ac.uk/research/
research_econ/cochrane/cochrane_home.htm
8 Cochrane Collaboration: www.cochrane.org/
9 Campbell Collaboration: www.campbellcollaboration.org/
10 UK Clinical Research Collaboration: www.ukcrc.org/
11 National Police Improvement Agency: www.npia.police.uk/
12 National Institute for Health Research: www.nihr.ac.uk/
References
Aylin, P., Williams, S., Jarman, B. and Bottle, A. (2005) ‘Trends in day surgery rates’,
British Medical Journal, vol 331, no 7520, pp 803-4.
Boaz, A., Ashby, D. and Young, K. (2002) Systematic reviews: What have they got to offer
evidence based policy and practice?, Working Paper 2, London: ESRC UK Centre for
Evidence Based Policy and Practice.
Cabinet Office, Prime Minister’s Strategy Unit (2006) The UK government’s approach
to public service reform: A discussion paper, London: Cabinet Office, www.strategy.gov.
uk/downloads/work_areas/public_service_reform/sj_report.pdf
Cabinet Office, Strategic Policy Making Team (1999) Professional policy making for
the twenty first century, London: Cabinet Office, www.policyhub.gov.uk/docs/
profpolicymaking.pdf
Clarke, R. V. (1997) Situational crime prevention: Successful case studies (2nd edition),
Guilderland, NY: Harrow and Heston.
Coalition for Evidence-Based Policy (2003) Identifying and implementing educational
practices supported by rigorous evidence: A user friendly guide, Washington, DC: Council
for Excellence in Government, www.excelgov.org/usermedia/images/uploads/
PDFs/User-Friendly_Guide_12.2.03.pdf
Coalition for Evidence-Based Policy (2004) What constitutes strong evidence of a program’s
effectiveness, Washington, DC: Office for Management and Budget, http://coexgov.
securesites.net/admin/FormManager/filesuploading/OMB_memo_on_strong_
evidence.pdf
Cooksey, D. (2006) A review of UK health research funding, London: The Stationery
Office.
Davies, P. and Boruch, R. (2001) ‘The Campbell Collaboration’, British Medical Journal,
vol 323, no 7308, pp 294-5.
Farrington, D. P. and Welsh, B. C. (2005) ‘Randomized experiments in criminology:
what have we learned in the last two decades?’, Journal of Experimental Criminology,
vol 1, no 1, pp 9-38.
Farrington D. P. and Welsh, B. C. (2006) ‘A half century of randomized experiments
on crime and justice’, Crime and Justice, vol 34, pp 55-132.
Gallagher, J. (1981) ‘Models for policy analysis: child and family policy’, in R. Haskins
and J. Gallagher (eds) Models for analysis of social policy: An introduction, Norwood,
NJ: Ablex Publishing Company, pp 37-77.
HM Treasury, Department of Trade and Industry and Department for Education and
Skills (2004) Science and innovation: Working towards a 10-year investment framework,
London: HM Treasury, DTI and DfES, www.hm-treasury.gov.uk/media/F1761/
science_406.pdf
HM Treasury, Department for Education and Skills, Office of Science and Technology
and Department of Trade and Industry (2002) Cross-cutting review of science and
research: Final report, London: HM Treasury, DfES, OST and DTI, www.hm-treasury.
gov.uk/media/EDB/22/science_crosscutter.pdf
Holt, V. L., Kernic, M. A., Lumley, T., Wolf, M. E. and Rivara, F. P. (2002) ‘Civil
protection orders and risk of subsequent police-reported violence’, Journal of the
American Medical Association, vol 288, no 5, pp 589-94.
Home Office (1998) Guidance on statutory Crime and Disorder Partnerships, London:
Home Office Communication Directorate, www.nationalarchives.gov.uk/ERO/
records/ho415/1/cdact/cdaguid.htm#Contents
Ioannidis, J. P. A. (2005) ‘Contradicted and initially stronger effects in highly cited
clinical research’, Journal of the American Medical Association, vol 294, no 2, pp 218-28.
Jackson, L. (2006) Personal communication, 9 January, Science Strategy Team, Office
of Science and Technology.
Macdonald, G. M. (2003) Using systematic reviews to improve social care, SCIE Report
4, London: Social Care Institute for Excellence, www.scie.org.uk/publications/
knowledge.asp
Maxwell, C. D., Garner, J. H. and Fagan, J. A. (2001) The effects of arrest on intimate
partner violence: New evidence from the Spouse Assault Replication Program, Research
in Brief series, Washington, DC: National Institute of Justice, www.ncjrs.gov/
pdffiles1/nij/188199.pdf
MMC (Modernising Medical Careers) and UKCRC (UK Clinical Research
Collaboration) (2005) Medically- and dentally-qualified academic staff: Recommendations
for training the researchers and educators of the future, London: MMC and UKCRC,
www.mmc.nhs.uk/pages/academic
Mulgan, G. (2005) ‘Government, knowledge and the business of policy making: the
potential and limits of evidence-based policy’, Evidence & Policy, vol 1, no 2, pp
215-26.
Nutley, S. M., Percy-Smith, J. and Solesbury, W. (2003) Models of research impact: A
cross-sector review of literature and practice, London: Learning and Skills Development
Agency, www.lsrc.ac.uk/publications/index.asp
Oakley, A. (1998) ‘Living in two worlds’, British Medical Journal, vol 316, no 7129,
pp 482-3.
Office for National Statistics (2004) Research and experimental development (R&D)
statistics 2002, Newport: Office for National Statistics.
People, Science and Policy Ltd (2005) Evaluation: Practical guidelines, London:
Research Councils UK and Office of Science and Technology, www.rcuk.
ac.uk/news/evaluation.htm
Petrosino, A., Turpin-Petrosino, C. and Buehler, J. (2006) ‘Scared Straight and other
juvenile awareness programs’, in B. C. Welsh and D. Farrington (eds) Preventing
crime: What works for children, offenders, victims and places, Dordrecht: Springer, pp
87-103.
Pfeffer, J. and Sutton, R. I. (2000) The knowing–doing gap: How smart companies turn
knowledge into action, Boston, MA: Harvard Business School Press.
Popay, J. and Roen, K. (2003) Using evidence from diverse research designs, SCIE Report
3, London: Social Care Institute for Excellence, www.scie.org.uk/publications/
knowledge.asp
Roberts, G. (2002) SET for success: The supply of people with science, technology, engineering
and mathematics skills: The report of Sir Gareth Roberts’ review, London: HM Treasury,
www.hm-treasury.gov.uk/documents/enterprise_and_productivity/research_and_
enterprise/ent_res_roberts.cfm
Rose, S., Bisson, J. and Wessely, S. (2002) ‘Psychological debriefing for preventing
posttraumatic stress disorder (PTSD)’, Cochrane Database of Systematic Reviews 2002,
Issue 2, Art. No.: CD000560. DOI: 10.1002/14651858.CD000560.
Sheldon, T. A., Cullum, N., Dawson, D., Lankshear, A., Lowson, K., Watt, I., West, P.,
Wright, D. and Wright, J. (2004) ‘What’s the evidence that NICE guidance has
been implemented? Results from a national evaluation using time series analysis,
audit of patients’ notes, and interviews’, British Medical Journal, vol 329, no 7473,
pp 999-1006.
Shepherd, J. P. (2001) ‘Criminal deterrence as a public health strategy’, Lancet, vol
358, no 9294, pp 1717-22.
Shepherd, J. P. (2002) ‘Britain’s public services need practitioner-academics’, Times
Higher Education Supplement, 9 August, p 12.
Shepherd, J. P. (2003a) ‘Universities could help transform the police service’, Times
Higher Education Supplement, 25 July, p 14.
Shepherd, J. P. (2003b) ‘Who’d be a reformer?’, British Medical Journal, vol 327, no
7426, p 1295.
Shepherd, J. P. (2003c) ‘Explaining feast and famine in randomised field trials: medical
science and criminology compared’, Evaluation Review, vol 27, no 3, pp 290-315.
Shepherd, J. P. (2004) ‘A scientic approach to policing’, Police Review, 9 January,
p 15.
Walter, I., Nutley, S. and Davies, H. (2003) Research impact: A cross-sector review – literature
review, St Andrews: University of St Andrews, Department of Management, Research
Unit for Research Utilisation, www.ruru.ac.uk
Warburton, A. L. and Shepherd, J. P. (2000) ‘Effectiveness of toughened glassware in
terms of reducing injury in bars: a randomised controlled trial’, Injury Prevention,
vol 6, no 1, pp 36-40.
Weatherall, D. (1995) Science and the quiet art: Medical research and patient care, Oxford:
Oxford University Press, pp 10-11.
Wilson, D. B. and MacKenzie, D. L. (2006) ‘Boot camps’, in B. C. Welsh and D.
Farrington (eds) Preventing crime: What works for children, offenders, victims, and places,
Dordrecht: Springer, pp 73-87.
Jonathan Shepherd, Violence and Society Research Group,
Cardiff University, UK, shepherdjp@cardiff.ac.uk