Keep it complex

When knowledge is uncertain, experts should avoid pressures to simplify their advice. Render decision-makers accountable for decisions, says Andy Stirling.

Nature 468, 1029–1031 (23/30 December 2010)
COMMENT
Worldwide and across many fields, there lurks a hidden assumption about how scientific expertise can best serve society. Expert advice is often thought most useful to policy when it is presented as a single 'definitive' interpretation.
Even when experts acknowledge uncertainty, they tend to do so in ways that reduce unknowns to measurable 'risk'. In this way, policy-makers are encouraged to pursue (and claim) 'science-based' decisions. It is also not uncommon for senior scientists to assert that there is no alternative to some scientifically contestable policy. After years researching — and participating in — science advisory processes, I have come to the conclusion that this practice is misguided.
An overly narrow focus on risk is an inad-
equate response to incomplete knowledge. It
leaves science advice vulnerable to the social
dynamics of groups and to manipulation
by political pressures seeking legitimacy,
justification and blame management. When
the intrinsically plural, conditional nature
of knowledge is recognized, I believe that
science advice can become more rigorous,
robust and democratically accountable.
A rigorous definition of uncertainty can be traced back to the twentieth-century economist Frank Knight [1]. For Knight, "a measurable uncertainty, or 'risk' proper ... is so far different from an unmeasurable one that it is not in effect an uncertainty at all". This is
not just a matter of words, or even methods.
The stakes are potentially much higher. A
preoccupation with assessing risk means
that policy-makers are denied exposure to
dissenting interpretations and the possibility
of downright surprise.
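Knight's distinction can be sketched in a few lines of code. All numbers below are invented for illustration: under measurable risk, known probabilities permit a single expected value; under Knightian uncertainty, only the spread of possible outcomes can honestly be reported.

```python
# Hypothetical payoffs for a single policy option; all numbers invented.
outcomes = [-10.0, 2.0, 5.0]

# Under 'risk' in Knight's sense, probabilities are measurable, so a
# single expected value can be computed.
probs = [0.1, 0.6, 0.3]
expected = sum(p * x for p, x in zip(probs, outcomes))
print(expected)  # ~1.7

# Under Knightian uncertainty the probabilities are unmeasurable; the
# honest summary is the interval of possible outcomes, not a point value.
bounds = (min(outcomes), max(outcomes))
print(bounds)  # (-10.0, 5.0)
```

The point of the sketch is that the second summary refuses to collapse the unknowns into one number, which is precisely the information a risk-only analysis throws away.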
Of course, no-one can reliably foresee
the unpredictable, but there are lessons to
be learned from past mistakes. For example,
the belated recognition that seemingly inert
and benign halogenated hydrocarbons were
interfering with the ozone layer. Or the slow-
ness to acknowledge the possibility of novel
transmission mechanisms for spongiform
encephalopathies, in animal breeding and
in the food chain. In the early stages, these
sources of harm were not formally charac-
terized as possible risks — they were 'early warnings' offered by dissenting voices. Policy
recommendations that miss such warnings
court overconfidence and error.
[Figure: A UK crop circle, created by activists to signify uncertainty over where genetic contamination can occur. Credit: G. Graf/Greenpeace]

The question is how to move away from this narrow focus on risk
and deeper understandings of incomplete
knowledge. Many practical quantitative
and qualitative methods already exist (see
‘Uncertainty matrix’), but political pres-
sure and expert practice often prevent them
being used to their full potential. Choosing
between these methods requires a more
rigorous approach to assessing incomplete
knowledge, avoiding the temptation to treat
every problem as a risk nail, to be reduced
by a probabilistic hammer. Instead, experts
should pay more attention to neglected areas
of uncertainty (in Knight's strict sense) as well as to deeper challenges of ambiguity and ignorance [2]. For policy-making purposes, the main difference between the 'risk' methods shown in the matrix and the rest is that the others discourage single 'definitive' policy interpretations.
ANY JUSTIFICATION
There are still times when 'risk-based' techniques are appropriate and can yield
important information for policy. This can
be so for consumer products in normal use,
general road or airline-safety statistics, or
the epidemiology of familiar diseases. Yet
even in these seemingly familiar and
straightforward areas, unforeseen pos-
sibilities, and over-reliance on aggre-
gation, can undermine probabilistic
assessments. There is a need for humil-
ity about science-based decisions.
For example, consider the risk
assessment of energy technologies.
The other graphic (see 'The perils of science-based advice') summarizes
63 studies on the economic costs aris-
ing from health and environmental
impacts of different sets of energy tech-
nologies. The aim of the studies is to
help policy-makers identify the options
that are likely to have the lowest impact.
This is one of the most sophisticated
and mature fields for quantitative risk-
based comparisons. Individual policy
reports commonly express their find-
ings as if there were little room for
doubt. Many of the studies present no
— or tiny — uncertainty ranges. But
taken together, these 63 studies tell a very different story [3] — one usually hidden from policy-makers. The discrepancies between equally authoritative,
peer-reviewed studies span many orders of
magnitude, and the overlapping uncertainty
ranges can support almost any ranking order
of technologies, justifying almost any policy
decision as science based.
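The logical point can be illustrated with a toy computation. The cost ranges below are invented to mimic the kind of overlap the survey describes; they are not figures from ref. 3. When intervals overlap this much, every ordering of the options is simultaneously defensible.

```python
from itertools import permutations

# Invented cost ranges (dollars per megawatt hour); NOT data from ref. 3.
ranges = {
    "gas":     (0.1, 50.0),
    "nuclear": (0.05, 120.0),
    "wind":    (0.01, 30.0),
}

def supportable(order):
    """A cheapest-first ranking is supportable if, for every pair, the
    earlier option's lowest estimate does not exceed the later option's
    highest estimate."""
    return all(ranges[order[i]][0] <= ranges[order[j]][1]
               for i in range(len(order))
               for j in range(i + 1, len(order)))

rankings = [order for order in permutations(ranges) if supportable(order)]
print(len(rankings))  # all 6 possible orderings of 3 options
```

With these hypothetical intervals, all six rankings survive, so a report quoting any one of them as 'the' science-based ordering is choosing among equally defensible answers.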
This is not just a problem with quantita-
tive analysis. Qualitative science advice is
also usually presented in aggregated and
consensual form: there is always pressure
on expert committees to reach a 'consensus'
opinion. This raises profound questions over
what is most accurate and useful for policy. Is
it a picture asserting an apparent consensus,
even where one does not exist? Or would it
be more helpful to set out a measured array
of contrasting specialist views, explaining
underlying reasons for different interpreta-
tions of the evidence? Whatever the political
pressures for the former, surely the latter is
more consistent both with scientific rigour
and with democratic accountability?
I believe that the answer lies in supporting
more plural and conditional methods for sci-
ence advice (the non-risk quadrants shown
in ‘Uncertainty matrix’). These are plural
because they even-handedly illuminate a
variety of alternative reasonable interpreta-
tions. And conditional because they explore
explicitly, for each alternative, the associated questions, assumptions, values or intentions [4]. Under Knightian uncertainty, for
instance, pessimistic and optimistic inter-
pretations can be treated separately, each
explicitly associated with assumptions, dis-
ciplines, values or interests so that these can
be clearly appraised. It reminds experts that
absence of evidence of harm is not the same
as evidence of absence of harm. It also allows
scenario analysis and the consideration of
sensitivity, enabling more accountable evalu-
ation. For example, it could allow experts to
highlight conditional decision rules aimed at
maximizing best or worst possible outcomes,
or minimizing regrets [5].
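These conditional decision rules, maximin, maximax and minimax regret, can be made concrete with a toy payoff table (all values invented). Tellingly, the three rules can each pick a different option, which is exactly why the choice of rule is a value judgement worth exposing rather than burying.

```python
# Invented payoff table: options (rows) against possible futures (columns)
# whose probabilities are unknown, i.e. Knightian uncertainty.
payoffs = {
    "A": [4, 4, 4],
    "B": [0, 2, 9],
    "C": [1, 5, 6],
}

def maximin(table):
    # Pessimistic rule: maximize the worst possible outcome.
    return max(table, key=lambda o: min(table[o]))

def maximax(table):
    # Optimistic rule: maximize the best possible outcome.
    return max(table, key=lambda o: max(table[o]))

def minimax_regret(table):
    # Regret = shortfall from the best achievable payoff in each future;
    # choose the option whose worst-case regret is smallest.
    best = [max(col) for col in zip(*table.values())]
    return min(table, key=lambda o: max(b - x for b, x in zip(best, table[o])))

print(maximin(payoffs), maximax(payoffs), minimax_regret(payoffs))  # A B C
```

Here the pessimist picks A, the optimist picks B and the regret-minimizer picks C: three rational answers from one table, each resting on a different, explicit stance towards the unknown.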
The few sporadic examples of the appli-
cation of this approach show that it can be
practical. One particularly politicized and
high-stakes context for expert policy advice
is the setting of financial interest rates. The
Bank of England’s Monetary Policy Commit-
tee, for example, describes its expert advisory
process as a “two-way dialogue” — with a
priority placed on public accountability.
Great care is taken to inform the commit-
tee, not just of the results of formal analysis
by the sponsoring bodies, but also of com-
plex real-world conditions and perspectives.
Reports detail contrasting recommendations
by individual members and explain reasons
for differences [6]. Why is this kind of thing not
normal in science advice?
When scientists are faced with unmeas-
urable uncertainties, it is much more usual
for a committee to spend hours negotiating
a single interpretation across a spread of con-
tending contexts, analyses and judgements.
From my own experiences of standard-
setting for toxic substances, it would often
be more accurate and useful to accept these
divergent expert interpretations and focus
instead on documenting the reasons. In my
view, concrete policy decisions could still
be made — and possibly more efficiently.
Moreover, the relationship between the
decision and the available science would be
clearer and the inherently political dimen-
sions more honest and accountable.
Problems of ambiguity arise when experts
disagree over the framing of possible options,
contexts, outcomes, benefits or harms.
Like uncertainty, these cannot be
reduced to risk analysis, and demand
plural and conditional treatment. Such
methods can highlight — rather than
conceal — different regulatory ques-
tions, such as: “what is best?”, “what
is safest?”, “is this safe?”, “is this toler-
able?” or (as is often routine) “is this
worse than what we have now?” Nobel-
winning work in rational choice shows
that when ambiguity rules there is no
guarantee, as a matter of logic, that
scientific analysis will lead to a unique
policy answer [7]. Consequently, definitive science-based decisions are not just potentially misleading — they are a fundamental contradiction in terms.
METHODS THAT WORK
One practical example of ways to be
plural and conditional when consid-
ering questions and options, as well
as in deriving answers, is multicri-
teria mapping. Other participatory
and deliberative procedures include
interactive modelling and scenario work-
shops, as well as Q-method and dissensus methods. Multicriteria mapping makes use
of simple but rigorous scoring and weight-
ing procedures to reveal the ways in which
overall rankings depend on divergent ways
of framing the possible options. In 1999,
Unilever funded me and colleagues to use
multicriteria mapping to study the perspec-
tives of different leading science advisers on
genetically modified (GM) crops [8]. The backing of this transnational company helped draw high-level UK government attention.

UNCERTAINTY MATRIX
A tool to catalyse nuanced deliberations: experts must look beyond risk (top-left quadrant) to ambiguity, uncertainty and ignorance, using quantitative and qualitative methods. The matrix crosses knowledge about probabilities (unproblematic or problematic) with knowledge about possibilities (unproblematic or problematic):
• RISK (probabilities and possibilities both unproblematic): risk assessment; optimizing models; expert consensus; cost–benefit analysis; aggregated beliefs.
• UNCERTAINTY (probabilities problematic, possibilities unproblematic): interval analysis; scenario methods; sensitivity testing; decision rules; evaluative judgement.
• AMBIGUITY (probabilities unproblematic, possibilities problematic): interactive modelling; participatory deliberation; focus and dissensus groups; multicriteria mapping; Q-method; repertory grid.
• IGNORANCE (probabilities and possibilities both problematic): monitoring and surveillance; reversibility of effects; flexibility of commitments; adaptability and resilience; robustness and diversity.
Political pressures tend to push attention from 'plural conditional' methods (uncertainty, ambiguity and ignorance) towards 'single definitive' (risk) methods.
A series of civil servants told me, in quite
colourful terms, that results mapped out in
plural, conditional fashion would be "absolutely no use" in practical policy-making. Yet
when a chance finally emerged to present
results to Mo Mowlam, the relevant cabinet
minister, the reception was very positive.
She immediately appreciated the value of
having alternative perspectives laid out for a
range of policy options. It turned out, in this case, that the real block to a plural, conditional approach was not the preferences of
the decision-maker herself, but of some of
those around her.
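The weighting-and-scoring core of multicriteria mapping can be sketched in a few lines. The options, criteria, scores and weights below are all invented for illustration, not drawn from the Unilever study: the same performance scores yield different overall rankings under different framings.

```python
# Invented scores for three hypothetical cropping options against three
# criteria (yield, environment, cost); higher is better throughout.
scores = {
    "GM":           [9, 3, 7],
    "conventional": [6, 5, 6],
    "organic":      [3, 9, 4],
}

def rank(weights):
    # Weighted-sum aggregation, best option first.
    overall = {o: sum(w * s for w, s in zip(weights, scores[o]))
               for o in scores}
    return sorted(overall, key=overall.get, reverse=True)

print(rank([0.6, 0.2, 0.2]))  # productivity-led framing
print(rank([0.2, 0.6, 0.2]))  # environment-led framing
```

Under the productivity-led weighting the invented GM option comes first; under the environment-led weighting the ranking reverses completely. Mapping this dependence openly, rather than reporting one ranking, is the point of the method.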
In my experience, it is the single defini-
tive representations of science that are most
vulnerable to political manipulation. Plural,
conditional approaches are not immune, but
they can help make political pressures more
visible. Indeed, this is what happened dur-
ing another GM policy process in which I
was involved: the 2003 UK science review of
GM crops. Reporting included explicit dis-
cussion of uncertainties, gaps in knowledge
and divergent views, and was described as 'neither a red nor a green light' for GM technology. A benefit of this more open approach
is that it helped GM proponents and critics
to work more effectively together during the
committee deliberations, without a high-stakes, 'winner takes all' dynamic. There was more space to express alternative interpretations, free from implications that one party
or another was wrong. This is important in a
highly politicized area such as GM science,
where there are entrenched interests on both
sides. Yet this unusual attempt to acknowl-
edge uncertainty was not universally popu-
lar. Indeed, it was also the only occasion, to
my knowledge, on which the minutes of a
UK science advisory committee formally
documented covert attempts to damage
the career of one of its members (me, in this case) [9]. Perhaps for political — rather
than scientific — reasons, this experiment
towards plural and conditional advice has
not been repeated.
A further argument for using more plural approaches arises from the state of ignorance, in which 'we don't know what we don't know'. Ignorance typically looms in the choice of which of a range of feasible, economically viable future paths to support — either through funding or regulation — for emerging technologies. In a finite and
globalizing world, no single path can be fully
realized without detracting from the poten-
tial for others. Even in the most competitive
consumer markets, for
instance, development
routinely ‘locks in’ to
dominant technologies
such as the QWERTY
keyboard or VHS tape.
The same is true of
infrastructures, such
as narrow-gauge rail,
AC electricity or light-
water reactors. This is not evidence of inevitability, but of the 'crowding out' of potential alternatives. Likewise, locking-in occurs in
the prioritizing of certain areas of scientific
enquiry over others. The paths taken by
scientific and technological progress are far
from inevitable. Deliberately or blindly, the
direction of progress is inherently a matter
of social choice [10].
A move towards plural, conditional advice
would help avoid erroneous one-track, 'race to the future' visions of progress. Such
advice corrects the fallacy that scepticism
over a specific technology implies a general 'anti-science' sentiment. It defends against
simplistic or cynical support for some par-
ticular favoured direction of change that is
backed on the spurious grounds that it is
somehow synonymous with 'sound science' or uniquely 'pro-innovation'.
Instead, plural, conditional advice helps
enable mature and sophisticated policy
debate on broader questions. How reversible
are the effects of a particular path, if we learn
later that it was ill-advised? How flexible are
the associated industrial and institutional
commitments, allowing us later to shift
direction? How adaptable are the innovation
systems? What part might be played by the
deliberate pursuit of diverse approaches —
to hedge ignorance, defend against lock-in
or foster innovation — in any given area?
Thus, such advice provides the basis for
a more-equal partnership between social
and natural science in policy advice. Plural
and conditional advice may also help resolve
some polarized fault-lines in current debates
about science in policy. It shows how we
might better: integrate quantitative and qualitative methods; articulate 'risk assessment' and 'risk management'; and reconcile 'science-based' and 'precautionary' appraisal methods.
A move towards plural and conditional
expert advice is not a panacea. It cannot
promise escape from the deep intractabilities
of uncertainty, the perils of group dynamics
or the perturbing effects of power. It differs
from prevailing approaches in that it makes
these influences more rigorously explicit and
democratically accountable.
Andy Stirling is research director at SPRU
(Science and Technology Policy Research)
and co-directs the joint Centre for Social, Technological and Environmental Pathways to Sustainability at Sussex University, Falmer,
Brighton BN1 9QE, UK.
e-mail: a.c.stirling@sussex.ac.uk
1. Knight, F. Risk, Uncertainty and Profit (Houghton
Mifflin, 1921).
2. Wynne, B. Glob. Environ. Change 2, 111–127
(1992).
3. Stirling, A. Ann. NY Acad. Sci. 1128, 95–110
(2008).
4. Stirling, A. Sci. Technol. Hum. Valu. 33, 262–294
(2008).
5. Farman, J. in Late Lessons from Early Warnings: The Precautionary Principle 1896–2000 (eds Harremoës, P. et al.) 76–83 (European Environment Agency, 2001).
6. Treasury Committee Inquiry into the Monetary
Policy Committee of the Bank of England: Ten
Years On (Bank of England, 2007); available at
go.nature.com/v2h4al
7. Leach, M., Scoones, I. & Stirling, A. Dynamic
Sustainabilities: Technology, Environment, Social
Justice (Earthscan, 2010).
8. Stirling, A. & Mayer, S. Environ. Plann. C 19,
529–555 (2001).
9. UK GM Science Review Panel Minutes of the
Seventh Meeting (2003); available at go.nature.
com/lxbmpb
10. ESRC Centre on Social, Technological and Environmental Pathways to Sustainability A New Manifesto (Univ. Sussex, 2010); available at go.nature.com/znqakg
THE PERILS OF 'SCIENCE-BASED' ADVICE
[Figure: a survey of 63 peer-reviewed studies of health and environmental risks associated with energy technologies. Individual studies offer conclusions with surprisingly narrow uncertainty ranges, yet together the literature offers no clear consensus for policy-makers. For each energy option (coal, oil, gas, nuclear, hydro, wind, solar and biomass), the chart shows the number of studies and the lowest result, 25th percentile, median, 75th percentile and highest result of estimated economic risk, in dollars per megawatt hour, on a logarithmic scale from 0.01 to 10,000. Source: ref. 3]