Keep it complex

When knowledge is uncertain, experts should avoid pressures to simplify their advice. Render decision-makers accountable for decisions, says Andy Stirling.
Worldwide and across many fields, there lurks a hidden assumption about how scientific expertise can best serve society. Expert advice is often thought most useful to policy when it is presented as a single ‘definitive’ interpretation. Even when experts acknowledge uncertainty, they tend to do so in ways that reduce unknowns to measurable ‘risk’. In this way, policy-makers are encouraged to pursue (and claim) ‘science-based’ decisions. It is also not uncommon for senior scientists to assert that there is no alternative to some scientifically contestable policy. After years researching, and participating in, science advisory processes, I have come to the conclusion that this practice is misguided.
An overly narrow focus on risk is an inadequate response to incomplete knowledge. It leaves science advice vulnerable to the social dynamics of groups and to manipulation by political pressures seeking legitimacy, justification and blame management. When the intrinsically plural, conditional nature of knowledge is recognized, I believe that science advice can become more rigorous, robust and democratically accountable.
A rigorous definition of uncertainty can be traced back to the twentieth-century economist Frank Knight [1]. For Knight, “a measurable uncertainty, or ‘risk’ proper ... is so far different from an unmeasurable one that it is not in effect an uncertainty at all”. This is not just a matter of words, or even methods. The stakes are potentially much higher. A preoccupation with assessing risk means that policy-makers are denied exposure to dissenting interpretations and the possibility of downright surprise.
Of course, no-one can reliably foresee the unpredictable, but there are lessons to be learned from past mistakes. For example, the belated recognition that seemingly inert and benign halogenated hydrocarbons were interfering with the ozone layer. Or the slowness to acknowledge the possibility of novel transmission mechanisms for spongiform encephalopathies, in animal breeding and in the food chain. In the early stages, these sources of harm were not formally characterized as possible risks — they were ‘early warnings’ offered by dissenting voices. Policy recommendations that miss such warnings court overconfidence and error.
[Figure: A UK crop circle, created by activists to signify uncertainty over where genetic contamination can occur. Credit: G. Graf/Greenpeace]

The question is how to move away from this narrow focus on risk to broader and deeper understandings of incomplete knowledge. Many practical quantitative and qualitative methods already exist (see ‘Uncertainty matrix’), but political pressure and expert practice often prevent them being used to their full potential. Choosing between these methods requires a more rigorous approach to assessing incomplete knowledge, avoiding the temptation to treat every problem as a risk nail, to be reduced by a probabilistic hammer. Instead, experts should pay more attention to neglected areas of uncertainty (in Knight’s strict sense) as well as to deeper challenges of ambiguity and ignorance [2]. For policy-making purposes, the main difference between the ‘risk’ methods shown in the matrix and the rest is that the others discourage single ‘definitive’ policy interpretations.
ANY JUSTIFICATION
There are still times when ‘risk-based’ techniques are appropriate and can yield important information for policy. This can be so for consumer products in normal use, general road or airline-safety statistics, or the epidemiology of familiar diseases. Yet even in these seemingly familiar and straightforward areas, unforeseen possibilities, and over-reliance on aggregation, can undermine probabilistic assessments. There is a need for humility about ‘science-based’ decisions.
For example, consider the risk assessment of energy technologies. The other graphic (see ‘The perils of “science-based” advice’) summarizes 63 studies on the economic costs arising from health and environmental impacts of different sets of energy technologies. The aim of the studies is to help policy-makers identify the options that are likely to have the lowest impact. This is one of the most sophisticated and mature fields for quantitative risk-based comparisons. Individual policy reports commonly express their findings as if there were little room for doubt. Many of the studies present no — or tiny — uncertainty ranges. But taken together, these 63 studies tell a very different story [3] — one usually hidden from policy-makers. The discrepancies between equally authoritative, peer-reviewed studies span many orders of magnitude, and the overlapping uncertainty ranges can support almost any ranking order of technologies, justifying almost any policy decision as ‘science based’.

[Figure: The perils of ‘science-based’ advice. A survey of 63 peer-reviewed studies of health and environmental risks associated with energy technologies (coal, oil, gas, nuclear, hydro, wind, solar and biomass). Individual studies offer conclusions with surprisingly narrow uncertainty ranges, yet together the literature offers no clear consensus for policy-makers. For each technology the chart shows the lowest result, 25th percentile, median, 75th percentile and highest result for economic risk, in dollars per megawatt hour, on a logarithmic scale from 0.01 to 10,000. Source: ref. 3]
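The force of this point can be sketched in a few lines of code. Under the assumption of invented, illustrative cost ranges (not the actual data of ref. 3), the snippet below checks which rankings of four technologies can be defended by picking some point estimate from within each technology’s published range; with wide, overlapping intervals, essentially every ordering survives.

```python
from itertools import permutations

# Invented cost ranges (dollars per megawatt hour) spanning hypothetical
# studies; illustrative only, not the data summarized in ref. 3.
ranges = {
    "coal":    (0.10, 300.0),
    "gas":     (0.05, 150.0),
    "nuclear": (0.01, 130.0),
    "wind":    (0.01, 80.0),
}

def defensible(order):
    """True if one value can be picked inside each option's range so that
    costs are non-decreasing along `order`, making that ranking citable."""
    pick = float("-inf")
    for option in order:
        low, high = ranges[option]
        pick = max(pick, low)  # smallest admissible value given earlier picks
        if pick > high:
            return False
    return True

orders = list(permutations(ranges))
feasible = [o for o in orders if defensible(o)]
print(f"{len(feasible)} of {len(orders)} possible rankings are defensible")
```

With ranges this wide, the script reports that all 24 orderings are defensible: any of them can be presented as ‘science based’, which is precisely the problem the aggregate comparison reveals.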
This is not just a problem with quantitative analysis. Qualitative science advice is also usually presented in aggregated and consensual form: there is always pressure on expert committees to reach a ‘consensus’ opinion. This raises profound questions over what is most accurate and useful for policy. Is it a picture asserting an apparent consensus, even where one does not exist? Or would it be more helpful to set out a measured array of contrasting specialist views, explaining underlying reasons for different interpretations of the evidence? Whatever the political pressures for the former, surely the latter is more consistent both with scientific rigour and with democratic accountability?
I believe that the answer lies in supporting more plural and conditional methods for science advice (the non-risk quadrants shown in ‘Uncertainty matrix’). These are plural because they even-handedly illuminate a variety of alternative reasonable interpretations. And conditional because they explore explicitly, for each alternative, the associated questions, assumptions, values or intentions [4]. Under Knightian uncertainty, for instance, pessimistic and optimistic interpretations can be treated separately, each explicitly associated with assumptions, disciplines, values or interests so that these can be clearly appraised. It reminds experts that absence of evidence of harm is not the same as evidence of absence of harm. It also allows scenario analysis and the consideration of sensitivity, enabling more accountable evaluation. For example, it could allow experts to highlight conditional decision rules aimed at maximizing best or worst possible outcomes, or minimizing regrets [5].
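As an illustration of what such conditional decision rules look like in practice, here is a minimal sketch with invented payoffs (the options, scenarios and numbers are assumptions for demonstration only). Each rule is reported separately, keeping the recommendation explicitly tied to the optimistic, pessimistic or regret-averse stance behind it.

```python
# Invented net benefits for each option under two scenarios whose
# probabilities we decline to estimate (Knightian uncertainty).
payoffs = {
    "option A": {"benign": 9.0, "harsh": 1.0},
    "option B": {"benign": 6.0, "harsh": 4.0},
    "option C": {"benign": 5.0, "harsh": 5.0},
}
scenarios = ("benign", "harsh")

def maximax():
    """Optimistic rule: judge each option by its best-case outcome."""
    return max(payoffs, key=lambda o: max(payoffs[o].values()))

def maximin():
    """Pessimistic rule: judge each option by its worst-case outcome."""
    return max(payoffs, key=lambda o: min(payoffs[o].values()))

def minimax_regret():
    """Choose the option whose worst shortfall from the best achievable
    outcome in each scenario is smallest."""
    best = {s: max(p[s] for p in payoffs.values()) for s in scenarios}
    return min(payoffs, key=lambda o: max(best[s] - payoffs[o][s] for s in scenarios))

print("optimistic (maximax):  ", maximax())        # option A
print("pessimistic (maximin): ", maximin())        # option C
print("minimax regret:        ", minimax_regret()) # option B
```

That the three rules pick three different options is the point: the divergence exposes, rather than buries, the evaluative stance that a ‘single definitive’ recommendation would have had to smuggle in.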
The few sporadic examples of the application of this approach show that it can be practical. One particularly politicized and high-stakes context for expert policy advice is the setting of financial interest rates. The Bank of England’s Monetary Policy Committee, for example, describes its expert advisory process as a “two-way dialogue” — with a priority placed on public accountability. Great care is taken to inform the committee, not just of the results of formal analysis by the sponsoring bodies, but also of complex real-world conditions and perspectives. Reports detail contrasting recommendations by individual members and explain reasons for differences [6]. Why is this kind of thing not normal in science advice?
When scientists are faced with unmeasurable uncertainties, it is much more usual for a committee to spend hours negotiating a single interpretation across a spread of contending contexts, analyses and judgements. From my own experiences of standard-setting for toxic substances, it would often be more accurate and useful to accept these divergent expert interpretations and focus instead on documenting the reasons. In my view, concrete policy decisions could still be made — and possibly more efficiently. Moreover, the relationship between the decision and the available science would be clearer, and the inherently political dimensions more honest and accountable.
Problems of ambiguity arise when experts disagree over the framing of possible options, contexts, outcomes, benefits or harms. Like uncertainty, these cannot be reduced to risk analysis, and demand plural and conditional treatment. Such methods can highlight — rather than conceal — different regulatory questions, such as: “what is best?”, “what is safest?”, “is this safe?”, “is this tolerable?” or (as is often routine) “is this worse than what we have now?” Nobel-winning work in rational choice shows that when ambiguity rules, there is no guarantee, as a matter of logic, that scientific analysis will lead to a unique policy answer [7]. Consequently, definitive ‘science-based’ decisions are not just potentially misleading; they are a fundamental contradiction in terms.
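A toy sketch makes the logical point concrete. Assume three equally reasonable framings (invented here for illustration), each with an internally consistent ranking of three regulatory options; pairwise majority comparison then cycles, so no unique aggregate ordering exists.

```python
from itertools import combinations

# Three internally consistent rankings of the same options, each
# reflecting a different (hypothetical) framing of what matters.
rankings = {
    "what is safest?":           ["ban", "restrict", "approve"],
    "what is best for welfare?": ["restrict", "approve", "ban"],
    "is this worse than now?":   ["approve", "ban", "restrict"],
}

def majority_prefers(a, b):
    """True if most framings rank option a above option b."""
    votes = sum(r.index(a) < r.index(b) for r in rankings.values())
    return votes > len(rankings) / 2

for a, b in combinations(["ban", "restrict", "approve"], 2):
    winner, loser = (a, b) if majority_prefers(a, b) else (b, a)
    print(f"majority prefers {winner!r} over {loser!r}")

# Prints: ban over restrict, approve over ban, restrict over approve.
# Each pairwise comparison is well grounded, yet together they form a
# cycle: no unique 'best' option follows from the analysis.
```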
METHODS THAT WORK

One practical example of ways to be plural and conditional when considering questions and options, as well as in deriving answers, is multicriteria mapping. Other participatory and deliberative procedures include interactive modelling and scenario workshops, as well as Q-method and dissensus methods. Multicriteria mapping makes use of simple but rigorous scoring and weighting procedures to reveal the ways in which overall rankings depend on divergent ways of framing the possible options. In 1999, Unilever funded me and colleagues to use multicriteria mapping to study the perspectives of different leading science advisers on genetically modified (GM) crops [8]. The backing of this transnational company helped draw high-level UK government attention.
UNCERTAINTY MATRIX

A tool to catalyse nuanced deliberations: experts must look beyond risk (the first quadrant below) to ambiguity, uncertainty and ignorance, using quantitative and qualitative methods. The quadrants are defined by whether knowledge about probabilities and knowledge about possibilities are unproblematic or problematic.

RISK (probabilities unproblematic, possibilities unproblematic): risk assessment; optimizing models; expert consensus; cost–benefit analysis; aggregated beliefs.

UNCERTAINTY (probabilities problematic, possibilities unproblematic): interval analysis; scenario methods; sensitivity testing; decision rules; evaluative judgement.

AMBIGUITY (probabilities unproblematic, possibilities problematic): interactive modelling; participatory deliberation; focus and dissensus groups; multicriteria mapping; Q-method, repertory grid.

IGNORANCE (probabilities problematic, possibilities problematic): monitoring and surveillance; reversibility of effects; flexibility of commitments; adaptability, resilience; robustness, diversity.

Political pressures tend to push attention from the ‘plural conditional’ methods (the uncertainty, ambiguity and ignorance quadrants) towards the ‘single definitive’ methods (the risk quadrant).
A series of civil servants told me, in quite colourful terms, that results mapped out in plural, conditional fashion would be “absolutely no use” in practical policy-making. Yet when a chance finally emerged to present results to Mo Mowlam, the relevant cabinet minister, the reception was very positive. She immediately appreciated the value of having alternative perspectives laid out for a range of policy options. It turned out, in this case, that the real block to a plural, conditional approach was not the preferences of the decision-maker herself, but of some of those around her.
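For readers who want to see the mechanics, here is a minimal sketch of the scoring-and-weighting core of multicriteria mapping. The options, criteria, scores and weights are all invented (the real exercise in ref. 8 had participants define their own, with uncertainty ranges on every score); the structural point is that identical performance scores, combined under different framings’ weightings, produce different rankings, and the method reports that divergence rather than averaging it away.

```python
# Invented performance scores (0-10) for each option against each criterion;
# not the appraisals elicited from the science advisers in ref. 8.
scores = {
    "GM intensive": {"yield": 9, "environment": 3, "farmer autonomy": 4},
    "conventional": {"yield": 6, "environment": 5, "farmer autonomy": 6},
    "organic":      {"yield": 3, "environment": 9, "farmer autonomy": 8},
}

# Each framing weights the same criteria differently (weights sum to 1).
framings = {
    "productivity framing": {"yield": 0.7, "environment": 0.2, "farmer autonomy": 0.1},
    "ecological framing":   {"yield": 0.1, "environment": 0.7, "farmer autonomy": 0.2},
}

def ranking(weights):
    """Order options by their weighted overall score, best first."""
    total = lambda option: sum(w * scores[option][c] for c, w in weights.items())
    return sorted(scores, key=total, reverse=True)

for framing, weights in framings.items():
    print(f"{framing}: " + " > ".join(ranking(weights)))
# productivity framing: GM intensive > conventional > organic
# ecological framing:   organic > conventional > GM intensive
```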
In my experience, it is the single definitive representations of science that are most vulnerable to political manipulation. Plural, conditional approaches are not immune, but they can help make political pressures more visible. Indeed, this is what happened during another GM policy process in which I was involved: the 2003 UK science review of GM crops. Reporting included explicit discussion of uncertainties, gaps in knowledge and divergent views, and was described as “neither a red nor a green light” for GM technology. A benefit of this more open approach is that it helped GM proponents and critics to work more effectively together during the committee deliberations, without a high-stakes, ‘winner takes all’ dynamic. There was more space to express alternate interpretations, free from implications that one party or another was wrong. This is important in a highly politicized area such as GM science, where there are entrenched interests on both sides. Yet this unusual attempt to acknowledge uncertainty was not universally popular. Indeed, it was also the only occasion, to my knowledge, on which the minutes of a UK science advisory committee formally documented covert attempts to damage the career of one of its members (me, in this case) [9]. Perhaps for political — rather than scientific — reasons, this experiment towards plural and conditional advice has not been repeated.
A further argument for using more plural approaches arises from the state of ignorance, in which ‘we don’t know what we don’t know’. Ignorance typically looms in the choice of which of a range of feasible, economically viable future paths to support, either through funding or regulation, for emerging technologies. In a finite and globalizing world, no single path can be fully realized without detracting from the potential for others. Even in the most competitive consumer markets, for instance, development routinely ‘locks in’ to dominant technologies such as the QWERTY keyboard or VHS tape. The same is true of infrastructures, such as narrow-gauge rail, AC electricity or light-water reactors. This is not evidence of inevitability, but of the ‘crowding out’ of potential alternatives. Likewise, locking-in occurs in the prioritizing of certain areas of scientific enquiry over others. The paths taken by scientific and technological progress are far from inevitable. Deliberately or blindly, the direction of progress is inherently a matter of social choice [10].
A move towards plural, conditional advice would help avoid erroneous one-track, ‘race to the future’ visions of progress. Such advice corrects the fallacy that scepticism over a specific technology implies a general ‘anti-science’ sentiment. It defends against simplistic or cynical support for some particular favoured direction of change that is backed on the spurious grounds that it is somehow synonymous with ‘sound science’, or uniquely ‘pro innovation’.
Instead, plural, conditional advice helps enable mature and sophisticated policy debate on broader questions. How reversible are the effects of a particular path, if we learn later that it was ill-advised? How flexible are the associated industrial and institutional commitments, allowing us later to shift direction? How adaptable are the innovation systems? What part might be played by the deliberate pursuit of diverse approaches — to hedge ignorance, defend against lock-in or foster innovation — in any given area?
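The last of these questions can even be made semi-quantitative. The sketch below assumes a simple pairwise-disparity heuristic for portfolio diversity, an assumption imported for illustration rather than anything specified in this article: support spread in balanced fashion across mutually dissimilar options scores higher than support locked in to one dominant option.

```python
# Diversity heuristic (an assumption for illustration): sum over option
# pairs of disparity(i, j) * share_i * share_j. All values are invented.
disparity = {
    ("nuclear", "solar"): 0.8,  # how unalike two options are, 0-1
    ("nuclear", "wind"):  0.9,
    ("solar", "wind"):    0.3,
}

def diversity(shares):
    """Higher when support is balanced across strongly disparate options."""
    return sum(d * shares[a] * shares[b] for (a, b), d in disparity.items())

locked_in = {"nuclear": 0.90, "wind": 0.05, "solar": 0.05}
balanced  = {"nuclear": 1 / 3, "wind": 1 / 3, "solar": 1 / 3}

print(f"locked-in portfolio diversity: {diversity(locked_in):.3f}")  # 0.077
print(f"balanced portfolio diversity:  {diversity(balanced):.3f}")   # 0.222
```

Hedging ignorance then becomes an explicit, arguable trade-off between diversity of this kind and conventional cost or performance appraisal.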
Thus, such advice provides the basis for a more equal partnership between social and natural science in policy advice. Plural and conditional advice may also help resolve some polarized fault-lines in current debates about science in policy. It shows how we might better: integrate quantitative and qualitative methods; articulate ‘risk assessment’ and ‘risk management’; and reconcile ‘science-based’ and ‘precautionary appraisal’ methods.
A move towards plural and conditional expert advice is not a panacea. It cannot promise escape from the deep intractabilities of uncertainty, the perils of group dynamics or the perturbing effects of power. It differs from prevailing approaches in that it makes these influences more rigorously explicit and democratically accountable.
Andy Stirling is research director at SPRU (Science and Technology Policy Research) and co-directs the joint Centre for Social, Technological and Environmental Pathways to Sustainability at Sussex University, Falmer, Brighton BN1 9QE, UK.
e-mail: a.c.stirling@sussex.ac.uk
1. Knight, F. Risk, Uncertainty and Profit (Houghton Mifflin, 1921).
2. Wynne, B. Glob. Environ. Change 2, 111–127 (1992).
3. Stirling, A. Ann. NY Acad. Sci. 1128, 95–110 (2008).
4. Stirling, A. Sci. Technol. Hum. Valu. 33, 262–294 (2008).
5. Farman, J. in Late Lessons from Early Warnings: The Precautionary Principle 1896–2000 (eds Harremoës, P. et al.) 76–83 (European Environment Agency, 2001).
6. Treasury Committee Inquiry into the Monetary Policy Committee of the Bank of England: Ten Years On (Bank of England, 2007); available at go.nature.com/v2h4al
7. Leach, M., Scoones, I. & Stirling, A. Dynamic Sustainabilities: Technology, Environment, Social Justice (Earthscan, 2010).
8. Stirling, A. & Mayer, S. Environ. Plann. C 19, 529–555 (2001).
9. UK GM Science Review Panel Minutes of the Seventh Meeting (2003); available at go.nature.com/lxbmpb
10. ESRC Centre on Social, Technological and Environmental Pathways to Sustainability A New Manifesto (Univ. Sussex, 2010); available at go.nature.com/znqakg
Farman, J. in Late Lesson from Early Warnings: the precautionary principle 1898-2000 (eds Harremoës, P. et al.) 76–83 (European Environment Agency, 2001).