Research Policy 42 (2013) 1568–1580

Contents lists available at ScienceDirect

Research Policy

journal homepage: www.elsevier.com/locate/respol
Developing a framework for responsible innovation

Jack Stilgoe a,∗, Richard Owen b,1, Phil Macnaghten c,d

a University of Exeter Business School/Department of Science and Technology Studies, University College London, Gower Street, London WC1E 6BT, UK
b University of Exeter Business School, Rennes Drive, Exeter EX4 4PU, UK
c Department of Geography, Science Laboratories, Durham University, South Road, Durham DH1 3LE, UK
d Department of Science and Technology Policy, Institute of Geosciences, P.O. Box 6152, State University of Campinas UNICAMP, 13083-970 Campinas, SP, Brazil
Article info

Article history:
Received 16 July 2012
Received in revised form 7 May 2013
Accepted 17 May 2013
Available online 13 June 2013

Keywords:
Responsible innovation
Governance
Emerging technologies
Ethics
Geoengineering
Abstract

The governance of emerging science and innovation is a major challenge for contemporary democracies. In this paper we present a framework for understanding and supporting efforts aimed at 'responsible innovation'. The framework was developed in part through work with one of the first major research projects in the controversial area of geoengineering, funded by the UK Research Councils. We describe this case study, and how this became a location to articulate and explore four integrated dimensions of responsible innovation: anticipation, reflexivity, inclusion and responsiveness. Although the framework for responsible innovation was designed for use by the UK Research Councils and the scientific communities they support, we argue that it has more general application and relevance.

© 2013 The Authors. Published by Elsevier B.V. All rights reserved.
1. Introduction

1.1. Responsibility, science and innovation

Responsible innovation is an idea that is both old and new. Responsibility has always been an important theme of research and innovation practice, although how it has been framed has varied with time and place. Francis Bacon's imperative to support science 'for the relief of man's estate', the institutionalisation and professionalisation of science from the 17th century onwards, Vannevar Bush's (1945) 'Endless Frontier', JD Bernal's (1939) arguments for science in the service of society and Michael Polanyi's (1962) 'Republic of Science' counter-argument have all contained particular notions of responsibility.
Science has been conventionally invoked by policy as emancipatory. This has allowed scientists and innovators considerable freedom from political accountability. From this perspective, the role responsibilities of scientists to produce reliable knowledge and their wider moral responsibilities to society are imagined to be conflicted. The perceived high value of knowledge to society means that such role responsibilities typically trump any wider social or moral obligations (Douglas, 2003). Although frequent objections from university scientists suggest a permanent assault on their autonomy, much of the constitution of Polanyi's (1962) self-governing 'Republic of Science' survives to this day.

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.
∗ Corresponding author. Tel.: +44 020 7679 7197.
E-mail addresses: j.stilgoe@ucl.ac.uk, jackstilgoe@gmail.com (J. Stilgoe), r.j.owen@exeter.ac.uk (R. Owen), p.m.macnaghten@durham.ac.uk (P. Macnaghten).
1 Tel.: +44 01392 723458.
In the second half of the 20th century, as science and innovation have become increasingly intertwined and formalised within research policy (Kearnes and Wienroth, 2011), and as the power of technology to produce both benefit and harm has become clearer, debates concerning responsibility have broadened (Jonas, 1984; Collingridge, 1980; Beck, 1992; Groves, 2006). We have seen recognition and negotiation of the responsibilities of scientists beyond those associated with their professional roles (e.g. Douglas, 2003; Mitcham, 2003). We have seen scientists' own ideas of 'research integrity' change in response to societal concerns (Mitcham, 2003; Steneck, 2006). In the 1970s, biologists in the nascent field of recombinant DNA research sought to 'take responsibility' for the possible hazards their research might unleash, with a meeting at Asilomar in 1975 and a subsequent moratorium.² Concerns about the 'dual use' of emerging technologies and the limits of self-regulation, visible in physicists' agonising about nuclear fission prior to the Manhattan project (Weart, 1976), resurfaced in 2012 with the recent controversy over the publishing of potentially dangerous research on flu viruses (Kaiser and Moreno, 2012). The negotiation of responsibility between practicing scientists, innovators and the outside world remains an important and contested area of debate to this day.

² We should point out that this meeting was criticised, both at the time (Rogers, 1975) and in later scholarship (Wright, 2001; Nelkin, 2001), as being motivated by an attempt to escape top-down regulation rather than to 'take responsibility'.

0048-7333/$ – see front matter © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
http://dx.doi.org/10.1016/j.respol.2013.05.008
Research in Science and Technology Studies (STS) suggests that conceptions of responsibility should build on the understanding that science and technology are not only technically but also socially and politically constituted (e.g. Winner, 1977). Latour (2008) suggests that science does not straightforwardly reveal reality through techniques of simplification and purification aimed at further mastery. As Callon et al. (2009) point out, science and technology can, paradoxically, add to our sense of uncertainty and ignorance. They tend to produce "a continuous movement toward a greater and greater level of attachments of things and people at an ever expanding scale and at an ever increasing degree of intimacy" (Latour, 2008, p. 4, italics in original). These observations suggest that unforeseen impacts (potentially harmful, potentially transformative) will be not just possible but probable (Hacking, 1986).
Responsibility in governance has historically been concerned with the 'products' of science and innovation, particularly impacts that are later found to be unacceptable or harmful to society or the environment. Recognition of the limitations of governance by market choice has led to the progressive introduction of post hoc, and often risk-based, regulation. This has created a well-established division of labour that reflects a consequentialist framing of responsibility, as accountability or liability (Pellizzoni, 2004; Grinbaum and Groves, 2013). With innovation, however, the past and present do not provide a reasonable guide to the future (Adam and Groves, 2011), so such retrospective accounts of responsibility are inherently limited. We face a dilemma of control (Collingridge, 1980), in that we lack the evidence on which to govern technologies before pathologies of path dependency (David, 2001), technological lock-in (Arthur, 1989), 'entrenchment' (Collingridge, 1980) and closure (Stirling, 2007) set in. We have (pre-)cautionary tales of risks whose effects did not materialise for many years, where potential threats were foreseen but ignored or where only certain risks were considered relevant (Hoffmann-Riem and Wynne, 2002; EEA, 2001, 2013). Governance processes, often premised on formal risk assessment, have done little to identify in advance many of the most profound impacts that we have experienced through innovation, with the 2008 financial crisis being the most disruptive recent example (Muniesa and Lenglet, 2013). Bioethics, another major governance response, has drawn criticism for privileging individual ethical values such as autonomy over those such as solidarity that might lead to a genuine 'public ethics' (Nuffield Council on Bioethics, 2012; also Prainsack and Buyx, 2012) and, in its consequentialist version, for serving to bolster the narrow instrumental expectations of innovators in some areas (Hedgecoe, 2010).

Callon et al. (2009) use the metaphor of science and technology 'overflowing' the boundaries of existing scientific, regulatory and institutional frameworks. They point to the need for new 'hybrid forums' that will help our democracies to be "enriched, expanded, extended and ... more able to absorb the debates and controversies surrounding science and technology" (Callon et al., 2009, p. 9). Such controversies have demonstrated that public concerns cannot be reduced to questions of risk, but rather encompass a range of concerns relating to the purposes and motivations of research (Grove-White et al., 2000; Wynne, 2002; Grove-White et al., 1997; Macnaghten and Szerszynski, 2013; Stilgoe, 2011), joining a stream of policy debate about the directions of innovation (Smith et al., 2005; Stirling, 2008; Morlacchi and Martin, 2009; Fisher et al., 2006; Flanagan et al., 2011). Yet, despite efforts at enlarging participation (see, for example, RCEP, 1998; House of Lords, 2000; Wilsdon and Willis, 2004), current forms of regulatory governance offer little scope for broad ethical reflection on the purposes of science or innovation.
1.2. A new scientific governance?

One alternative to a consequentialist model of responsibility has been to succumb to moral luck (Williams, 1981), to hope that an appeal to unpredictability and an inability to 'reasonably foresee' will allow us to escape moral accountability for our actions. Dissatisfaction with both this approach and risk-based regulation has moved attention away from accountability, liability and evidence towards those future-oriented dimensions of responsibility (care and responsiveness) that offer greater potential to accommodate uncertainty and allow reflection on purposes and values (Jonas, 1984; Richardson, 1999; Pellizzoni, 2004; Groves, 2006; Adam and Groves, 2011).

Emerging technologies typically fall into what Hajer (2003) calls an 'institutional void'. There are few agreed structures or rules that govern them. They are therefore emblematic of the move from old models of governing to more decentralised and open-ended governance, which takes place in new places (markets, networks and partnerships) as well as conventional policy and politics (Hajer and Wagenaar, 2003).
A number of multi-level, non-regulatory forms of science and innovation governance have taken this forward-looking view of responsibility, building on insights from STS that highlight the social and political choices that stabilise particular innovations (Williams and Edge, 1996; Pinch and Bijker, 1984; Winner, 1986). New models of anticipatory governance (Barben et al., 2008; Karinen and Guston, 2010), Constructive, Real-Time and other forms of technology assessment (Rip et al., 1995; Guston and Sarewitz, 2002; Grin and Grunwald, 2000), upstream engagement (Wynne, 2002; Wilsdon and Willis, 2004), value-sensitive design (Friedman, 1996; van den Hoven et al., 2012) and socio-technical integration (Fisher et al., 2006; Schuurbiers, 2011) have emerged. These have been complemented by policy instruments such as normative codes of conduct (see, for example, European Commission, 2008), standards, certifications and accreditations, running alongside expert reports, technology assessments and strategic roadmaps.

Such initiatives have, to varying degrees, attempted to introduce broader ethical reflection into the scientific and innovation process, breaking the existing moral division of labour described above. They have attempted to open up science and innovation (Stirling, 2008) to a wider range of inputs, notably through the creation of new spaces of 'public dialogue' (Irwin, 2006).
The other important aspect of a forward-looking view of responsibility in science and innovation is that it is shared (Richardson, 1999; Mitcham, 2003; Von Schomberg, 2007). The unpredictability of innovation is inherently linked to its collective nature. Following Callon's account of innovation as 'society in the making' (Callon, 1987), we can see that implications are 'systemic', coming from the interplay of the technical and the social (Hellström, 2003). This suggests that scientists, research funders, innovators and others have a collective political responsibility (Grinbaum and Groves, 2013) or co-responsibility (Mitcham, 2003). This reflects understanding that while actors may not individually be irresponsible people, it is the often complex and coupled systems of science and innovation that create what Ulrich Beck (2000) calls 'organised irresponsibility'.³

We can point to 'second-order' (Illies and Meijers, 2009) or 'meta-task' responsibilities (van den Hoven, 1998; van den Hoven et al., 2012) of ensuring that responsible choices can be made in the future, through anticipating and gaining knowledge of possible consequences and building capacity to respond to them.

This reframing of responsibility and the approaches aimed at opening up scientific governance described above provide important foundations for responsible innovation. The phrase, sometimes lengthened to 'responsible research and innovation', is starting to appear in academic and policy literature (Guston, 2006; Hellström, 2003; von Schomberg, 2011a, 2011b; Lee, 2012; Sutcliffe, 2011; Owen and Goldberg, 2010; Owen et al., 2012; Randles et al., 2012), but it is still lacking conceptual weight. Around nanotechnology and other emerging areas of science and technology, Rip (2011) identifies a move from a discourse of responsible science to one of 'responsible governance'. US nanotechnology debates have tended to use the phrase 'responsible development' (Kjølberg, 2010). But the meaning of such terms remains contested. Rather than representing a clear novel governance paradigm, we might instead see responsible innovation as a location for making sense of the move from the governance of risk to the governance of innovation itself (Felt et al., 2007).

In the following sections we develop these concepts and associated literatures to articulate a framework for responsible innovation. This has been informed by a geoengineering research project in which we were involved. Finally, we offer some conclusions on how this framework might be taken forward, based in part on our experiences within this case study of technoscience-in-the-making.

³ von Schomberg (2013) suggests four categories of irresponsible innovation that typically manifest: Technology push, Neglect of ethical principles, Policy Pull and Lack of precaution and foresight.
2. Four dimensions of responsible innovation

von Schomberg (2011a) offers the following definition of Responsible Research and Innovation:

"A transparent, interactive process by which societal actors and innovators become mutually responsive to each other with a view to the (ethical) acceptability, sustainability and societal desirability of the innovation process and its marketable products (in order to allow a proper embedding of scientific and technological advances in our society)."

This definition is anchored to European policy processes and values. As we will discuss in the final section of this paper, our framework has similar elements but emerges from a different context. We offer a broader definition, based on the prospective notion of responsibility described above:

"Responsible innovation means taking care of the future through collective stewardship of science and innovation in the present."
The dimensions that make up our framework originate from a set of questions that have emerged as important within public debates about new areas of science and technology. These are questions that public groups typically ask of scientists, or would like to see scientists ask of themselves. Table 1 draws on Macnaghten and Chilvers' (forthcoming) analysis of cross-cutting public concerns across 17 UK public dialogues on science and technology and categorises these questions as to whether they relate to the products, processes or purposes of innovation. Conventional governance focuses on product questions, particularly those of technological risk, which can obscure areas of uncertainty and ignorance about both risks and benefits (Hoffmann-Riem and Wynne, 2002; Stirling, 2010). Tools of ethical governance and research integrity move into questions of process, especially when human volunteers and animals are involved in experimentation. Approaches to responsible innovation extend the governance discussion to encompass questions of uncertainty (in its multiple forms), purposes, motivations, social and political constitutions, trajectories and directions of innovation.

If we take these questions to represent aspects of societal concern and interest in research and innovation, responsible innovation can be seen as a way of embedding deliberation on these within the innovation process. The four dimensions of responsible innovation we propose (anticipation, reflexivity, inclusion and responsiveness) provide a framework for raising, discussing and responding to such questions. The dimensions are important characteristics of a more responsible vision of innovation, which can, in our experience, be heuristically helpful for governance. We will go on to describe one application of our framework at a project level, where the main actors were the project scientists, research funders, stakeholders and ourselves. However, the framework may be applicable at other levels, such as with the development of policy or thematic programmes (see Fisher and Rip, 2013). Each dimension demands particular explanation, but the lines between them are blurred. We therefore end this section by discussing the importance of integration. For each dimension, we explain the conceptual and policy background, give meaning to the term, describe some mechanisms and approaches that might articulate the dimension in practice and offer criteria and conditions for effective innovation governance.
2.1. Anticipation

The call for improved anticipation in governance comes from a variety of sources, from political and environmental concerns with the pace of social and technical change (e.g. Toffler, 1970), to scholarly (and latterly, policy) critiques of the limitations of top-down risk-based models of governance to encapsulate the social, ethical and political stakes associated with technoscientific advances (amongst others, see Wynne, 1992, 2002; RCEP, 1998; Jasanoff, 2003; Henwood and Pidgeon, 2013). The detrimental implications of new technologies are often unforeseen, and risk-based estimates of harm have commonly failed to provide early warnings of future effects (European Environment Agency, 2001, 2013; Hoffmann-Riem and Wynne, 2002). Anticipation prompts researchers and organisations to ask 'what if ...?' questions (Ravetz, 1997), to consider contingency, what is known, what is likely, what is plausible and what is possible. Anticipation involves systematic thinking aimed at increasing resilience, while revealing new opportunities for innovation and the shaping of agendas for socially-robust risk research.

The attempt to improve foresight in issues of science and innovation is a familiar theme in science and innovation policy (Martin, 2010). This is not to say there is a shortage of future-gazing. Indeed, there is a growing literature in STS concerned with scientists' and innovators' 'imaginaries' of the future (van Lente, 1993; Brown et al., 2000; Fortun, 2001; Brown and Michael, 2003; Hedgecoe and Martin, 2003; Fujimura, 2003; Borup et al., 2006; Selin, 2007). These expectations work not just to predict but also to shape desirable futures and organise resources towards them (te Kulve and Rip, 2011). Research in genomics and nanotechnology has, for example, been shown to carry highly optimistic promises of major social and industrial transformation, suggesting a need for what Fortun (2005) calls 'an ethics of promising' to instil some form of responsibility in disentangling present hype from future reality (Brown, 2003). Any process of anticipation therefore faces a tension between prediction, which tends to reify particular futures, and participation, which seeks to open them up.

Table 1
Lines of questioning on responsible innovation.

Product questions:
- How will the risks and benefits be distributed?
- What other impacts can we anticipate?
- How might these change in the future?
- What don't we know about?
- What might we never know about?

Process questions:
- How should standards be drawn up and applied?
- How should risks and benefits be defined and measured?
- Who is in control?
- Who is taking part?
- Who will take responsibility if things go wrong?
- How do we know we are right?

Purpose questions:
- Why are researchers doing it?
- Are these motivations transparent and in the public interest?
- Who will benefit?
- What are they going to gain?
- What are the alternatives?
Upstream public engagement (Wilsdon and Willis, 2004) and Constructive Technology Assessment (Rip et al., 1995) are two techniques that involve anticipatory discussions of possible and desirable futures. Guston and Sarewitz's (2002) 'Real-Time Technology Assessment' is another model of what they call 'anticipatory governance' (see also Barben et al., 2008; Karinen and Guston, 2010). Anticipation is here distinguished from prediction in its explicit recognition of the complexities and uncertainties of science and society's co-evolution (Barben et al., 2008). Methods of foresight, technology assessment, horizon scanning or scenario planning can be important techniques, although used narrowly they risk exacerbating technological determinism. Scenarios (Selin, 2011; Robinson, 2009) and vision assessment (Grin and Grunwald, 2000) have been used in various settings. Some scholars (e.g. Miller and Bennett, 2008) have also suggested that socio-literary techniques drawing on science fiction may be powerful ways to democratise thinking about the future.

Much of the academic literature here makes the point that successful anticipation also requires understanding of the dynamics of promising that shape technological futures (Borup et al., 2006; Selin, 2011; van Lente and Rip, 1998). Anticipatory processes need to be well-timed so that they are early enough to be constructive but late enough to be meaningful (Rogers-Hayden and Pidgeon, 2007). The plausibility of scenarios is an important factor in their success (Selin, 2011; von Schomberg, 2011c) and we should not underestimate the work involved in building robust tools for anticipation (Robinson, 2009). We must also recognise institutional and cultural resistance to anticipation. As Guston (2012) points out, a lack of anticipation may not just be a product of reductionism and disciplinary siloes. It may, at least in part, be intentional, as scientists seek to defend their autonomy (Guston, 2012).
2.2. Reflexivity

Responsibility demands reflexivity on the part of actors and institutions, but this is not straightforwardly defined. Lynch (2000) unpacks the word 'reflexivity' to reveal its multiple meanings and modes of engagement with social worlds. Social theorists (Beck, 1992; Beck et al., 1994) have argued that reflexivity is a condition of contemporary modernity. Scientists' own version of reflexivity often echoes Popper's (1963) argument that self-referential critique is an organising principle of science (Lynch, 2000). We would argue, following Wynne (1993), that there is a demonstrated need for institutional reflexivity in governance. Reflexivity, at the level of institutional practice, means holding a mirror up to one's own activities, commitments and assumptions, being aware of the limits of knowledge and being mindful that a particular framing of an issue may not be universally held. This is second-order reflexivity (Schuurbiers, 2011), in which the value systems and theories that shape science, innovation and their governance are themselves scrutinised. Unlike the private, professional self-critique that scientists are used to, responsibility makes reflexivity a public matter (Wynne, 2011). Mechanisms such as codes of conduct, moratoriums and the adoption of standards may build this second-order reflexivity by drawing connections between external value systems and scientific practice (Busch, 2011; von Schomberg, 2013).

Recent attempts to build reflexivity have tended to focus at the laboratory level, often with the participation of social scientists or philosophers. The argument is that in the bottom-up, self-governing world of science, laboratory reflexivity becomes a vital lever for opening up alternatives through enhancing the "reflections of natural scientists on the socio-ethical context of their work" (Schuurbiers, 2011, p. 769; also see Schuurbiers and Fisher, 2009). Approaches such as 'midstream modulation' (Fisher et al., 2006; Fisher, 2007) and 'ethical technology assessment' (Swierstra et al., 2009) give familiar ethnographic STS laboratory studies an interventionist turn (see Doubleday, 2007 for a similar approach). Rosalyn Berne's (2006) account of her interviews with nanoscientists suggests a similar intention. The conversation becomes a tool for building reflexivity.

Wynne (2011) concludes that, while this work has been demonstrably successful in beginning to build reflexivity at the laboratory level, such concepts and practices need to be extended to include research funders, regulators and the other institutions that comprise the patchwork of science governance (a conclusion that has also surfaced from public dialogues in areas of synthetic biology and beyond (e.g. TNS-BRMB, 2010)). These institutions have a responsibility not only to reflect on their own value systems, but also to help build the reflexive capacity within the practice of science and innovation.

Building actors' and institutions' reflexivity means rethinking prevailing conceptions about the moral division of labour within science and innovation (Swierstra and Rip, 2007). Reflexivity directly challenges assumptions of scientific amorality and agnosticism. Reflexivity asks scientists, in public, to blur the boundary between their role responsibilities and wider, moral responsibilities. It therefore demands openness and leadership within cultures of science and innovation.
2.3.
Inclusion
The
waning
of
the
authority
of
expert,
top-down
policy-making
has
been
associated
with
a
rise
in
the
inclusion
of
new
voices
in
the
governance
of
science
and
innovation
as
part
of
a
search
for
legiti-
macy
(Irwin,
2006;
Felt
et
al.,
2007;
Hajer,
2009).
Over
the
last
two
decades,
particularly
in
Northern
Europe,
new
deliberative
forums
on
issues
involving
science
and
innovation
have
been
established,
moving
beyond
engagement
with
stakeholders
to
include
mem-
bers
of
the
wider
public
(e.g.
RCEP,
1998;
Grove-White
et
al.,
1997;
Wilsdon
and
Willis,
2004;
Stirling,
2006;
Macnaghten
and
Chilvers,
forthcoming).
These
small-group
processes
of
public
dialogue,
usefully
described
as
‘mini-publics’
by
Goodin
and
Dryzek
(2006),
include
consensus
conferences,
citizens’
juries,
deliberative
mapping,
deliberative
polling
and
focus
groups
(see
Chilvers,
2010).
Often
under
the
aegis
of
quasi-governmental
institutions
such
as
Sciencewise-ERC
in
the
UK
or
the
Danish
Board
of
Technology,
these
can,
according
to
the
UK
government,
“enable
[public]
debate
to
take
place
‘upstream’
in
the
scientific
and
technological
process”
(HM
Treasury/DTI/DfES,
2004,
p.
105;
see
also
Royal
Society/Royal
Academy
of
Engineering,
2004).
Additionally,
we
can
point
to
the
use
of
multi-stakeholder
partnerships,
forums,
the
inclusion
of
lay
members
on
scientific
advisory
committees,
and
other
hybrid
mechanisms
that
attempt
to
diversify
the
inputs
to
and
delivery
of
governance
(Callon
et
al.,
2009;
Bäckstrand,
2006;
Brown,
2002).
The
practice
of
these
exercises
in
inclusive
governance
and
their
impact
on
policymaking
has
been
uneven,
and
has
attracted
1572 J.
Stilgoe
et
al.
/
Research
Policy
42 (2013) 1568–
1580
substantial
critique
(among
others,
see
Horlick-Jones
et
al.,
2007;
Kerr
et
al.,
2007;
Rothstein,
2007).
Public
engagement
practition-
ers
can
be
accused
of
following
an
emerging
orthodoxy,
with
an
assumed
reasoning
that
“the
technical
is
political,
the
political
should
be
democratic
and
the
democratic
should
be
participatory”
(Moore,
2010,
p.
793).
In
response,
STS
scholarship
has
begun
to
problematise
public
dialogue
as
a
public
good
in
itself
(see
Chilvers,
2009).
The
proliferation
of
participatory
approaches
activities
has
led
to
arguments
for
greater
clarity
about
the
meth-
ods
of
participation,
the
purposes
for
which
they
are
used
and
the
criteria
against
which
they
might
be
evaluated
(Rowe
and
Frewer,
2000,
2005).
In
addition,
a
growing
body
of
critique
has
developed,
drawing
attention
to,
among
other
things:
framing
effects
within
dialogue
processes
which
can
reinforce
existing
relations
of
professional
power
and
deficit
understandings
of
the
public
(Wynne,
2006;
Kerr
et
al.,
2007),
thus
constituting,
at
times,
a
new
“tyranny”
with
questionable
benefits
(Cooke
and
Kothari,
2001);
the
ways
in
which
engagement
processes
construct
particular
kinds
of
publics
that
respond
to
contingent
political
imaginaries
(Lezaun
and
Soneryd,
2007;
Macnaghten
and
Guivant,
2011;
Michael
and
Brown,
2005);
and
the
diverse,
occasionally
competing
motivations
that
underpin
dialogue
(see
Fiorino,
1989;
Stirling,
2008;
Macnaghten
and
Chilvers,
forthcoming).
Irwin
and
colleagues
suggest,
however,
that
“the
(often
implicit)
evocation
of
the
highest
principles
that
engagement
might
ideally
fulfil
can
make
it
difficult
to
acknowledge
and
pay
serious
attention
to
the
varieties
of
engagement
that
are
very
much
less
than
perfect
but
still
somehow
‘good”’
(Irwin
et
al.,
2013,
p.
120).
The importance of public dialogue in "opening up" (Stirling, 2008) framings of issues that challenge entrenched assumptions and commitments has been emphasised (Lövbrand et al., 2011). And while there has been resistance to attempts to proceduralise public dialogue, for fear that it becomes another means of closure (Wynne, 2005; Stirling, 2008) or technocracy (Rose, 1999; Lezaun and Soneryd, 2007), there have been efforts to develop criteria aimed at assessing the quality of dialogue as a learning exercise. On the latter, Callon et al. (2009, p. 160) offer three criteria: intensity (how early members of the public are consulted and how much care is given to the composition of the discussion group); openness (how diverse the group is and who is represented); and quality (the gravity and continuity of the discussion).
In relation to what actually is at stake in the advance of new science and technology, Grove-White et al. (2000) argue that public dialogue needs to open up discussion of future social worlds (building on the dimension of anticipation) in ways that critically interrogate the 'social constitutions' inherent in technological options: that is, the distinctive set of social, political and ethical implications that their development would likely bring into being (see Macnaghten, 2010 for an articulation of this approach with respect to nanotechnology, and Macnaghten and Szerszynski, 2013 on geoengineering).
Processes of inclusion inevitably force consideration of questions of power. Agencies commissioning such exercises, facilitators and public participants may all have different expectations of the instrumental, substantive or normative benefits of dialogue (Stirling, 2008). There should therefore be room for public and stakeholder voices to question the framing assumptions not just of particular policy issues (Grove-White et al., 1997; Jasanoff, 2003), but also of participation processes themselves (van Oudheusden, 2011).
Observed bottom-up changes within innovation processes may engender greater inclusion. User-driven (von Hippel, 1976, 2005), open (Chesbrough, 2003), open source (Raymond, 1999), participatory (Buur and Matthews, 2008) and networked innovation (Powell et al., 1996) all suggest the possibility of including new voices in discussions of the ends as well as the means of innovation, although it remains to be seen, first, whether these trends are as widespread and disruptive as their proponents claim and, second, whether they in reality resemble outsourcing rather than genuine forms of 'collective experimentation' (Callon et al., 2009, p. 18).
It is far from clear whether current or past attempts at public engagement, taken together, can be said to constitute a new governance paradigm. Rather, they might be regarded as a process of 'ongoing experimentation' (Lövbrand et al., 2011, p. 487), a symptom of changes in governance rather than a centrepiece, mixing old and new governance assumptions (Irwin, 2006). Such processes might therefore be considered legitimate if their ambitions are modest and if the STS scholars who advocate dialogue are willing "to put their own normative commitments through the test of deliberation" (Lövbrand et al., 2011, p. 489). Attention has also been drawn to the "institutional preconditions for deliberation" (Lövbrand et al., 2011, p. 491). Dryzek (2011) argues that deliberative processes are only part of the 'deliberative systems' that are required to confer legitimacy (see also Goodin and Dryzek, 2006).
2.4. Responsiveness

There exists a range of processes through which questions of responsible innovation can be asked (see Table 2). Some of these processes focus questioning on the three dimensions of responsible innovation above. A few approaches, such as Constructive Technology Assessment (Rip et al., 1995), Real-Time Technology Assessment (Guston and Sarewitz, 2002), midstream modulation (Fisher et al., 2006) and anticipatory governance (Barben et al., 2008), seek to interrogate multiple dimensions. However, for responsible innovation to have purchase, it must also seek to respond to such questions. Responsible innovation requires a capacity to change shape or direction in response to stakeholder and public values and changing circumstances. The limited capacity for empowering social agency in technological choice and the modulation of innovation trajectories has been a significant criticism of the impact of public engagement (e.g. Stirling, 2008; Macnaghten and Chilvers, forthcoming). We must therefore consider how systems of innovation can be shaped so that they are as responsive as possible.
Pellizzoni describes responsiveness as "an encompassing yet substantially neglected dimension of responsibility" (Pellizzoni, 2004, p. 557). Drawing an explicit link to inclusion, he suggests that responsiveness is about adjusting courses of action while recognising the insufficiency of knowledge and control (with echoes of Collingridge's aspiration of 'corrigibility' (Collingridge, 1980)). Its two aspects relate to the two meanings of the word respond: to react and to answer (Pellizzoni, 2004). Responsiveness involves responding to new knowledge as this emerges, and to emerging perspectives, views and norms.
For responsible innovation to be responsive, it must be situated in a political economy of science governance that considers both products and purposes. In the UK, Europe and perhaps more broadly, we can point to growing policy interest in 'grand challenges' (Lund Declaration, 2009). von Schomberg (2013) contends that the central challenge of responsible innovation is to become more responsive to societal challenges. But such challenges are not preordained, nor are they uncontested.
There are various mechanisms that might allow innovation to respond to improved anticipation, reflexivity and inclusion. In some cases, application of the precautionary principle, a moratorium or a code of conduct may be appropriate. Existing approaches to technology assessment and foresight may be widened to engender improved responsiveness (von Schomberg, 2013). Value-sensitive design (Friedman, 1996) suggests the possibility of designing particular ethical values into technology. As we describe in the next section's case study, techniques such as stage-gating can also create new, responsive governance choices.
J. Stilgoe et al. / Research Policy 42 (2013) 1568–1580

Table 2
Four dimensions of responsible innovation.

Anticipation
  Indicative techniques and approaches: Foresight; Technology assessment; Horizon scanning; Scenarios; Vision assessment; Socio-literary techniques.
  Factors affecting implementation: Engaging with existing imaginaries; Participation rather than prediction; Plausibility; Investment in scenario-building; Scientific autonomy and reluctance to anticipate.

Reflexivity
  Indicative techniques and approaches: Multidisciplinary collaboration and training; Embedded social scientists and ethicists in laboratories; Ethical technology assessment; Codes of conduct; Moratoriums.
  Factors affecting implementation: Rethinking moral division of labour; Enlarging or redefining role responsibilities; Reflexive capacity among scientists and within institutions; Connections made between research practice and governance.

Inclusion
  Indicative techniques and approaches: Consensus conferences; Citizens' juries and panels; Focus groups; Science shops; Deliberative mapping; Deliberative polling; Lay membership of expert bodies; User-centred design; Open innovation.
  Factors affecting implementation: Questionable legitimacy of deliberative exercises; Need for clarity about purposes of and motivation for dialogue; Deliberation on framing assumptions; Ability to consider power imbalances; Ability to interrogate the social and ethical stakes associated with new science and technology; Quality of dialogue as a learning exercise.

Responsiveness
  Indicative techniques and approaches: Constitution of grand challenges and thematic research programmes; Regulation; Standards; Open access and other mechanisms of transparency; Niche management (a); Value-sensitive design; Moratoriums; Stage-gates (b); Alternative intellectual property regimes.
  Factors affecting implementation: Strategic policies and technology 'roadmaps'; Science-policy culture; Institutional structure; Prevailing policy discourses; Institutional cultures; Institutional leadership; Openness and transparency; Intellectual property regimes; Technological standards.

(a) Schot and Geels (2008).
(b) See below and Macnaghten and Owen (2011) for an example of this.
Diversity is an important feature of productive, resilient, adaptable and therefore responsive innovation systems (Stirling, 2007). Responsible innovation should not just welcome diversity; it should nurture it. This may require active policies of, for example, niche management (Schot and Geels, 2008). It certainly demands explicit scrutiny of the tensions and governance mechanisms within processes of research funding, intellectual property regimes and technological standards, which often act to close down innovation in particular ways, and of other norms, pressures and expectations that reinforce particular path dependencies and lock-ins. These will differ across countries, disciplines and contexts, but this 'de facto governance' (Kearnes and Rip, 2009) is likely to follow what Pellizzoni (2004) calls 'a logic of unresponsiveness' (p. 558) in which, if responsibility is considered in any depth, retrospective accountability takes precedence.
Empirical research with governance actors in the UK, scrutinising their receptivity to substantive public concerns about science governance, suggests some important mediating factors that are likely to improve institutional responsiveness. These include: a deliberative science policy culture, emphasising reflexive learning and responsiveness; an open organisational culture, emphasising innovation, creativity, interdisciplinarity, experimentation and risk taking; top-level leadership and commitment to public engagement and to taking account of the public interest; and commitments to openness and transparency (Macnaghten and Chilvers, forthcoming). Responsiveness is therefore linked to reflexive capacity.
We can see the societal embedding of technologies as requiring a process of alignment (te Kulve and Rip, 2011; Fujimura, 1987) (or 'enrollment', following Latour (1987)). Actors and interests are arranged such that they are dependent on one another, so stabilising a particular sociotechnical system. The project of responsible governance requires understanding this 'alignment work' (te Kulve and Rip, 2011). The midstream modulation approach described in Fisher et al. (2006) differentiates between three levels of decision-making: de facto, reflexive and deliberate, with the aim of iterating governance through these levels to make assumptions more explicit and decisions more deliberate.⁴
Making innovation more responsive also requires attention to metagovernance: the values, norms and principles that shape or underpin policy action (Kooiman and Jentoft, 2009). STS approaches have highlighted how policy discourses shape the governance of emerging technologies (Hilgartner, 2009; Lave et al., 2010). These governance principles may be explicit, as in the case of the Bayh–Dole act in the US; implicit, as with the growing policy focus on the relevance and 'impact' of research (Hessels et al., 2009); or nascent, as with the 'grand challenges' approach (Kearnes and Wienroth, 2011; Lund Declaration, 2009). At an overarching level, the insistence, particularly in European policy, that Research and Development should be increased in order to spur economic growth, with no question of which research or what developments, provides, it has been argued, a powerful policy discourse that limits responsiveness (von Schomberg, 2013; Felt et al., 2007).
2.5. Integrating the dimensions of responsible innovation

Moving beyond the range of processes described above that seek to advance single or multiple dimensions, responsible innovation demands their integration and embedding in governance. The dimensions therefore do not float freely but must connect as an integrated whole. It is necessary to draw connections both between the dimensions and with the contexts of governance in which they sit.

⁴ Wynne (2003) has an analogous critique of 'decisionism': the discursive reduction of governance debates to explicit decision points.
The dimensions may in practice be mutually reinforcing. For example, increased reflexivity may lead to greater inclusion, or vice versa. But, as illustrated in the case study in Section 3, these dimensions may also be in tension with one another and may generate new conflicts.⁵ Anticipation can encourage wider participation but, as Guston (2012) argues, it may be resisted by scientists seeking to protect their autonomy, or by prior commitments to particular trajectories (see also te Kulve and Rip, 2011). The surfacing and subsequent negotiation of such tensions is central to making responsible innovation responsive.
For this reason, institutional commitment to a framework that integrates all four dimensions (with no a priori instrumental conditioning) becomes vital, rather than relying on piecemeal processes that highlight particular dimensions and not others. Public dialogue, bioethics, research integrity, codes of conduct, risk management and other mechanisms may target parts of the governance of science, but they do not offer an overarching, coherent and legitimate governance approach unless we consider how they are aligned with one another.
Approaches that build on Constructive Technology Assessment (Rip et al., 1995) have recognised this need for integration at multiple levels of governance. For example, the US project of Socio-Technical Integration Research (STIR) taking place around nanotechnology seeks to explore "what counts as responsible innovation at the macro-level of public policy, the micro-level of laboratory research, and the meso-level of institutional structures and practices that connect them" (Fisher and Rip, 2013).
The integration of the dimensions described above provides a general framework, but attention to the responsiveness dimension in particular demands that such a framework be embedded in particular institutional contexts and adjusted to take account of their idiosyncrasies. In the following section, we describe the application and further development of our framework within a UK Research Council and a particular research project as a case study of this.
3. Responsible innovation in action: a case study of 'technoscience in the making'

While we were working with the Research Councils to develop the framework described above, we were presented with an opportunity to work alongside a particular science and engineering project. This case study allowed us to embed and deepen our thinking. In this section we first describe the particulars of the project, its broader socio-political context, and the de facto governance arrangements in place. We then describe how our emerging responsible innovation framework was applied within this context. Finally we critically reflect on this case study: on the framework's dimensions, its implementation, impact and legitimacy, including, importantly, whether it offered a means to genuinely empower social agency in technological decision-making (Stirling, 2008).
The case was the Stratospheric Particle Injection for Climate Engineering (SPICE) project, funded by three UK research councils (the Engineering and Physical Sciences Research Council, the Natural Environment Research Council and the Science and Technology Facilities Council). The aim of this project was to investigate whether the purposeful injection of large quantities of particles into the stratosphere could mimic the cooling effects of volcanic eruptions and provide a possible means to mitigate global warming (SPICE, 2010). The SPICE project was funded to answer three broad questions. First, what quantity of which type(s) of particle would need to be injected into the atmosphere (and where) to effectively manage the climate system? Second, how might we deliver it there? Third, what are the likely impacts associated with deployment?

⁵ We are indebted to an anonymous reviewer for inviting us to explore in more detail the tensions within and between the dimensions we describe.
In response to the second question, a test was proposed of a scaled-down delivery system: a 1-km high hose attached to a tethered balloon. Although the testbed would not be a geoengineering test per se (the trial would spray only a small amount of water), it nevertheless constituted the UK's first field trial of a technology with geoengineering potential (Macnaghten and Owen, 2011), and was as such deeply symbolic, even though this symbolism was not initially apparent to many of those involved.
3.1. Socio-political context for the case study

Geoengineering has been defined as the "deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change" (Royal Society, 2009, p. 1). Within the space of a few years, geoengineering has become a powerful policy discourse, offering a new class of response to anthropogenic climate change, alongside mitigation and adaptation (American Meteorological Society, 2009; Royal Society, 2009; Bipartisan Policy Centre Task Force, 2011).
Those geoengineering approaches classed as solar radiation management, which are intended to reduce the amount of sunlight reaching the Earth's surface, have received particular attention because initial estimates suggest that they could be both effective and relatively cheap compared to the cost of implementing greenhouse gas mitigation policies (Boyd, 2008; Caldeira and Keith, 2010; SRMGI, 2011).
Although unformed and uncertain, solar radiation management introduces a range of significant social, political and ethical questions. These include: whether international agreement and buy-in for such a planetary-wide technology is plausible; whether research into or deployment of solar radiation management geoengineering will create a moral hazard, diverting political attention away from climate mitigation efforts; whether the impacts of solar radiation management can be fully understood before deployment; whether solar radiation management can be accommodated within democratic institutions; and whether the technology would be used for other purposes, opening up the potential for new geopolitical conflicts (for various accounts of the social and ethical dimensions of solar radiation management, see Corner et al., 2011; Hamilton, 2013; Ipsos-MORI, 2010; Owen, 2011; Macnaghten and Szerszynski, 2013). Such questions have informed governance initiatives aimed at the responsible conduct of geoengineering (see the 'Oxford Principles', Rayner et al., 2013) and of solar radiation management research (see the 'Solar Radiation Management Governance Initiative', SRMGI, 2011).
3.2. The SPICE project: history

Following the publication of the Royal Society's geoengineering report in July 2009, and in response to a specific recommendation that UK Research Councils co-fund "a 10 year geoengineering research programme at the level of the order of £10 M per annum" (Royal Society, 2009, p. xii), the Research Councils convened a scoping workshop in October 2009 aimed at informing a programme of geoengineering research. The aim was to "fund research which will allow the UK to make informed and intelligent assessments about the development of climate geoengineering technologies" (EPSRC/NERC/LWEC, 2009, p. 1). A number of themes relating to governance, ethics, public acceptability and public engagement were discussed at the scoping workshop, but these were not considered a top priority.
The subsequent 'sandpit', conducted 15–19 March 2010, was set up by the Research Councils with the aim of "bringing together researchers from numerous backgrounds and to encourage and drive innovative thinking and radical approaches to addressing research challenges in this area" (EPSRC/NERC/LWEC, 2010, p. 2), although involvement of social scientists was limited. The SPICE project was one of two projects funded from this sandpit and did not include ethics or social science competency.

Table 3
Overview of stage-gate criteria and panel recommendations.

Criterion 1: Risks identified, managed and deemed acceptable.
  Relevant RI dimensions: Reflexivity. Panel recommendation: Pass. Comment from the Research Councils (abridged): No further information required.

Criterion 2: Compliant with relevant regulations.
  Relevant RI dimensions: Reflexivity. Panel recommendation: Pass. Comment: No further information required.

Criterion 3: Clear communication of the nature and purpose of the project.
  Relevant RI dimensions: Reflexivity, inclusion. Panel recommendation: Pass pending. Comment: Additional work is required: (1) a communications strategy informed by stakeholder engagement; (2) a commitment to two-way communication; and (3) a 'sticky questions' briefing.

Criterion 4: Applications and impacts described and mechanisms put in place to review these.
  Relevant RI dimensions: Anticipation, reflexivity. Panel recommendation: Pass pending. Comment: Additional work is required: (1) more information on the envisaged milestones and associated questions that will need to be addressed before deployment of the testbed; (2) a literature review of risks, uncertainties and opportunities of solar radiation management, including social and ethical dimensions.

Criterion 5: Mechanisms identified to understand public and stakeholder views.
  Relevant RI dimensions: Inclusion, reflexivity. Panel recommendation: Pass pending. Comment: Additional work is required: (1) a stakeholder mapping exercise; (2) engagement with stakeholders; and (3) ensuring that key stakeholders are aware of the testbed.
Aware of at least some of the wider ethical and socio-political dimensions of solar radiation management (a point stressed at the beginning of the sandpit by a presenter from an environmental Non-Governmental Organisation (NGO)), the UK Research Councils were sensitive to the potential for the SPICE project to be the subject of external scrutiny, particularly given that its proposed testbed moved beyond laboratory tests or simulations and thus could be defined as a "small field trial" (see SRMGI, 2011, p. 26).
Perhaps unsurprisingly, the SPICE project passed through the ethics procedures at the universities concerned with little or no comment: the research did not involve human volunteers or animals, and was unlikely to have a direct effect on the environment. Nevertheless, given the evident sensitivities involved, the Research Councils decided upon a 'stage-gate' review process, which we used to incorporate our own emerging ideas of responsible innovation.
3.3. Embedding the dimensions of responsible innovation within SPICE

Stage-gating is a well-established mechanism for developing new products (Cooper, 1990) by splitting R&D into discrete stages. Decision gates use certain criteria for progression through the stages. Conventionally, the inputs to the decision gate have been based on technical considerations and market potential. In the case of the SPICE project, the stage-gate was constructed to include a set of responsible innovation criteria (see Table 3), based on the dimensions we have described in Section 2. The decision gate involved an independent panel evaluating the SPICE team's response to the criteria and recommending to the Research Councils whether the testbed should proceed and, if so, under what conditions.
Two authors of this paper were involved in the development and implementation of this governance approach: Owen was the architect; Macnaghten was chair of the independent stage-gate panel. Other members of the stage-gate panel included a social scientist, a representative of a civil society organisation, an atmospheric scientist and an aerospace engineer.
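The decision logic of such a stage gate, in which discrete criteria are each mapped to responsible innovation dimensions and given a pass or pass-pending recommendation, can be sketched informally as follows. This is an illustrative formalisation only, not part of the SPICE project: the criterion wording is abridged from Table 3, while the data structures and function names are hypothetical.

```python
# Hypothetical sketch of stage-gate decision logic (illustrative only).
# Each criterion carries the responsible innovation dimensions it draws on
# and the panel's recommendation; a clean pass on all criteria is needed
# for the testbed to proceed, otherwise outstanding work is collected.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    description: str
    dimensions: tuple                 # e.g. ("reflexivity", "inclusion")
    recommendation: str               # "pass" or "pass pending"
    additional_work: list = field(default_factory=list)

def gate_decision(criteria):
    """Return ("proceed", []) only if every criterion is a clean pass;
    otherwise ("defer", list of requested additional work)."""
    pending = [c for c in criteria if c.recommendation != "pass"]
    if not pending:
        return "proceed", []
    work = [w for c in pending for w in c.additional_work]
    return "defer", work

# The five SPICE criteria, abridged from Table 3:
criteria = [
    Criterion("Risks identified, managed and deemed acceptable",
              ("reflexivity",), "pass"),
    Criterion("Compliant with relevant regulations",
              ("reflexivity",), "pass"),
    Criterion("Clear communication of nature and purpose",
              ("reflexivity", "inclusion"), "pass pending",
              ["communications strategy informed by stakeholder engagement"]),
    Criterion("Applications and impacts described and reviewed",
              ("anticipation", "reflexivity"), "pass pending",
              ["review of risks and uncertainties of solar radiation management"]),
    Criterion("Mechanisms to understand public and stakeholder views",
              ("inclusion", "reflexivity"), "pass pending",
              ["stakeholder mapping and engagement"]),
]

decision, outstanding_work = gate_decision(criteria)
# With three criteria "pass pending", the decision is to defer.
```

On these inputs the sketch reproduces the June 2011 outcome described below: Criteria 1 and 2 pass cleanly, while Criteria 3–5 generate a list of additional work that must be completed before the testbed can proceed.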
Five criteria were developed for the stage gate. Criteria 1 and 2 related to the issues and potential impacts directly associated with the testbed itself: that the research was conducted in ways that were assessed to be safe and compliant with existing legislation. These were not particularly related to the prospective notion of responsibility as developed in this paper and were responded to comprehensively. Criteria 3–5 were, however, concerned with wider issues and potential impacts, associated with how the research was framed and with issues relating to future deployment.
Criterion 3 concerned framing, communication and dialogue. It asked SPICE researchers to develop a communications approach informed by dialogue with diverse stakeholders, acknowledging areas of uncertainty and ignorance. It built on the dimensions of inclusion (a commitment to base communications on genuine dialogue with stakeholders) and reflexivity (a commitment for communications to demonstrate reflection on SPICE's own embedded assumptions, commitments and framings). Criterion 4 required SPICE researchers to assess future applications and impacts, broadening their visions of application and impact, drawing on the dimensions of anticipation and reflection. It requested a review of the risks and uncertainties of solar radiation management, as well as reflection on the questions (social, ethical and technical) that would have to be considered between the testbed and eventual deployment of a working full-size system. Criterion 5 incorporated the dimensions of inclusion and reflexivity, asking researchers to engage directly with stakeholders and wider publics and to reflect upon their own tacit understandings, assumptions, uncertainties, framings and commitments. These criteria are described in Table 3, with reference to the dimensions of responsible innovation, along with the panel recommendations and Research Council responses.
The stage-gate panel reviewed responses by the SPICE team to the five criteria in June 2011. Criteria 1 and 2 were passed, but more work was requested to meet Criteria 3–5. The stage-gate itself was a process of responsiveness. While the panel assessment was independent, the criteria were discussed in advance of the stage gate between EPSRC officials and the SPICE team, and some support was provided to enable the team to identify what inputs they should consider in order to respond (for example, on public engagement, see Pidgeon et al., 2013).
EPSRC created the institutional conditions for this new governance mechanism and were willing, with leadership from senior staff, to interrogate their own institutional responsibilities. There was a visible degree of the institutional reflexivity demanded by Wynne (1993).
Nevertheless, the stage-gate review had to fit within a wider governance landscape, partly de jure, partly de facto. The stage-gate review was introduced after the project had been funded, with little scope for deliberation on the motivations for the research or on whether the research should have been funded at all. There were conventional appeals to scientific autonomy and to the authority of the principal investigator. There were over-riding assumptions that Research Council decision-making should be science-led, in the service of national competitiveness.
Outside the Research Councils, policy bodies such as the European Parliament and the United Nations Convention on Biological Diversity were urging caution on field tests of geoengineering. In addition, the 'Oxford Principles' (Rayner et al., 2013) on geoengineering research, including specific recommendations for early public participation in decision-making, had recently been endorsed by the UK House of Commons Science and Technology Committee.
The stage-gate became a forum for the Research Councils' negotiation of internal and external demands, blending a substantive motivation towards greater responsibility with an instrumental imperative to protect reputations and relationships.
In September 2011, following the advice of the stage-gate panel, EPSRC postponed the testbed to allow the team to undertake the additional work requested. At the same time a vocal debate was taking place in the media. Following an earlier announcement by the SPICE team that the testbed would go ahead imminently (SPICE had decided to continue to prepare for the testbed experiment alongside the extra work requested by the stage-gate review panel), EPSRC received a letter in September 2011, copied to the then UK Secretary of State for Energy and Climate Change and signed by more than 50 NGOs, demanding that the project be cancelled. The NGOs saw the testbed as symbolic, sending the wrong signal to the international community and deflecting political and scientific attention from the need to curb greenhouse gas emissions (HOME, 2011).
There is a risk with any new governance mechanism that it gives the illusion of control. The stage-gate's ambitions were modest. The process was a useful 'hybrid forum' (Callon et al., 2009) within which to open up a complex governance discussion, surfacing tensions, framings, tacit assumptions, areas of contestation and, importantly, commitments. As such, it prompted a discussion of some particular issues that would turn out to be vital, even if they were not made explicit in the criteria and were not predicted at the start of the process.
While responding to Criterion 3, by drawing up a 'sticky questions' briefing, the SPICE project leader was made aware of the existence of a prior patent application (Davidson et al., 2011) on the concept of a tethered-balloon stratospheric particle delivery mechanism. This had been submitted by one of the mentors at the 'sandpit', prior to that meeting. The patent application included one of the SPICE project investigators as a co-author. Although there was no evidence that Research Council rules, such as those on vested interests, had been broken, given the sensitivities of the project an independent external review was commissioned by EPSRC to investigate the sandpit and funding process.
Later, in May 2012, after discussions between the Research Councils, the SPICE project leader and one of this paper's authors (Owen), the SPICE team decided to cancel the testbed, citing the lack of rules governing geoengineering research and the fact that the patent application represented "a potentially significant conflict of interest" (Cressey, 2012, p. 429).
3.4. Reflections on the SPICE project

In the case of the SPICE project, the responsible innovation approach introduced reflection, anticipation, inclusive deliberation and responsiveness, materially influencing the direction of a contentious, charged and highly uncertain area of emerging technoscience.
Specifically, the framework helped the research scientists, the stage-gate panel and Research Council officials to anticipate previously unexplored impacts, applications and issues. Those involved were asked to reflect on SPICE's embedded commitments, assumptions, promissory statements, uncertainties and areas of ignorance. As the project developed, there was evidence of a more reflexive and deliberative research culture within SPICE and the Research Councils, not least through on-going dialogue on the project from the EPSRC's advisory Societal Issues Panel (of which Owen and Macnaghten were members), set up to help EPSRC Council to take account of public opinion.
The SPICE principal investigator's blog, 'The reluctant geoengineer' (Watson, 2011), reveals an emerging appreciation of the social and ethical dilemmas associated with the project and a growing reflexivity in relation to his own responsibilities.
In terms of inclusion, the SPICE stage gate was informed, first, by a public dialogue exercise, with results that suggested at best a highly qualified public support for the project (Parkhill and Pidgeon, 2011; Pidgeon et al., 2013) and, second, by a programme of stakeholder engagement (Stilgoe et al., submitted for publication). Further reflection and deliberation between the SPICE team, the Research Councils and others were important in the SPICE team's decision not to proceed with the testbed, but it is important to note that this decision was one made by the SPICE team itself.
The proposed SPICE testbed originally attracted the attention of NGOs because of its potential to set a precedent for governance. The subsequent debate and change of direction, what some have called 'the SPICE experience' (Nature, 2012; Olson, 2012), have had a discernible impact on geoengineering research and governance discussions for the same reason. Although it is unclear how these discussions will continue, the precedent set by the SPICE project and its funders in at least acknowledging wider complexities and responsibilities looks set to remain a talking point.
This case study of responsible innovation in action has also highlighted some important limitations and constraints. It became apparent that the framework should have been in place earlier, before the project's conception, and articulated more clearly. The responsible innovation framework had been separately funded and was then embedded into the SPICE project once the latter was underway. The framework had no influence on the constitution of the project within the sandpit, the framing of the sandpit itself or the scoping workshop that informed this. It was therefore open to instrumental conditioning.
Nevertheless, it opened up the SPICE project and its wider socio-political context to broader reflection and deliberation, providing a hybrid forum to support decisions by the funders and scientists. It also served an important function as a location for a wider ethical discussion concerning solar radiation management research through a tangible example. The case highlights the potential for a framework to inform decision-making in a field with limited governance, even if this was restricted by the nature and timing of its intervention.
4. Discussion
The framework for responsible innovation that we have described starts with a prospective model of responsibility, works through four dimensions, couples anticipation, reflection and deliberation to agency and action, and makes explicit the need to connect with cultures and practices of governance. For this reason, the case study that we set out above was an important site for the framework's development.
In using the framework, actors and institutions were challenged to go beyond compliance with established regulation, in ways that challenged conventional role and institutional responsibilities. The eventual outcome of the case study, in which the testbed was postponed and subsequently cancelled, was unexpected, but this was an important feature of the framework. The outcome was a product of the reflexive process itself.
The framework sought not to instrumentally legitimise any particular framing or commitment. Instead it served to guide, prompt and open up space for essential governance discussions aimed at supporting, but not dictating, decisions about the framing, direction, pace and trajectory of contentious and innovative research. Although the case study was limited in scope, the adoption of a responsible innovation approach prompted unconventional and, as it turned out, important governance discussions.
We should not make assumptions about the applicability or validity of our framework across all issues or at all levels of decision-making, but we believe that our framework may at least provide a basis for discussions as policy and research enthusiasm for ideas of responsible innovation grows.
The framework we have suggested does not pretend to be an off-the-shelf quick fix for responsible governance. It joins and seeks to constructively inform an emerging debate on responsible research and innovation. Our framework draws on insights and experiences from the recent governance developments (Real-Time Technology Assessment, Constructive Technology Assessment, upstream engagement, midstream modulation, etc.) we describe above. It seeks to shape a constructive engagement between questions of responsibility and innovation that interrogates the purposes of innovation alongside the more conventional preoccupation with the products of innovation. The framework allows scientists and decision makers to build on past lessons rather than reinventing responsibilities for each particular emerging technology.
Recent institutional inclinations towards responsible innovation, though under-conceptualised at present, can be seen as part of a move towards a new governance of science. Responsible innovation is seen by some as a response to a particular authority gap. It is therefore important to interrogate the legitimacy of our framework (see also Randles et al., 2012). Lövbrand et al. (2011) point to a legitimacy gap in deliberative engagement on science and technology issues. There are problems of both input legitimacy (how processes are set up and run) and output legitimacy (the efficacy of governance). They claim that: "The science and technology studies literature still offers little guidance on institutional design ... [and is] often weary of institutional realities" (Lövbrand et al., 2011, p. 480).
We would hope that our framework provides a counterexample to this assertion. In addition, we heed Lövbrand et al.'s (2011; see also Chilvers, 2012) call for self-reflection on the grounds on which legitimacy is based.6
We could, as von Schomberg (2011a) has done in the European context, anchor responsible innovation to the pursuit of particular values: in his case, the values that drive European Union policy. But in different areas of innovation, and in different cultural contexts, different values will be more or less pertinent, and they may be conflicted. In our analysis, we have therefore been reticent to explicitly define the normative ends of responsible innovation (what von Schomberg calls the 'right impacts').
Our approach, in line with the concepts of metagovernance described above, has concentrated on the means of governance such that an improved, more democratic or more legitimate consideration of ends becomes possible, and in ways that are attentive to the distinctive social and ethical stakes that are associated with particular scientific and technological developments. In this sense, we have second-order normative commitments to democratisation, which we see as vital for the good governance of science and innovation. We support the feasibility and desirability of shaping or steering science and innovation, as opposed to letting the future take care of itself.
It is not the purpose of this paper to explore the first-order normative question of desirable ends, although we would argue that such a discussion is important. Our aims are modest and incremental. We are providing neither a toolkit nor a manifesto, but rather one input into a broader discussion that is highly likely to shape research policy (particularly in the European Union) in the coming years.
Responsible innovation will inevitably be a dynamic concept enacted at multiple levels (see Fisher and Rip, 2013). We have considered it at a 'meso-level', emphasising the leadership role of Research Councils in developing and implementing the framework that we describe. The legitimacy of our framework is perhaps therefore best imagined in the spirit of experimentation suggested by Lövbrand et al. (2011; see also Stilgoe, 2012).

6 We are grateful to an anonymous reviewer for asking us to explore our normativity in this regard.
We see the suggested framework as a way to guide governance developments in order to enable social learning and empower social agency. We would suggest that the framework goes beyond previous deliberative experiments so that governance institutions and structures become part of the experimental apparatus. Ongoing experiments (including our own) should not be taken as evidence of implementation, and the ease with which 'responsible (research and) innovation' can be inserted into policy documents should remind us of the risks of instrumentalising the phrase (see Owen et al., 2012 for more discussion).
Chilvers (2012) has argued that, while there may be substantive enthusiasm for an opening up of debates around science and emerging technologies among individual governance actors, institutional and governance pressures typically close down such processes such that they are used in an instrumental way (following Fiorino's (1989) definition).
We have discussed elsewhere some features and underlying policy motivations of the evolving national and European discussions of responsible innovation in research policy (Owen et al., 2012). Reflections on the 'responsible' in responsible innovation are prompting new discussions about remit, role, division of labour and how trans-disciplinary programmes of science and innovation within, for example, the European Research Area should be configured and resourced. These discussions not only re-ignite an older debate about scientific autonomy but also offer new opportunities for creating value.
The ways in which the concept of responsible innovation is being constituted should themselves be opened up to broad anticipation, reflection and inclusive deliberation, with the aim of making policy more responsive.
Acknowledgements
This paper has benefitted from many audiences and exchanges. In particular, we would like to thank the following individuals for their help and support with the development of the ideas that led to this paper: Jason Chilvers, Nick Cook, Atti Emecz, Peter Ferris, Erik Fisher, Andy Gibbs, Nicola Goldberg, Chris Groves, Dave Guston, Jeroen van den Hoven, Hugh Hunt, Richard Jones, Matthew Kearnes, Kirsty Kuo, Claire Marris, Judith Petts, Tom Rodden, Dan Sarewitz, Rene Von Schomberg, Andy Stirling, Alison Wall, Matt Watson and Brian Wynne. The paper has benefitted from comments on an earlier draft by three anonymous reviewers and by the editors of Research Policy. This research was supported by the UK Engineering and Physical Sciences Research Council and Economic and Social Research Council, Grant No. RES-077-26-0001. Any errors of judgement or shortcomings remain, of course, our own.
References
Adam, B., Groves, G., 2011. Futures tended: care and future-oriented responsibility. Bulletin of Science, Technology & Society 31, 17–27.
American Meteorological Society, 2009. Geoengineering the Climate System: A Policy Statement of the American Meteorological Society. Washington, DC. Downloaded from: http://www.ametsoc.org/policy/2009geoengineering climate amsstatement.html (01.02.13).
Arthur, W., 1989. Competing technologies, increasing returns, and lock-in by historical events. Economic Journal 99, 116–131.
Bäckstrand, K., 2006. Multi-stakeholder partnerships for sustainable development: rethinking legitimacy, accountability and effectiveness. European Environment 16, 290–306.
Barben, D., Fisher, E., Selin, C., Guston, D., 2008. Anticipatory governance of nanotechnology: foresight, engagement, and integration. In: Hackett, E., Lynch, M., Wajcman, J. (Eds.), The Handbook of Science and Technology Studies, third ed. MIT Press, Cambridge, MA, pp. 979–1000.
Beck, U., 1992. The Risk Society: Towards a New Modernity. Sage, London.
Beck, U., 2000. Risk society revisited: theory, politics and research programmes. In: Adam, B., Beck, U., Van Loon, J. (Eds.), The Risk Society and Beyond: Critical Issues for Social Theory. Sage, London, pp. 211–230.
Beck, U., Giddens, A., Lash, S., 1994. Reflexive Modernisation: Politics, Tradition and Aesthetics in the Modern Social Order. Polity, Cambridge.
Bernal, J.D., 1939. The Social Function of Science. Routledge, London.
Berne, R., 2006. Nanotalk: Conversations with Scientists and Engineers about Ethics, Meaning, and Belief in the Development of Nanotechnology. Lawrence Erlbaum Associates, Mahwah, NJ.
Borup, M., Brown, N., Konrad, K., Van Lente, H., 2006. The sociology of expectations in science and technology. Technology Analysis & Strategic Management 18, 285–298.
Boyd, P., 2008. Ranking geo-engineering schemes. Nature Geoscience 1, 722–724.
Bipartisan Policy Centre Task Force on Climate Remediation Research, 2011. Geoengineering: A National Strategic Plan for Research on the Potential Effectiveness, Feasibility, and Consequences of Climate Remediation Technologies. Bipartisan Policy Centre, Washington, DC. Downloaded from: http://www.bipartisanpolicy.org/library/report/task-force-climate-remediation-research (01.02.13).
Brown, W., 2002. Inclusive governance practices in nonprofit organizations and implications for practice. Nonprofit Management and Leadership 12, 369–385.
Brown, N., 2003. Hope against hype: accountability in biopasts, presents and futures. Science Studies 16, 3–21.
Brown, N., Michael, M., 2003. A sociology of expectations: retrospecting prospects and prospecting retrospects. Technology Analysis and Strategic Management 15, 3–18.
Brown, N., Rappert, B., Webster, A. (Eds.), 2000. Contested Futures: A Sociology of Prospective TechnoScience. Ashgate, Aldershot, UK.
Busch, L., 2011. Standards: Recipes for Reality. MIT Press, Cambridge, MA.
Bush, V., 1945. Science, the Endless Frontier: A Report to the President. U.S. Government Printing Office, Washington, DC.
Buur, J., Matthews, B., 2008. Participatory innovation. International Journal of Innovation Management 12, 255–273.
Caldeira, K., Keith, D., 2010. The need for climate engineering research. Issues in Science and Technology 27, 57–62.
Callon, M., 1987. Society in the making: the study of technology as a tool for sociological analysis. In: Bijker, W., Hughes, T., Pinch, T. (Eds.), The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. MIT Press, London, pp. 83–103.
Callon, M., Lascoumes, P., Barthe, Y., 2009. Acting in an Uncertain World: An Essay on Technical Democracy. MIT Press, Cambridge, MA.
Chesbrough, H., 2003. Open Innovation: The New Imperative for Creating and Profiting from Technology. Harvard Business School Press, Boston, MA.
Chilvers, J., 2009. Deliberative and participatory approaches in environmental geography. In: Castree, N., Demeritt, D., Liverman, D., Rhoads, B. (Eds.), The Companion to Environmental Geography. Blackwell, London, pp. 400–417.
Chilvers, J., 2010. Sustainable Participation? Mapping Out and Reflecting on the Field of Public Dialogue in Science and Technology. Sciencewise-ERC, Harwell.
Chilvers, J., 2012. Reflexive engagement? Actors, learning, and reflexivity in public dialogue on science and technology. Science Communication, http://dx.doi.org/10.1177/1075547012454598.
Collingridge, D., 1980. The Social Control of Technology. Open University Press, Milton Keynes, UK.
Cooke, B., Kothari, U. (Eds.), 2001. Participation: The New Tyranny? Zed Books, London and New York.
Cooper, R., 1990. Stage-gate systems: a new tool for managing new products. Business Horizons 33, 44–54.
Corner, A., Parkhill, K., Pidgeon, N., 2011. 'Experiment Earth?' Reflections on a Public Dialogue on Geoengineering. Understanding Risk Working Paper 11-02. School of Psychology, Cardiff University, Cardiff.
Cressey, D., 2012. Cancelled project spurs debate over geoengineering patents. Nature 485, 429.
David, P., 2001. Path dependence, its critics and the quest for 'historical economics'. In: Garrouste, P., Ioannides, S. (Eds.), Evolution and Path Dependence in Economic Ideas: Past and Present. Edward Elgar Publishing, Cheltenham, UK.
Davidson, P., Hunt, H., Burgoyne, C., 2011. Atmospheric delivery system. GB 2476518, patent application, published 29th June 2011. Downloaded from: http://worldwide.espacenet.com/publicationDetails/biblio?CC=GB&NR=2476518 (01.02.13).
Doubleday, R., 2007. The laboratory revisited: academic science and the responsible governance of nanotechnology. NanoEthics 1, 167–176.
Douglas, H., 2003. The moral responsibilities of scientists (tensions between autonomy and responsibility). American Philosophical Quarterly 40, 59–68.
Dryzek, J., 2011. Foundations and Frontiers of Deliberative Governance. Oxford University Press, Oxford.
EEA (European Environment Agency), 2001. Late Lessons from Early Warnings: The Precautionary Principle 1896–2000. Office for Official Publications of the European Communities, Luxemburg.
EEA (European Environment Agency), 2013. Late Lessons from Early Warnings: Science, Precaution, Innovation. Office for Official Publications of the European Communities, Luxemburg.
EPSRC/NERC/LWEC, 2010. Climate Geoengineering Sandpit, 15–19 March 2010. Downloaded from: http://www.epsrc.ac.uk/SiteCollectionDocuments/Calls/2010/CallForParticipantsGEOENG.pdf (01.02.13).
EPSRC/NERC/LWEC, 2009. Geoengineering Scoping Workshop Outputs. Downloaded from: http://www.epsrc.ac.uk/SiteCollectionDocuments/Publications/reports/ReportOfGeoengineeringScopingWorkshop.pdf (01.02.13).
European Commission, 2008. Commission Recommendation of 7 February 2008 on a code of conduct for responsible nanosciences and nanotechnologies research. Brussels, 07/02/2008, C(2008) 424 final. Downloaded from: http://ec.europa.eu/nanotechnology/pdf/nanocode-rec pe0894c en.pdf (01.02.13).
Felt, U., Wynne, B., et al., 2007. Taking European Knowledge Society Seriously. Report of the Expert Group on Science and Governance to the Science, Economy and Society Directorate, Directorate-General for Research. European Commission, Brussels. Downloaded from: http://ec.europa.eu/research/science-society/document library/pdf 06/european-knowledge-society en.pdf (01.02.13).
Fiorino, D., 1989. Environmental risk and democratic process: a critical review. Columbia Journal of Environmental Law 14, 501–547.
Fisher, E., 2007. Ethnographic invention: probing the capacity of laboratory decisions. NanoEthics 1, 155–165.
Fisher, E., Rip, A., 2013. Responsible innovation: multi-level dynamics and soft intervention practices. In: Owen, R., Bessant, J., Heintz, M. (Eds.), Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Wiley, London, pp. 165–183.
Fisher, E., Mahajan, R., Mitcham, C., 2006. Midstream modulation of technology: governance from within. Bulletin of Science, Technology & Society 26, 485–496.
Flanagan, K., Uyarra, E., Laranja, M., 2011. Reconceptualising the 'policy mix' for innovation. Research Policy 40, 702–713.
Fortun, M., 2001. Mediated speculations in the genomics futures markets. New Genetics and Society 20, 139–156.
Fortun, M., 2005. For an ethics of promising, or: a few kind words about James Watson. New Genetics and Society 24, 157–174.
Friedman, B., 1996. Value-sensitive design. ACM Interactions 3 (6), 17–23.
Fujimura, J., 1987. Constructing 'do-able' problems in cancer research: articulating alignment. Social Studies of Science 17, 257–293.
Fujimura, J., 2003. Future imaginaries: genome scientists as socio-cultural entrepreneurs. In: Goodman, A., Heath, D., Lindee, S. (Eds.), Genetic Nature/Culture: Anthropology and Science beyond the Two-Culture Divide. University of California Press, Berkeley, CA, pp. 176–199.
Goodin, R., Dryzek, J., 2006. Deliberative impacts: the macro-political uptake of mini-publics. Politics & Society 34, 219–244.
Grin, J., Grunwald, A. (Eds.), 2000. Vision Assessment: Shaping Technology in 21st Century Society. Towards a Repertoire for Technology Assessment. Springer, Berlin.
Grinbaum, A., Groves, C., 2013. What is "responsible" about responsible innovation? Understanding the ethical issues. In: Owen, R., Bessant, J., Heintz, M. (Eds.), Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Wiley, London, pp. 119–142.
Groves, C., 2006. Technological futures and non-reciprocal responsibility. International Journal of the Humanities 4, 57–61.
Grove-White, R., Macnaghten, P., Mayer, S., Wynne, B., 1997. Uncertain World: Genetically Modified Organisms, Food and Public Attitudes in Britain. Centre for the Study of Environmental Change, Lancaster, UK.
Grove-White, R., Macnaghten, P., Wynne, B., 2000. Wising Up: The Public and New Technologies. Centre for the Study of Environmental Change, Lancaster, UK.
Guston, D., 2006. Toward centres for responsible innovation in the commercialized university. Public Science in a Liberal Democracy: The Challenge to Science and Democracy. University of Toronto Press, Toronto.
Guston, D., 2012. The pumpkin or the tiger? Michael Polanyi, Frederick Soddy, and anticipating emerging technologies. Minerva 50, 363–379.
Guston, D., Sarewitz, D., 2002. Real-time technology assessment. Technology in Society 24, 93–109.
Hacking, I., 1986. Culpable ignorance of interference effects. In: MacLean, D. (Ed.), Values at Risk. Rowman and Allanheld, Totowa, NJ, pp. 136–154.
Hajer, M., 2003. Policy without polity? Policy analysis and the institutional void. Policy Sciences 36, 175–195.
Hajer, M., 2009. Authoritative Governance: Policy Making in the Age of Mediatization. Oxford University Press, Oxford.
Hajer, M., Wagenaar, H. (Eds.), 2003. Deliberative Policy Analysis: Understanding Governance in the Network Society. Cambridge University Press, Cambridge.
Hamilton, C., 2013. Earthmasters: The Dawn of the Age of Climate Engineering. Yale University Press, London.
Hedgecoe, A., 2010. Bioethics and the reinforcement of socio-technical expectations. Social Studies of Science 40, 163–186.
Hedgecoe, A., Martin, P., 2003. The drugs don't work: expectations and the shaping of pharmacogenetics. Social Studies of Science 33, 327–364.
Hellström, T., 2003. Systemic innovation and risk: technology assessment and the challenge of responsible innovation. Technology in Society 25, 369–384.
Henwood, K., Pidgeon, N., 2013. What is the Relationship between Identity and Technological, Economic, Demographic, Environmental and Political Change Viewed through a Risk Lens? Government Office for Science, London. Downloaded from: http://www.bis.gov.uk/assets/foresight/docs/identity/13-519-identity-and-change-through-a%20risk-lens.pdf (01.02.13).
Hessels, L., van Lente, H., Smits, R., 2009. In search of relevance: the changing contract between science and society. Science and Public Policy 36, 387–401.
Hilgartner, S., 2009. Intellectual property and the politics of emerging technology: inventors, citizens, and powers to shape the future. Chicago-Kent Law Review 84, 197–224.
HM Treasury/DTI/DfES, 2004. Science and Innovation Investment Framework. HM Treasury, London.
Hoffmann-Riem, H., Wynne, B., 2002. In risk assessment, one has to admit ignorance. Nature 416, 123.
Hands Off Mother Earth (HOME), 2011. SPICE Opposition Letter. Downloaded from: http://www.handsoffmotherearth.org/hose-experiment/spice-opposition-letter (01.02.13).
Horlick-Jones, T., Walls, J., Rowe, G., Pidgeon, N., Poortinga, W., Murdock, G., O'Riordan, T., 2007. The GM Debate: Risk, Politics and Public Engagement. Routledge, London.
House of Lords, 2000. Select Committee on Science and Technology, Third Report: Science and Society. HMSO, London.
Illies, C., Meijers, A., 2009. Artefacts without agency. The Monist 92, 420–440.
Ipsos-MORI, 2010. Experiment Earth: Report on a Public Dialogue on Geoengineering. Natural Environment Research Council, Swindon. Downloaded from: www.nerc.ac.uk/about/consult/geoengineering-dialogue-final-report.pdf (01.02.13).
Irwin, A., 2006. The politics of talk: coming to terms with the 'new' scientific governance. Social Studies of Science 36, 299–330.
Irwin, A., Jensen, T., Jones, K., 2013. The good, the bad and the perfect: criticizing engagement practice. Social Studies of Science 43, 118–135.
Jasanoff, S., 2003. Technologies of humility: citizen participation in governing science. Minerva 41, 223–244.
Jonas, H., 1984. The Imperative of Responsibility. University of Chicago Press, Chicago.
Kaiser, D., Moreno, J., 2012. Dual-use research: self-censorship is not enough. Nature 492, 345–347.
Karinen, R., Guston, D., 2010. Toward anticipatory governance: the experience with nanotechnology. Governing future technologies. Sociology of the Sciences Yearbook 27, 217–232.
Kearnes, M., Rip, A., 2009. The emerging governance landscape of nanotechnology. In: Gammel, S., Lösch, A., Nordmann, A. (Eds.), Jenseits von Regulierung: Zum politischen Umgang mit der Nanotechnologie. Akademische Verlagsgesellschaft, Berlin, pp. 97–121.
Kearnes, M., Wienroth, M., 2011. A New Mandate? Research Policy in a Technological Society. Research Report. Durham University, Durham, UK.
Kerr, A., Cunningham-Burley, S., Tutton, R., 2007. Shifting subject positions: experts and lay people in public dialogue. Social Studies of Science 37, 385–411.
Kjølberg, K., 2010. The Notion of 'Responsible Development' in New Approaches to Governance of Nanosciences and Nanotechnologies. PhD dissertation, University of Bergen, Norway.
Kooiman, J., Jentoft, S., 2009. Meta-governance: values, norms and principles, and the making of hard choices. Public Administration 87, 818–836.
Latour, B., 1987. Science in Action: How to Follow Scientists and Engineers through Society. Open University Press, Milton Keynes, UK.
Latour, B., 2008. "It's Development, Stupid!" or: How to Modernize Modernization. Downloaded from: http://test.espacestemps.net/articles/itrsquos-development-stupid-or-how-to-modernize-modernization/ (01.02.13).
Lave, R., Mirowski, P., Randalls, S., 2010. Introduction: STS and neoliberal science. Social Studies of Science 40, 659–675.
Lee, R., 2012. Look at mother nature on the run in the 21st century: responsibility, research and innovation. Transnational Environmental Law 1, 105–117.
Lezaun, J., Soneryd, L., 2007. Consulting citizens: technologies of elicitation and the mobility of publics. Public Understanding of Science 16, 279–297.
Lövbrand, E., Pielke, R., Beck, S., 2011. A democracy paradox in studies of science and technology. Science, Technology & Human Values 36, 474–496.
Lund Declaration, 2009. Conference: New Worlds, New Solutions. Research and Innovation as a Basis for Developing Europe in a Global Context, Lund, Sweden, 7–8 July 2009. Downloaded from: http://www.vr.se/download/18.7dac901212646d84fd38000338/1264064614339/New Worlds New Solutions Background.pdf (01.02.13).
Lynch, M., 2000. Against reflexivity as an academic virtue and source of privileged knowledge. Theory, Culture & Society 17, 26–54.
Macnaghten, P., 2010. Researching technoscientific concerns in the making: narrative structures, public responses and emerging nanotechnologies. Environment & Planning A 41, 23–37.
Macnaghten, P., Chilvers, J., 2013. The future of science governance: publics, policies, practices. Environment and Planning C: Government and Policy (forthcoming).
Macnaghten, P., Guivant, J., 2011. Converging citizens? Nanotechnology and the political imaginary of public engagement in Brazil and the United Kingdom. Public Understanding of Science 20, 207–220.
Macnaghten, P., Owen, R., 2011. Good governance for geoengineering. Nature 479, 293.
Macnaghten, P., Szerszynski, B., 2013. Living the global social experiment: an analysis of public discourse on solar radiation management and its implications for governance. Global Environmental Change 23, 465–474.
Martin, B., 2010. The origins of the concept of 'foresight' in science and technology: an insider's perspective. Technological Forecasting and Social Change 77, 1438–1447.
Michael, M., Brown, N., 2005. On doing scientific citizenships: reflections on xenotransplantation's publics. Science as Culture 14, 39–57.
Miller, C., Bennett, I., 2008. Thinking longer term about technology: is there value in science fiction-inspired approaches to constructing futures? Science and Public Policy 35, 597–606.
Mitcham, C., 2003. Co-responsibility for research integrity. Science and Engineering Ethics 9, 273–290.
Moore, A., 2010. Beyond participation: opening up political theory in STS. Social Studies of Science 40, 793–799.
Morlacchi, P., Martin, B., 2009. Emerging challenges for science, technology and innovation policy research: a reflexive overview. Research Policy 38, 571–582.
Muniesa, F., Lenglet, M., 2013. Responsible innovation in finance: directions and implications. In: Owen, R., Bessant, J., Heintz, M. (Eds.), Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Wiley, London, pp. 185–198.
Nature, 2012. A charter for geoengineering. Nature 485, 415 (Editorial).
Nelkin, D., 2001. Beyond risk: reporting about genetics in the post-Asilomar press. Perspectives in Biology and Medicine 44, 199–207.
Nuffield Council on Bioethics, 2012. Emerging Biotechnologies: Technology, Choice and the Public Good. Nuffield Council on Bioethics, London. Downloaded from: http://www.nuffieldbioethics.org/sites/default/files/Emerging biotechnologies full report web 0.pdf (01.02.13).
Olson, R., 2012. Soft geoengineering: a gentler approach to addressing climate change. Environment: Science and Policy for Sustainable Development 54 (5), 29–39.
Owen, R., 2011. Legitimate conditions for climate engineering. Environmental Science and Technology 45, 9116–9117.
Owen, R., Goldberg, N., 2010. Responsible innovation: a pilot study with the UK Engineering and Physical Sciences Research Council. Risk Analysis 30, 1699–1707.
Owen, R., Macnaghten, P., Stilgoe, J., 2012. Responsible research and innovation: from science in society to science for society, with society. Science and Public Policy 39, 751–760.
Parkhill, K., Pidgeon, N., 2011. Public engagement on geoengineering research: preliminary report on the SPICE deliberative workshops. Understanding Risk Group Working Paper 11-01. Cardiff University School of Psychology, Cardiff. Downloaded from: http://psych.cf.ac.uk/understandingrisk/docs/spice.pdf (01.02.13).
Pellizzoni, L., 2004. Responsibility and environmental governance. Environmental Politics 13, 541–565.
Pidgeon, N., Parkhill, K., Corner, A., Vaughan, N., 2013. Deliberating stratospheric aerosols for climate geoengineering and the SPICE project. Nature Climate Change 3 (5), 451–457.
Pinch, T., Bijker, W., 1984. The social construction of facts and artefacts: or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science 14, 388–441.
Polanyi, M., 1962. The republic of science: its political and economic theory. Minerva 1, 54–73.
Popper, K., 1963. Conjectures and Refutations. Routledge and Kegan Paul, London.
Powell, W., Koput, K., Smith-Doerr, L., 1996. Interorganizational collaboration and the locus of innovation: networks of learning in biotechnology. Administrative Science Quarterly 41, 116–145.
Prainsack, B., Buyx, A., 2012. Solidarity in contemporary bioethics: towards a new approach. Bioethics 26, 343–350.
Randles, S., Youtie, J., et al., 2012. A trans-Atlantic conversation on responsible innovation and responsible governance. In: Van Lente, H., Coenen, C., et al. (Eds.), Little by Little: Expansions of Nanoscience and Emerging Technologies. Akademische Verlagsgesellschaft, Heidelberg, pp. 169–180.
Ravetz, J., 1997. The science of 'what-if?'. Futures 29, 533–539.
Raymond, E., 1999. The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. O'Reilly and Associates, Inc., Sebastopol, CA.
Rayner, S., Heyward, C., Kruger, T., Pidgeon, N., Redgwell, C., Savulescu, J., 2013. The Oxford principles. Climatic Change (in press).
Royal Commission on Environmental Pollution (RCEP), 1998. 21st Report: Setting Environmental Standards. The Stationery Office, London.
Richardson, H., 1999. Institutionally divided moral responsibility. In: Paul, E., Miller, F., Paul, J. (Eds.), Responsibility. Cambridge University Press, Cambridge, pp. 218–249.
Rip, A., 2011. Responsible innovation, responsible governance: position statement. In: Third Annual Conference for the Society for the Study of Nanotechnology and Emerging Technologies, Tempe, AZ, 7–10 November.
Rip, A., Misa, T., Schot, J. (Eds.), 1995. Managing Technology in Society: The Approach of Constructive Technology Assessment. Thomson, London.
Robinson, D., 2009. Co-evolutionary scenarios: an application to prospecting futures of the responsible development of nanotechnology. Technological Forecasting and Social Change 76, 1222–1239.
Rogers, M., 1975. The Pandora's Box Congress. Rolling Stone Magazine.
Rogers-Hayden, T., Pidgeon, N., 2007. Moving engagement "upstream"? Nanotechnologies and the Royal Society and Royal Academy of Engineering's inquiry. Public Understanding of Science 16, 345–364.
Rose, N., 1999. Powers of Freedom: Reframing Political Thought. Cambridge University Press, Cambridge.
Rothstein, H., 2007. Talking shop or talking turkey? Institutionalizing consumer representation in risk regulation. Science, Technology & Human Values 32, 582–607.
Rowe, G., Frewer, L., 2000. Public participation methods: a framework for evaluation. Science, Technology & Human Values 25, 3–29.
Rowe, G., Frewer, L., 2005. A typology of public engagement mechanisms. Science, Technology & Human Values 30, 251–290.
Royal Society, 2009. Geoengineering the Climate: Science, Governance and Uncertainty. Royal Society, London.
Royal Society/Royal Academy of Engineering, 2004. Nanoscience and Nanotechnologies: Opportunities and Uncertainties. Royal Society, London.
Schot, J., Geels, F., 2008. Strategic niche management and sustainable innovation journeys: theory, findings, research agenda, and policy. Technology Analysis & Strategic Management 20, 537–554.
Schuurbiers, D., 2011. What happens in the lab: applying midstream modulation to enhance critical reflection in the laboratory. Science and Engineering Ethics 17, 769–788.
Schuurbiers, D., Fisher, E., 2009. Lab-scale intervention. Science and society series on convergence research. EMBO Reports 10, 424–427.
Selin, C., 2007. Expectations and the emergence of nanotechnology. Science, Technology & Human Values 32, 196–220.
Selin, C., 2011. Negotiating plausibility: intervening in the future of nanotechnology. Science and Engineering Ethics 17, 723–737.
Smith, A., Stirling, A., Berkhout, F., 2005. The governance of sustainable socio-technical transitions. Research Policy 34, 1491–1510.
SPICE (Stratospheric Particle Injection for Climate Engineering), 2010. The SPICE Project, Downloaded from: http://www2.eng.cam.ac.uk/hemh/SPICE/SPICE.htm (01.02.13).
Solar Radiation Management Governance Initiative (SRMGI), 2011. Solar Radiation Management: The Governance of Research. The Royal Society, London, Downloaded from: http://www.srmgi.org/files/2012/01/DES2391_SRMGI-report_web_11112.pdf (01.02.13).
Steneck, N., 2006. Fostering integrity in research: definitions, current knowledge, and future directions. Science and Engineering Ethics 12, 53–74.
Stilgoe, J., 2011. A question of intent. Nature Climate Change 1, 325–326.
Stilgoe, J., 2012. Experiments in science policy: an autobiographical note. Minerva 50, 197–204.
Stilgoe, J., Watson, M., Kuo, K. From Bio to Geo: Learning from Public Engagement with New Technologies. PLoS Biology, submitted for publication.
Stirling, A., 2006. Analysis, participation and power: justification and closure in participatory multi-criteria analysis. Land Use Policy 23, 95–107.
Stirling, A., 2007. A general framework for analysing diversity in science, technology and society. Journal of the Royal Society Interface 4, 707–719.
Stirling, A., 2008. “Opening up” and “closing down”: power, participation, and pluralism in the social appraisal of technology. Science, Technology & Human Values 33, 262–294.
Stirling, A., 2010. Keep it complex. Nature 468, 1029–1031.
Sutcliffe, H., 2011. A Report on Responsible Research and Innovation for the European Commission. MATTER, London, Downloaded from: http://ec.europa.eu/research/science-society/document_library/pdf_06/rri-report-hilary-sutcliffe_en.pdf (01.02.13).
Swierstra, T., Rip, A., 2007. Nano-ethics as NEST-ethics: patterns of moral argumentation about new and emerging science and technology. NanoEthics 3, 3–20.
Swierstra, T., Stemerding, D., Boenink, M., 2009. Exploring techno-moral change: the case of the obesity pill. Humanities, Social Science and Law 3, 119–138.
te Kulve, H., Rip, A., 2011. Constructing productive engagement: pre-engagement tools for emerging technologies. Science and Engineering Ethics 17, 699–714.
TNS-BMRB, 2010. Synthetic Biology Dialogue. Sciencewise, London.
Toffler, A., 1970. Future Shock. Random House, New York.
van den Hoven, M.J., 1998. Moral responsibility, public office and information technology. In: Snellen, I., Van De Donk, W. (Eds.), Public Administration in an Information Age. IOS Press, Amsterdam.
van den Hoven, M.J., Lokhorst, G., van de Poel, I., 2012. Engineering and the problem of moral overload. Science and Engineering Ethics 18, 1–13.
van Lente, H., 1993. Promising Technology: The Dynamics of Expectations in Technological Developments. Universiteit Twente, The Netherlands (PhD dissertation).
van Lente, H., Rip, A., 1998. The rise of membrane technology: from rhetorics to social reality. Social Studies of Science 28, 221–254.
van Oudheusden, M., 2011. Questioning ‘participation’: a critical appraisal of its conceptualization in a Flemish participatory technology assessment. Science and Engineering Ethics 17, 673–690.
von Hippel, E., 1976. The dominant role of users in the scientific instrument innovation process. Research Policy 5, 212–239.
von Hippel, E., 2005. Democratizing Innovation, Downloaded from: http://web.mit.edu/evhippel/www/democ1.htm (01.02.13).
von Schomberg, R., 2007. From the Ethics of Technology towards an Ethics of Knowledge Policy & Knowledge Assessment. A Working Document from the European Commission Services, EUR 22429, Brussels, Downloaded from: http://ec.europa.eu/research/science-society/pdf/ethicsofknowledgepolicy_en.pdf (01.02.13).
von Schomberg, R., 2011a. Prospects for technology assessment in a framework of responsible research and innovation. In: Dusseldorp, M., Beecroft, R. (Eds.), Technikfolgen Abschätzen Lehren: Bildungspotenziale Transdisziplinärer Methoden. VS Verlag, Wiesbaden.
von Schomberg, R. (Ed.), 2011b. Towards Responsible Research and Innovation in the Information and Communication Technologies and Security Technologies Fields. European Commission, Brussels, Downloaded from: http://ec.europa.eu/research/science-society/document_library/pdf_06/mep-rapport-2011_en.pdf (01.02.13).
von Schomberg, R., 2011c. On identifying plausibility and deliberative public policy. Science and Engineering Ethics 17, 739–742.
von Schomberg, R., 2013. A vision of responsible research and innovation. In: Owen, R., Bessant, J., Heintz, M. (Eds.), Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Wiley, London, pp. 51–74.
Watson, M., 2011. The Reluctant Geoengineer Blog, Downloaded from: http://thereluctantgeoengineer.blogspot.co.uk/ (01.02.13).
Weart, S., 1976. Scientists with a secret. Physics Today 29, 23–30.
Williams, B., 1981. Moral Luck. Cambridge University Press, Cambridge.
Williams, R., Edge, D., 1996. What is the social shaping of technology? Research Policy 25, 856–899.
Wilsdon, J., Willis, R., 2004. See-Through Science. Demos, London.
Winner, L., 1977. Autonomous Technology: Technics Out of Control as a Theme in Political Thought. MIT Press, Cambridge, MA.
Winner, L., 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology. University of Chicago Press, Chicago.
Wright, S., 2001. Legitimating genetic engineering. Perspectives in Biology and Medicine 44, 235–247.
Wynne, B., 1992. Misunderstood misunderstandings: social identities and the public uptake of science. Public Understanding of Science 1, 281–304.
Wynne, B., 1993. Public uptake of science: a case for institutional reflexivity. Public Understanding of Science 2, 321–337.
Wynne, B., 2002. Risk and environment as legitimatory discourses of science and technology: reflexivity inside-out? Current Sociology 50, 459–477.
Wynne, B., 2003. Seasick on the third wave? Subverting the hegemony of propositionalism. Social Studies of Science 33, 401–418.
Wynne, B., 2005. Risk as globalizing discourse? Framing subjects and citizens. In: Leach, M., Scoones, I., Wynne, B. (Eds.), Science and Citizens: Globalization and the Challenge of Engagement. Zed Books, London and New York.
Wynne, B., 2006. Public engagement as a means of restoring public trust in science – hitting the notes, but missing the music? Community Genetics 9, 211–220.
Wynne, B., 2011. Lab work goes social, and vice-versa: strategising public engagement processes. Science and Engineering Ethics 17, 791–800.