Enacting catastrophe: preparedness, insurance, budgetary rationalization

Stephen J. Collier
Graduate Program in International Affairs, The New School, 66 West 12th Street, Room 612, New York, NY 10011, USA. E-mail: CollierS@newschool.edu

Economy and Society, Volume 37, Number 2, May 2008, pp. 224-250. DOI: 10.1080/03085140801933280
Abstract
This article examines ‘enactment’ as a significant new form of knowledge about
collective life that differs fundamentally from familiar forms of ‘social’ knowledge.
The emergence of enactment is traced through a series of domains where the
problem of estimating the likelihood and consequence of potentially catastrophic
future events has been made an explicit object of expert reflection: response to a
possible nuclear attack in US civil defence planning in the late 1940s; the emergence
of natural hazard modelling in the 1960s and 1970s; and the emergence today of
terrorism risk assessment and its proposed application to federal budgetary
distributions. The article engages with central questions in debates around ‘risk
society’ and insurance, holding that new approaches to understanding and assessing
risk are not merely idiosyncratic or subjective. Rather, they should be treated as
coherent new forms of knowledge and practice whose genealogy and present
assemblies must be traced.
Keywords: insurance; risk society; security; terrorism; modelling; natural disasters.
Introduction
In recent years events such as terror attacks, natural disasters and technological
accidents have received growing attention from critical social scientists.[1] A
central question in this literature is whether such events exceed the mechanism
of risk spreading through insurance that was so essential to a certain kind of
security over the twentieth century. This problem of insurability has served as
a lens for broader issues: do such risks exceed mechanisms of calculated
mitigation altogether? How do they transform the apparatuses of collective
security that characterized industrial modernity?
The present article addresses these questions by shifting attention away
from insurability and calculability per se and towards an analysis of alternative
mechanisms for knowing and assessing risks. The knowledge form classically
associated with insurance, which uses statistics to analyse an ‘archive’ of past
events, is only one way through which uncertain threats can be known. In
what follows, I contrast this archival statistical knowledge with enactment-based
knowledge produced by ‘acting out’ uncertain future threats in order to
understand their impact.[2]
I will trace the genealogy of enactment through a
series of historical moments in the United States: from civil defence
preparedness to disaster insurance to budgetary rationalization in contempor-
ary domestic security. In tracing this genealogy, I do not mean to suggest that
enactment-based knowledge has superseded archival statistical knowledge or
that enactment is the paradigmatic knowledge form of contemporary collective
security. It is, however, a significant new approach to producing knowledge
about collective life, one that is increasingly important in the diverse emerging
assemblages of risk, rationality, and security.
The article begins by examining the critical social science discussion
concerning catastrophe risk and insurance. I focus in particular on a central
point of reference in these discussions, Ulrich Beck’s ‘insurability thesis’.
Beck’s thesis involves two interlinked claims. First, he argues that con-
temporary societies increasingly face ‘catastrophe’ risks such as industrial
accidents or large-scale terrorism that cannot be covered by private insurance.
Second, Beck argues that this insurance ‘limit’ marks the threshold of risk
society, whose dominant techno-political dynamics are shaped by uncertain
risks that are beyond rational assessment and calculated mitigation.
Beck’s claims have been criticized, both because they are empirically suspect
(insurers do cover some catastrophe risk) and because they paint an overly
epochal picture of the shift from the manageable risks of industrial modernity
to the uncertain and thus unmanageable risks of risk society (Bougen, 2003;
Ericson & Doyle, 2004a; O’Malley, 2004; Rose et al., 2006). These critiques are
persuasive. Nonetheless, I want to insist that Beck offers crucial insights into
contemporary mutations of risk and collective security. Following Beck’s own
suggestion in recent work (Beck & Lau, 2005), his epochal story can, with
some modification, be turned into a set of useful propositions for inquiry. Beck
argues that uncertain risks exceed the limits of calculative and instrumental
rationality in general. It would be more precise and analytically productive to
say that such risks push the limits of a specific form of calculative rationality
based on archival statistical knowledge. This distinction is important, since
what we observe today is not the paralysis of frameworks for rational response
to uncertain threats. Rather, we see the proliferation of such frameworks in
multifarious emerging initiatives whose aim is to generate knowledge about
uncertain future events and to link this knowledge to diverse mechanisms of
mitigation. Thus, as O’Malley (2000) suggests, the central question concerns
not the theoretical status of calculative rationality (or ‘insurability’) per se, but,
rather, how risks judged ‘uncertain’ from one perspective are already being
known and assessed using other approaches.
The remainder of the article examines one approach to assessing ‘uncertain’
risk that has emerged and become institutionally significant in the last fifty
years: enactment. Enactment, it will be shown, is of a fundamentally different
character from archival statistical knowledge (see Table 1). Rather than
drawing on an archive of past events, enactment uses as its basic data an
inventory of elements at risk, information about the vulnerability of these
elements and a model of the threat itself, an event model. And, rather than
using statistical analysis, enactment ‘acts out’ uncertain future threats by
juxtaposing these various forms of data. These elements of enactment have
been noted in some critical scholarship on insurance (Bougen, 2003; Ericson &
Doyle, 2004a). But they have generally been treated as ‘idiosyncratic’ or non-
standard techniques of risk assessment. What is more, since such discussions
have been confined to insurance, they have not examined the diversity of
domains in which enactment has been deployed.
The analysis that follows traces a genealogical progression through which
enactment was articulated and linked to diverse mechanisms of mitigation:
first, civil defence preparedness planning in response to the Soviet nuclear
threat in the early 1950s; second, new approaches to natural hazard modelling,
which linked enactment to insurance from the 1960s to the 1980s; third, and
finally, the contemporary emergence of terrorism risk assessment based on
enactment, and its proposed application to the rationalization of federal
budgetary distributions. These moments in the genealogy of enactment affirm
many propositions in Beck’s risk society theses. In each, archival statistical
forms of knowledge and assessment encounter a ‘limit’. In each, the problem of
estimating the likelihood and consequence of uncertain future events is made
an explicit object of expert reflection and technocratic response. The
genealogy of enactment does not, however, confirm propositions concerning
a generalized crisis of rationality in the face of uncertain threats or a ‘structural
break’ (Beck & Lau, 2005, p. 532) in the status of expert knowledge. Rather, we
find alternative forms of knowledge and assessment, enactment among them,
with coherent genealogies, with bodies of sanctioned expertise that become
authoritative in certain contexts (though they remain disputed in others), and
with their own norms and modes of veridiction.[3] In specific sectors and in
response to events we see moments of re-problematization in which existing
forms are critiqued, redeployed and recombined. My analytic strategy is to
trace these recombinations and the emerging forms of collective security to
which they are giving rise.[4]

Table 1 Archival statistical knowledge versus assessment through enactment

                          Archival statistical knowledge     Enactment
Data                      Archive of past events             Inventory of elements;
                                                             vulnerability data; event model
Technique of assessment   Statistical analysis of the        Enactment of future events
                          distribution of risks over a
                          population
Beck’s ‘insurability’ thesis and styles of reasoning about risk
In his now thoroughly debated work, Beck has characterized contemporary
society as ‘risk society’.[5]
The distinction that Beck marks with this term is
between a society faced with threats whose ‘risk’ can be confidently expressed
as a likelihood of future harm and a society faced with threats whose risk is
uncertain.[6]
He argues that this distinction is made visible by the limit of a
distinctive technology of collective security, namely, insurance and, in
particular, private insurance. Given the centrality of these claims to
contemporary discussions about risk, it bears rehearsing the outlines of Beck’s
‘insurability’ thesis.
The voluminous critical scholarship on insurance has shown that insurance
was applied to problems of collective security in response to a certain class of
events, ‘pathologies of the social’ such as disease, workplace accidents and
poverty, in the nineteenth century (Ewald, 1991, 2002; Foucault, 2007;
Hacking, 1990, 2003; Rabinow, 1989; Rose, 1999).[7]
Through an emerging
practice of applied social science, these pathologies were mapped onto the
regularities of collective life through new knowledge forms that were, in turn,
linked to the insurential mechanism of risk spreading. The data associated with
this ‘social insurance’, that is, insurance deployed as a mechanism of collective
security, might be called archival (Collier & Lakoff, 2008a; Fearnley, 2005). It is
a ‘historical’ record of past events illnesses, crimes, incidents of poverty in a
population. The analytic technique or technique of assessment in social insurance
is statistics, used to understand the distribution of these events and, thus, the
distribution of risks over a population. This archival statistical knowledge
about the population, finally, was linked to the insurance form of risk spreading.
Actuaries could estimate the risks associated with an individual policy and with
a total portfolio, and make decisions about what to insure, whom to insure and
what premiums to charge. This ensemble of elements is portrayed in Table 2.
Table 2 Social insurance

Data                      Archival information about pathologies of the social
                          (disease, accidents)
Technique of analysis     Statistical assessment of risk distribution over a
                          population
Mechanism of mitigation   Insurance-based distribution of risk over a population
                          of rate payers
For Beck (and, it should be noted, for many other critical social scientists)
social insurance is paradigmatic of the modern ‘security pact’.[8]
The
‘systematic creation of consequences’ by industrial modernity, Beck argues,
found a ‘counter principle’ in a ‘social compact against industrially produced
hazards and damages, stitched together out of public and private insurance
agreements’ (Beck, 1992b, p. 100). Through the mechanism of insurance ‘the
incalculable threats of pre-industrial society (plague, famine, natural cata-
strophes, wars, but also magic, gods, demons) [were] transformed into
calculable risks in the course of the development of instrumental rational
control’ (Beck, 1999, p. 76).
9
In his work on risk society, of course, Beck’s primary concern is with the
limits of this security pact. He associates this limit with the emergence of
threats such as nuclear war or environmental catastrophes that do not meet the
basic criteria of insurability based on archival statistical knowledge.[10]
Such
threats are also systematically produced by industrial modernity; indeed, as
Beck points out, they are the products of industrial modernity’s very success, a
point to which we return. But unlike ‘pathologies of the social’, these threats of
risk society exceed industrial modernity’s characteristic mechanism of security.
There is no archive of past events whose analysis might provide a guide to
future events.[11]
As a consequence, Beck argues, ‘standards of normality,
measuring procedures and therefore the basis for calculating the hazards are
abolished. ... [I]ncomparable entities are compared and calculation turns into
obfuscation’ (Beck, 1992b, p. 102). Such risks, in short, exceed mechanisms of
calculated, rational assessment.
Beck draws various implications from these observations, among them a
core thesis concerning a new politics around the distribution of social ‘bads’.
Here I want to focus on a narrower technical dimension of his argument. Beck
posits that the ‘limit’ of private insurance marks the threshold of risk society.
The market economy’s internal dynamic, he argues, ‘reveals the boundary line
of what is tolerable with economic precision, through the refusal of private
insurance. Where the logic of private insurance disengages, where the
economic risks of insurance appear too large or too unpredictable to insurance
concerns, the boundary that separates ‘‘predictable’’ risks from uncontrollable
threats has obviously been breached’ (Beck, 1992b, p. 130). The uninsurable
risk, for Beck, is the autonomic signalling mechanism of risk society.
Beck’s ‘insurability thesis’ has been subject to deserved critique, first of all on
empirical grounds: private insurers do, in some cases, offer coverage for
catastrophe risk. This empirical qualification has conceptual implications. It
does not seem plausible to maintain that private insurance serves as the
boundary marker of risk society (Ericson & Doyle, 2004, p. 137); and we may
wonder more broadly whether epoch-marking terms like ‘risk society’ do the
analytical work required to trace contemporary shifts in risk and
security. These problems, I hold, can be traced to two conceptual slippages in
Beck’s work. First, in his arguments about insurance Beck seems to conflate the
limits of a specific knowledge form (what I have called the archival statistical
form of knowledge and assessment) with the limits of the insurential
mechanism of risk spreading. Second, Beck associates the limit of a specific style
of reasoning about risk with the limit of rational assessment in general.[12]
If these
slippages can be sorted out, we may be able to proceed with Beck’s categories,
understanding that they offer provocative orientations for inquiry rather than
an epochal diagnosis of the present.
First, the empirical question concerning the insurability of catastrophe risk:
discussions of this problem can be traced back to at least the 1960s and 1970s
when a combination of new economic formalizations and practical problems,
specifically mounting losses from natural disasters, drew the attention of
insurance professionals and policy-makers. Beginning in the 1970s, a specialist
literature of economists and insurance experts began to argue that catastrophe
risk could be managed by private insurance companies (Anderson, 1976; Jaffee
& Russell, 1997). Over time, tools for assessing such risks became increasingly
widespread, and have recently been extended to new kinds of catastrophe risks,
such as terrorism. This development has been examined in critical studies of
insurance that have asked how actuaries and underwriters respond to
catastrophe risks (Bougen, 2003; Ericson & Doyle, 2004a; O’Malley, 2003).
Ericson and Doyle (2004a), for example, have studied how the insurance
industry successfully reconfigured itself to provide terrorism risk insurance
following an initial retrenchment after 9/11. Faced with the uncertain risk of
terrorism, they show, underwriters do not proceed on the basis of archival
statistical knowledge and assessment. Rather, they draw on alternative forms of
risk assessment and mechanisms of risk spreading that are ‘assembled in
different ways’ (Ericson & Doyle, 2004a, p. 139). These include new approaches
to risk assessment, about which more below, as well as the introduction of
‘secondary’ mechanisms for distributing risk such as reinsurance, securitization
and government backstops (Bougen, 2003; Economist, 2007).
In showing how catastrophe risks are covered by insurers, critical studies of
insurance have refuted some elements of Beck’s position but seem to confirm
others. They refute his claim that the limit of archival statistical knowledge is
also the limit of the insurance mechanism of risk spreading. If we distinguish
these, such studies suggest, we see that the insurance mechanism can be
extended beyond the limits of archival statistical knowledge with which it has
been associated. But in their discussions of how insurance is extended beyond
this insurance limit these studies seem to confirm another key Beckian thesis:
that the limit of archival statistical knowledge, at least in insurance, is also the
limit of rational knowledge and calculated assessment. Insurers faced with
uncertain threats, critical scholars argue, must fall back on ‘subjective’ or ‘non-
scientific’ approaches to assessing catastrophe risk; they do not have access to
tools of rational assessment or quantitative calculation.[13]
For example, Ericson
and Doyle argue that ‘[i]n spite of claims about a ‘‘fully probabilistic
framework’’ and an understanding of ‘‘the true nature of risk,’’ terrorism
loss estimate models are heavily dependent upon the subjective opinions of
experts’ (2004a, p. 150). ‘Scientific data on risk’, they note, ‘are variously
absent, inadequate, controversial, contradictory, and ignored. Insurers impose
meaning on uncertainty through non-scientific forms of knowledge that are
intuitive, emotional, aesthetic, moral, and speculative’ (Ericson & Doyle, 2004a,
p. 138). Bougen (2003, p. 259), meanwhile, refers to the ‘highly idiosyncratic
methods’ that are used to assess uncertain risks. ‘The industry in dealing with
low probability events’, he notes, ‘has a particularly fragile connection to
statistical technologies. As regards natural catastrophes, reinsurers operate in a
calculative space invested by inescapable uncertainty’ (Bougen, 2003, p. 258).
No doubt, there is something to this. In situations where existing techniques
of assessment are destabilized, subjective judgement plays a significant role, as
these scholars have shown through extensive interviews. The problem is that
oppositions of scientific versus non-scientific, objective versus subjective or
idiosyncratic versus standard may obscure more than they reveal. They seem to
render unnecessary the conceptualization of alternative approaches to risk
assessment or the search for coherent genealogies through which these have
evolved and been recombined across multiple domains. As O’Malley puts this
point, it is necessary to ‘appreciate the value in recognizing the diversity of ...
techniques for governing the future, rather than conflating them into a binary
of ‘‘risk’’ and the ‘‘incalculable’’ other of estimation’ (O’Malley, 2003).
In this light, I would push the critique of Beck in a different direction, while
affirming many of his basic insights. In his arguments about insurance, Beck
has associated the crisis of a specific approach to knowing and assessing risk
(the statistical analysis of past events) with a crisis of economic or calculative
rationality in general. An alternative approach, following the old Weberian
argument, would be to ask not whether a given form of knowledge and
assessment is rational, but what form of rationalization is being employed
(Weber, 2002).[14]
From this perspective, one might argue that the archival
statistical model is only one approach to knowing and assessing uncertain
future events. There are, to borrow Hacking’s (1992) phrase, other ‘styles of
reasoning’ about risk, and one need not make any general judgement about
their rationality to acknowledge their systematicity, specificity and rigour or to
acknowledge that they may provide frameworks for quantitative, calculative
choice. What is more, as O’Malley points out, these forms ‘have long and
effective histories’ (2003, p. 277). The questions that ought to be asked, in this
light, are: what are these histories of alternative approaches to risk assessment?
And how are their elements taken up, reworked and redeployed by experts
facing diverse risks in heterogeneous domains?
The remainder of this article approaches these questions by examining the
‘long and effective history’ of one approach to knowing collective life and
assessing its risks: enactment. Various starting points for a genealogy of
enactment could be chosen: the development of war games and war
simulations (Collier & Lakoff, 2008a; Der Derian, 2003); the Dutch
development of flood models beginning in the 1920s (Bijker, 2002); the
formulation of earthquake loss models for insurance decisions in the 1930s
(Freeman, 1932); or, the case discussed presently, civil defence planning after
World War II. Given these various possible starting points, it should be clear
that in what follows I do not propose a definitive historical account of
enactment. Rather, I examine exemplary moments that illustrate how
techniques of enactment have been invented, redeployed and recombined in
response to new threats.
Civil defence and the enactment of nuclear attack
The episode with which I begin is situated in the early years of the Cold War,
in American civil defence. It is a nice case here because it describes a context in
which planners were beginning to think about what Beck identified as a
signature risk of ‘risk society’: nuclear war. Planners saw civil defence as
essential for limiting the damage of nuclear attack on American cities. They
also thought that civil defence would help ensure that the United States, if
attacked, could fight back. Later, this problematic of ‘second strike’ focused on
hardening and multiplying missile launch facilities.[15]
But immediately after
World War II, the assumption still ran that the continued functioning of cities
and industry after an enemy strike would be crucial to successfully prosecuting
a war. Civil defence authorities saw that, in the era of total war, the very
systems that had been developed to support modern urban life were now
sources of vulnerability.[16] Industrial plants, systems of transportation,
communication and urban hygiene, whose construction had been essential to
modern prosperity and security, were thus understood in a new light as
possible targets. This, too, was a quintessentially Beckian moment. The success
of modernization created new vulnerabilities and risks. The question was: how
could these vulnerabilities be systematically assessed? And how could they be
mitigated?
The answers to these questions were outlined in the technical and policy
documents that defined the US approach to civil defence in the early Cold
War. These documents laid out a new form of knowledge about collective life
and its risks. It was concerned not with the regularly occurring pathologies of
the social but with an uncertain future catastrophe: a nuclear attack (Collier &
Lakoff, 2008a). The basic technique for producing such knowledge was to ‘act
out’ a nuclear strike on an American city by juxtaposing various kinds of data
arranged on maps. The aim was to generate information about an attack’s
immediate impact, about the consequences for urban systems that would flow
from this initial impact and about the capacities that would be required for
response. Although the techniques in these documents were rudimentary, the
basic components of enactment were already present.
The discussion that follows describes enactment in civil defence planning by
examining a 1953 document entitled Civil Defense Urban Analysis that Lakoff
and I (2008a) have analysed in a more extensive discussion from which this
section draws.[17]
The purpose of Civil Defense Urban Analysis (CDUA) was to
instruct local officials on the procedures for conducting ‘urban analyses’ that
would serve as the basis for civil defence planning. The starting point for such
urban analysis was to catalogue forty-seven types of ‘urban features’ that, the
document explained, were significant for assessing the impact of an attack and
for planning post-attack response requirements. These features included
infrastructures (streets and highways, port facilities, the telephone system,
bridges, the water distribution system, the electric power system), spatial
characteristics of human settlement (patterns of land use, building density,
population distribution), organizations (industrial plants, police stations,
hospitals, zoos, penal institutions) and features of the physical environment
(underground openings such as caves and mines, topography and prevailing
winds) (CDUA, pp. 66-77). The resulting catalogue of urban elements comprised a
distinctive set of data about collective life that can be contrasted to the ‘archive’
of events used in archival statistical analysis. The components of the catalogue
were not past events (illnesses or accidents) but elements of a city that might be
relevant to a potentially catastrophic future event.
After cataloguing these urban features, the next step for local civil defence
planners was to engage in assessment through enactment. In early Cold War
civil defence, enactment meant physically juxtaposing maps to produce
knowledge about an uncertain future event. Here, I consider just one part of
this process, the estimation of initial blast impact, to indicate how this
mapping procedure allowed planners to model a nuclear blast and its impact
on a specific city. First, planners were to arrange the catalogued urban
elements on a map of the city to produce a diagram of the pre-attack event-
space. This diagram served as a kind of template over which a variety of
attacks could be acted out. Second, planners were to model a given possible
attack as it unfolded over this event-space by placing a transparent acetate
overlay with regularly spaced concentric circles on top of the event-space
diagram. The centre was placed at an ‘assumed aiming point’ so that each
concentric circle radiating out from this point demarcated a zone of common
blast intensity, thus mapping the distribution of blast intensity throughout
the city.
Once these two elements, the catalogue and the modelled event, were
juxtaposed, planners were instructed to introduce data about how a given
‘feature’ in the catalogue would be affected by a given level of blast intensity.
They did so by drawing on information about what later, in other contexts,
would be called the feature’s vulnerability: its susceptibility to damage from
a given event. In civil defence planning, this information about vulnerability
was drawn from various sources including atomic test data, engineering
analyses and studies of prior events, such as the bombings in Nagasaki and
Hiroshima. By combining vulnerability data with information about blast
intensity at a certain location, planners could estimate damage to specific
urban features. Data on vulnerability provided in CDUA were drawn from a
document called ‘The Effects of an Atomic Bomb Explosion on Structures
and Personnel’ prepared by the Defence Research Board of Ottawa, Canada.
These data allowed planners to classify damage to buildings in one of four
categories: (1) collapse or 100 per cent structural damage; (2) partial
structural damage; (3) heavy damage to window frames and doors and
moderate plaster damage; and (4) 100 per cent window breakage. A
reinforced concrete structure half a mile from ground zero, according to these
data, would be totally destroyed by a blast eight times the strength of
Hiroshima. A bomb the size of the Hiroshima blast, by contrast, would
damage only window frames and doors of a concrete structure, but such a
blast would totally destroy private homes up to two miles from ground zero.
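Read schematically, the mapping procedure juxtaposes three kinds of data: a location for each catalogued feature, a blast model expressed as concentric intensity zones around an assumed aiming point, and a vulnerability lookup keyed to structure type and blast intensity. The following minimal sketch, in Python, illustrates that logic; the feature names, ring radii and damage values are purely illustrative and are not drawn from CDUA.

```python
import math

# Hypothetical vulnerability table: damage category (1-4, as in CDUA) for a
# structure type in a given blast-intensity zone. Values are illustrative only.
VULNERABILITY = {
    ("reinforced_concrete", "zone_A"): 1,   # collapse / 100% structural damage
    ("reinforced_concrete", "zone_B"): 3,   # window frames, doors, plaster
    ("private_home", "zone_A"): 1,
    ("private_home", "zone_B"): 2,
    ("private_home", "zone_C"): 4,          # window breakage only
}

def intensity_zone(feature_xy, aiming_point, ring_radii_miles=(0.5, 1.0, 2.0)):
    """Assign a feature to a blast-intensity zone: the acetate overlay of
    regularly spaced concentric circles centred on the assumed aiming point."""
    dx = feature_xy[0] - aiming_point[0]
    dy = feature_xy[1] - aiming_point[1]
    distance = math.hypot(dx, dy)
    for zone, radius in zip(("zone_A", "zone_B", "zone_C"), ring_radii_miles):
        if distance <= radius:
            return zone
    return "outside"

def enact_attack(catalogue, aiming_point):
    """'Act out' one assumed attack over the catalogued urban features and
    return an estimated damage category for each feature."""
    damage = {}
    for name, (xy, structure_type) in catalogue.items():
        zone = intensity_zone(xy, aiming_point)
        damage[name] = VULNERABILITY.get((structure_type, zone), None)
    return damage

# A toy catalogue of 'urban features': (location in miles, structure type).
catalogue = {
    "city_hall": ((0.3, 0.2), "reinforced_concrete"),
    "housing_tract": ((1.4, 0.9), "private_home"),
}
print(enact_attack(catalogue, aiming_point=(0.0, 0.0)))
```

The same template could be re-run with different assumed aiming points or bomb yields, which is precisely what made the overlay technique useful for planning across a range of possible attacks.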
It is noteworthy that in using such ‘vulnerability’ data planners were
drawing on information about past events, but in a distinct way. In the
archival statistical model, a large number of past events are analysed to
determine an ‘average’ event or a normal distribution of events over a
population. Civil defence planners, by contrast, used data from past events
whether actual bombings or carefully constructed tests to understand an
uncertain future event that would be heterogeneous to the original. Nuclear tests
or the bombings in Hiroshima and Nagasaki, thus, were not treated as average
events. But they provided data about the vulnerability of structures that, when
combined through enactment with features of a future event, allowed planners
to understand how a similar detonation or a larger one would affect an
American city.
Civil defence planning contained, in preliminary form, the characteristic
elements found in later articulations of enactment: a hazard model that
indicated the spatial distribution of blast intensity; a catalogue, or what would
later be called an inventory, of elements at risk; and information about the
vulnerability of these elements. In civil defence, these elements were linked to a
specific mechanism of mitigation: preparedness planning. On the basis of
information derived from the mapping procedure, civil defence planners could
estimate required response capacities and mitigate vulnerabilities (see Table 3).
Developments in civil defence were to have longer-term significance in
various domains. For example, preparedness planning based on enactment
can be traced from civil defence to domestic emergency preparedness (Collier
& Lakoff, 2008a; Lakoff, 2007). The analysis here, however, traces a different
line of development. It follows the forms of knowledge and assessment in
civil defence as they were redeployed to respond to other problems and
linked to other mitigation mechanisms, first of all catastrophe insurance, the
problem that is at the centre of the ‘insurability’ debates.
Table 3 Enactment in civil defence

Data                      Model of a nuclear blast; catalogue of ‘significant’
                          urban elements; data about vulnerability of urban
                          elements
Technique of assessment   Enactment through the physical juxtaposition of maps
Mechanism of mitigation   Preparedness planning
Catastrophe insurance and hazard modelling
The development of catastrophe insurance was not related primarily to nuclear
war, although, improbable as it may seem, nuclear war insurance was offered
by some companies. Rather, it was related to natural hazards, particularly
hurricanes, floods and earthquakes. The evolution of insurance coverage for
natural hazards is a surprisingly recent development. Despite some early
efforts to develop actuarial frameworks for such events, attention to the
problem of natural hazards risk assessment was limited well into the post-
World War II period. Insurers either did not provide coverage for natural
hazard risk or folded it into general property insurance.[18]
Attention was turned to natural hazard risk with greater urgency in the
1960s and 1970s. The reasons for this increased attention deserve detailed
study, although some initial indications can be given. Insurers were aware that
loss from natural hazards had increased markedly over the post-World War II
period (Anderson, 1976; United States. Task Force on Federal Flood Control
Policy, 1966). This increase was attributed to various factors. Among them was
urban and suburban development in areas such as coastal lowlands and
floodplains. By the late 1960s some observers were arguing that this
increasingly ‘risky’ pattern of development could be traced, at least in part,
to ‘successful’ modern mechanisms of security. Generous federal benefits
(beginning with the 1950 Federal Relief Act) and flood control works that
diminished the frequency of floods, but that did not necessarily prevent more
infrequent, catastrophic floods, encouraged development in areas exposed to
natural hazard risks (Kunreuther, 1968, 1973; United States. Task Force on
Federal Flood Control Policy, 1966). Here, again, was a paradigmatic dynamic
of risk society: the risk of catastrophic loss was made more acute by the very
success of modern mechanisms of security. Against this background, it can be
surmised that the problem of natural hazard risk assessment received greater
attention from private insurers in the 1960s and 1970s for at least two reasons:
first, because insurers anticipated a growing market for natural hazard
insurance as losses grew; second, because many policy analysts argued that
private insurance would better ‘price’ natural hazard risks, and federal
legislation (particularly the 1968 Federal flood insurance programme)
encouraged private insurers to offer natural hazard coverage.[19]
In response, various approaches to modelling natural hazards and to
assessing catastrophe risk began to emerge. Here, I will examine one
exemplary figure in this development, Don M. Friedman.[20]
Friedman was an
employee of Travelers’ Insurance in Hartford, Connecticut, who served as
director of that company’s Natural Hazard Research Program. The trajectory
of Friedman’s work is notable for the broader genealogy of enactment. In the
late 1960s, Friedman had produced a report on computer simulation of the
earthquake hazard for the Office of Emergency Preparedness (OEP). OEP
was one of a long line of successor agencies to the 1950s Federal Civil
Defense Administration. It was also a point of passage for many practices of
emergency management, preparedness and risk assessment as they migrated
from civil defence to other areas of domestic preparedness, disaster
modelling and vulnerability mitigation (Collier & Lakoff, 2008b; Lakoff,
forthcoming).[21]
As we will see, Friedman’s approach closely resembled the form of
assessment through enactment that had been developed in civil defence. The
same basic elements were employed: a hazard model, vulnerability data and an
inventory. But Friedman’s approach was different in at least two significant
ways. First, he linked enactment to a different mechanism of mitigation:
insurance rather than preparedness. Second, Friedman was one important
figure in applying advances in computer modelling to natural hazard risk
assessment, both by drawing on sophisticated, computerized vulnerability
assessments and hazard models[22] and by ‘acting out’ uncertain threats using
spatially tagged data sets that could be analysed through computers.
Friedman described his approach in a 1984 article entitled ‘Natural hazard
risk assessment for an insurance program’, which reflected work conducted
over the previous two decades. As Friedman presented the issue in 1984, natural
hazards presented multiple challenges to the traditional tools of actuarial risk
assessment. The traditional approach was based on an analysis of what
Friedman called ‘past loss experience’ or, in the terms I have been using
here, an ‘archive’ of past events that could be analysed statistically. The
question was: what universe of ‘past experience’ could be analysed to assess
probable loss from natural disasters? Here, Friedman reasoned, insurers faced
a dilemma. If only recent events were considered, then loss estimates might
be skewed by the small size of their sample. ‘Loss experience measured over
a short period of years,’ he noted, ‘can be highly biased by occurrence or
non-occurrence of a geophysical event during the period’ (Friedman, 1984, p.
70). But a longer time series introduced its own problems. Patterns of
settlement, techniques of construction and the cost of replacing damaged
property would change dramatically over time.[23]
Therefore, past loss
experience might provide little useful information about future loss
experience.[24]
Natural hazard risk assessment, Friedman noted, posed a further problem
not easily managed by the traditional actuarial approach: that of
distributing risk. In order to make decisions about premiums for a given
policy, actuaries have to estimate the ‘expected annual loss’ from natural
hazards to each insured property. But because natural hazards might affect a
large number of insured properties simultaneously, it was also necessary to
assess what Friedman called the ‘catastrophe producing potential of
individual geophysical events’, that is, the potential of such events to affect
many insured properties simultaneously and, thus, to threaten the solvency
of an insurance company. This problem was not new to insurance. Such
‘portfolio risks’ were long recognized in the area of fire insurance. By the
early nineteenth century so-called ‘pin maps’ were being used to assess
insurers’ exposure to damage from fires that might damage many spatially
proximate insured properties in a single event.[25]
But such models had fallen
out of use, and they were inadequate, in any case, for the much larger and
more complex patterns of damage produced by hurricanes, earthquakes or
floods (Mathewson, 2001).
Friedman concluded that these features of natural hazard risks rendered
traditional methods of risk assessment inadequate. The ‘unique features of
infrequent occurrence and the tendency to cause many losses when
[catastrophic events] occur’, he argued, ‘make it desirable to find a
supplementary means of estimating present or future risk rather than
depending solely upon the traditional method of evaluating the magnitude
of a hazard using past loss experience’ (Friedman, 1984, p. 66). Friedman’s
point was not that the experience of past events could tell insurers nothing;
data about the frequency and severity of past disasters, he held, could provide a
useful guide to future events.[26]
But given changes in building materials,
patterns of settlement, disaster mitigation measures and so on, past experience
of physical damage and monetary loss could not provide a guide to future loss.
The question to be asked of past events, Friedman argued, was not how much
damage they caused when they occurred. Rather, the question was what the cost
of past events would be if they happened in the present. ‘What is needed,’
Friedman explained, ‘is not [information about] actual damages that occurred
as a result of a past geophysical event, but an estimate of potential damage
production to the present distribution of properties from a recurrence of the
past event’ (Friedman, 1984, p. 64).[27]
The approach Friedman proposed was a form of enactment that echoed
nuclear attack modelling in early Cold War civil defence. Like civil defence
modelling, it juxtaposed data from heterogeneous sources to ‘act out’ a future
event. But Friedman drew on new applications of mathematical models and
computer databases developed by civil defence planners over the 1950s and
1960s. Using them, he was able to model ‘geophysical events, and attendant
severity patterns, that interact mathematically with a given array of properties
to produce synthetic loss experience’ (Friedman, 1984, p. 70).
In ‘Natural hazard risk assessment’ Friedman outlined four components of
natural hazard risk models, which closely paralleled the different dimensions
of enactment in civil defence: (1) the geographic pattern of severity; (2) local
conditions that might affect the severity of an event at a given location; (3)
the ‘elements at risk’; and (4) the ‘vulnerability’ of these elements (Friedman,
1984, p. 72). Here, I shall not consider the formal expression of these various
components. Something can be said, however, about how they were
assembled from heterogeneous sources of data. The first and second, the
geographic pattern of severity and local conditions, comprised the ‘hazard
model’. The ‘geographic pattern of severity’ of the event itself referred to the
spatial distribution of whatever feature of a hazard caused damage (water
depth for floods, shaking for earthquakes, wind and flooding for hurricanes).
Local conditions that might influence an event’s severity included, in the
case of a hurricane, mountains, hills or valleys that would affect wind
speed, the shape of the shoreline, the depth of offshore waters and any ‘man-
made constraints’ such as levees that would affect patterns of flooding
(Friedman, 1984, p. 72). By combining severity patterns with local conditions
it was possible to model how an event would unfold over a specific landscape.
The third component of the model, what Friedman called ‘elements at
risk’, referred to properties or structures that might be damaged by a
natural hazard. These elements at risk were similar to the ‘urban features’
catalogued in civil defence planning. For an insurer, however, the elements of
central concern were those whose damage produced a liability, that is,
insured properties. As in civil defence, the task was not only to identify these
elements but to arrange them spatially, creating what Friedman called a
‘geographic inventory’. This was accomplished by assigning spatial tags to
entries for each element in a computer. As Friedman explained, ‘the detailed
geographical distribution of over two-hundred million persons and fifty
million single family dwellings has been put into [a] grid system’ (Friedman,
1984, p. 78). The input process ‘included the use of United States Census
Data, large-scale maps, grid overlays, and an index to 75,000 towns and
cities’ (Friedman, 1984, p. 78). The result was a computerized database of
the spatial pattern of settlement in the United States that could be correlated
with the distribution of severity patterns determined by the hazard model.
The fourth and final element to be included in the model was
‘vulnerability,’ which referred, as noted above, to the susceptibility of a
given structure to damage. Friedman’s hurricane model employed a
simplified statistical approach to damage estimates that examined past events
to establish a correlation between the maximum wind speed of a hurricane
and the distribution of damage to structures in a certain area (a relationship
that was found to be non-linear).[28]
‘Vulnerability’ of elements at risk could
then be expressed as a percentage of the insured value of a property that was
expected to be lost in an event of a given magnitude.
The final step in the modelling process was the enactment itself. If in civil
defence enactment was performed by physically overlaying maps, then
Friedman ‘acted out’ the event by combining the elements noted above in a
computer model. ‘Each of these areas’, Friedman noted, ‘is addressed in
computer storage so that building characteristics such as number, type, value,
usage, degree of exposure, and vulnerability can be stored at each grid address’
(Friedman, 1984, p. 78). The interaction of these elements with the event
could be simulated, resulting in an estimation of ‘loss experience’. Friedman
summarized the approach in a diagram reproduced here as Figure 1.
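The computational core of such a model can be suggested in a brief sketch: assume a hazard function that returns a local severity for each address in the geographic inventory and a vulnerability function that converts severity into a damage ratio; the 'synthetic loss experience' is then the sum, across the inventory, of insured value multiplied by damage ratio. The code below is a schematic reconstruction of that logic, not Friedman's actual model; the severity decay and the non-linear damage curve are invented stand-ins.

```python
from dataclasses import dataclass

@dataclass
class GridCell:
    """One address in the geographic inventory: insured value exposed there."""
    x: float            # grid coordinates (arbitrary units)
    y: float
    insured_value: float

def severity_at(cell, event):
    """Hazard-model stand-in: local severity (e.g. peak wind speed) at a cell,
    decaying with distance from the event's track. Purely illustrative."""
    distance = abs(cell.x - event["track_x"])
    return max(event["peak_severity"] - event["decay_per_unit"] * distance, 0.0)

def damage_ratio(severity):
    """Vulnerability stand-in: fraction of insured value expected to be lost,
    a non-linear function of severity (as Friedman found for wind damage)."""
    if severity <= 50.0:
        return 0.0
    return min(((severity - 50.0) / 100.0) ** 2, 1.0)

def synthetic_loss_experience(inventory, event):
    """Enact the event over the geographic inventory and total the losses."""
    return sum(cell.insured_value * damage_ratio(severity_at(cell, event))
               for cell in inventory)

inventory = [GridCell(0.0, 0.0, 2_000_000), GridCell(3.0, 1.0, 1_500_000)]
event = {"track_x": 0.5, "peak_severity": 140.0, "decay_per_unit": 10.0}
print(synthetic_loss_experience(inventory, event))
```

Run over many simulated events weighted by estimated frequencies, a calculation of this kind yields the loss distribution from which premiums and portfolio exposure can be assessed.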
Despite its increasing technical sophistication, Friedman’s approach to
natural hazard risk assessment did not, of course, render uncertain risks certain,
if what is meant by ‘certainty’ is the predictive power of a robust archival
statistical analysis. But it did offer a basis for calculative decision-making, one
that combined complex data about collective life with increasingly sophisti-
cated simulations of natural hazards in an enacted future. The ‘loss experience’
produced by the model provided insurers with information that could be used
to make calculative decisions about premiums in different areas and about the
management of portfolio risks.[29]
The problem of natural hazard loss estimates, in sum, constituted a specific
site of re-problematization. Experts like Friedman, faced with a new class of
events, recognized that existing forms of knowledge and assessment were
inadequate. In response, they redeployed forms of knowledge and assessment
developed in one domain (that of civil defence) and recombined them with
mitigation mechanisms from another domain (that of social insurance). This
recombinatorial process is portrayed in Figure 2.
Figure 1 ‘Interaction required in the determination of catastrophe potential’
(Friedman, 1984, p. 71). [Diagram: the natural hazard event, the geographical
array of elements-at-risk and their vulnerability interact to produce the loss
experience for an insurance programme.]
Figure 2 Recombinations I: catastrophe insurance. [Diagram: social insurance
(data: archive of past events; technique of assessment: statistical; mechanism
of mitigation: risk spreading through insurance) and civil defense (data:
‘catalogue’ of urban elements, blast models, vulnerability data; technique of
assessment: enactment through physical juxtaposition of maps; mechanism of
mitigation: preparedness) recombine into catastrophe insurance (data:
geographic inventory, hazard model, vulnerability data; technique of
assessment: enactment through computer models; mechanism of mitigation:
risk spreading through insurance).]
From catastrophe insurance to risk-based budgeting
Despite substantial technical development in natural hazard risk modelling
from the 1960s through the 1980s, it was not until the late 1980s and early
1990s that broad interest in this new assessment tool grew among insurers
(Grossi & Kunreuther, 2005; Grossi et al., 2005; Mathewson, 2001). The field
was catalysed by a series of major disasters in the US, including Hurricane
Hugo (1989), the Loma Prieta Earthquake (1989) and Hurricane Andrew
(1992). These events were major shocks to the insurance industry. Andrew left
nine insurance companies insolvent and raised again old questions about
exposure concentrations and portfolio risks. These disasters increased demand
for reinsurance services (Kozlowski & Mathewson, 1995, p. 84);[30] they also
catalysed interest in new forms of risk assessment on the part of insurers and
reinsurers who realized that ‘in order to remain in business ... they needed to
estimate and manage their natural hazard risk more precisely’ (Grossi et al.,
2005, p. 25; see also Bougen, 2003). The growing demand for natural hazard
risk assessment was initially met by three firms, all of which had been founded
in the late 1980s and early 1990s: AIR Worldwide (founded in 1987); Risk
Management Solutions (1988); and EQECAT (1994). Over the 1990s, these
modelling companies grew and catastrophe models ‘increased in number,
availability, and capability’ (Grossi et al., 2005, p. 25).
By the 2000s, the practices of catastrophe modelling were being system-
atized. A 2005 volume entitled Catastrophe Modeling: A New Approach to
Managing Risk, edited by Patricia Grossi and Howard Kunreuther and
developed in collaboration with the major commercial modelling firms, sought
to reflect increasingly stabilized practice in this area.[31]
The approach outlined
in the volume echoed the forms of assessment through enactment as they had
developed from civil defence to natural disaster modelling in catastrophe
insurance. Grossi and Kunreuther noted ‘four basic components of a
catastrophe model’ that are by now familiar: hazard, inventory, vulnerability
and loss (Grossi et al., 2005, p. 26).
The most important users of catastrophe models in the first decade of the
twenty-first century, the authors noted, were insurers and reinsurers. Capital
markets were also ‘eager users of this technology’ which they employed to
price catastrophe bonds (Kunreuther et al., 2005, p. 27; see also Bougen, 2003).
But in formalizing and generalizing the methodology of catastrophe modelling,
Kunreuther and Grossi addressed a broader range of possible users and
applications. Government agencies in particular, they noted, were showing
renewed interest in catastrophe modelling, initially (during the 1990s) in the
area of natural hazard modelling and increasingly, after 2001, in the area of
terrorism risk assessment.[32]
These new users and new applications have
shaped new redeployments and recombinations of enactment that are more or
less contemporary.
Here I examine one specific example of these contemporary developments: a
proposal for government use of terrorism risk models adapted from natural
hazard models of the type just discussed. This example concerns a problem
that has been hotly contested in US domestic security discussions: the
rationalization of Homeland Security grants to US states (Lakoff et al., 2007).
In particular, I examine one proposal for such rationalization, formulated by
the Center for Terrorism Risk Management Policy (CTRMP), a unit of the
RAND Corporation founded specifically to study terrorism risk insurance.
The proposal, laid out in a 2005 report entitled ‘Estimating Terrorism Risk’,
focuses on risk assessment in one specific federal programme, the Urban Areas
Security Initiative (UASI). This initiative, note the authors of the CTRMP
report, is intended to ‘enhance security and overall preparedness to prevent,
respond to, and recover from acts of terrorism’ by providing ‘financial
assistance to address unique planning, equipment, training, and exercise needs
of large urban areas’ (Willis et al., 2005, p. vii). There is agreement among
various stakeholders, they continue, that the distribution of this financial
assistance should reflect ‘the magnitude of risks to which different areas are
exposed’ (ibid., p. vii). Therefore, a funding formula should be used that
would weight distributions according to the risk faced by each state. The
suggestion that federal budgetary distribution should be based on a formula is
unexceptional. There is a substantial tradition of technocratic thought and
practice in the area of so-called ‘formula-based financing’ that concerns
precisely the problem of rationalizing the distribution of funds based on some
definition of ‘need’ (Collier, 2007). When federal grants are in areas such as
health care, housing, education or poverty alleviation, ‘need’ can be defined on
the basis of archival statistical data about collective life rates of poverty, local
levels of economic activity, numbers of school-age children or their
performance on aptitude tests. From such measures of ‘need’ a coefficient
can be derived that determines what portion of financing for a particular
programme should be allocated to a given sub-national government (Collier,
2005).
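The arithmetic of formula-based financing is simple: each jurisdiction's coefficient is its measured 'need' divided by total need, and its allocation is that coefficient applied to the programme budget. A schematic example in Python, with invented figures:

```python
def formula_allocation(total_budget, need_by_state):
    """Allocate a programme budget in proportion to each state's measured 'need'."""
    total_need = sum(need_by_state.values())
    return {state: total_budget * need / total_need
            for state, need in need_by_state.items()}

# Invented archival statistical measure of need (e.g. number of school-age
# children, or persons below the poverty line) for three hypothetical states.
need = {"State A": 120_000, "State B": 60_000, "State C": 20_000}
print(formula_allocation(1_000_000, need))
# State A receives 60%, State B 30%, State C 10% of the programme's funds.
```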
In the Urban Areas Security Initiative the relevant definition of ‘need’ is the
exposure of a given state to terror risks. The problem, as the authors of the
CTRMP report noted, is that there is no consensus about how the magnitude
of risk might be assessed (Willis et al., 2005, p. vii). Again, archival statistical
knowledge proved inadequate to a new class of threat. The CTRMP report
proposed two possible approaches to assessing the risk in a given state and,
thus, the proportion of total funds it should be awarded. The first would
employ density-weighted population as a proxy for terrorism risk, based on the
assumption that attacks were more likely in densely settled urban areas. The
second, recommended, option was what the authors referred to as an ‘event-
based approach’ to risk assessment that would draw on ‘models of specific
threat scenarios, calculations of economic and human life consequences of each
scenario, and assessments of the relative probability of different types of attacks
on different targets’ (ibid., p. 25). The specific event-based approach that the
report’s authors proposed to employ was adapted from a terrorism risk model
developed for insurance purposes by Risk Management Solutions (RMS), one
of the catastrophe modelling firms founded in the early 1990s (Ericson &
Doyle, 2004a). A double redeployment was under way: first, natural hazard
risk models had been applied to terrorism risk modelling for insurance
purposes; second, CTRMP proposed to use these loss models for budgetary
rationalization (see Figure 3). It is worth saying a few words about these steps
in turn.
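Under the event-based approach, the 'need' coefficient would itself be produced by enactment: for each urban area, consequences are estimated for a set of modelled attack scenarios, weighted by elicited relative probabilities, and the resulting risk estimates are converted into shares of the grant pool. The sketch below illustrates this composition under stated assumptions; the scenarios, probabilities and consequence figures are invented and are not taken from the RMS model or the CTRMP report.

```python
def expected_loss(scenarios):
    """Event-based risk for one urban area: probability-weighted consequences
    summed over modelled attack scenarios."""
    return sum(s["relative_probability"] * s["consequence_usd"] for s in scenarios)

# Invented scenario sets for two hypothetical urban areas.
scenarios_by_area = {
    "Area 1": [
        {"relative_probability": 0.02, "consequence_usd": 5_000_000_000},
        {"relative_probability": 0.10, "consequence_usd": 200_000_000},
    ],
    "Area 2": [
        {"relative_probability": 0.01, "consequence_usd": 1_000_000_000},
    ],
}

risk = {area: expected_loss(s) for area, s in scenarios_by_area.items()}
total = sum(risk.values())
shares = {area: r / total for area, r in risk.items()}
print(shares)   # each area's proportion of the grant pool
```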
The entry of the major catastrophe modelling firms into terrorism risk
assessment began in earnest after 9/11. As with natural hazard modelling in
the 1960s and 1970s, this development had to do both with the needs of
private insurance companies and with public regulation. Prior to 9/11,
terrorism insurance seems to have been handled in much the same way that
natural hazard insurance was approached in the 1960s. Insurance companies
either did not provide terrorism coverage in their policies or included it in
blanket property insurance (Kunreuther et al., 2005, p. 210). In the initial
aftermath of 9/11, insurance companies withdrew from the provision of
terrorism insurance, taking a precautionary approach in the face of what they
considered to be excessive uncertainty (Ericson & Doyle, 2004a; Grossi et al.,
2005). But the hand of insurance companies was forced by the passage of the
2002 Terrorism Risk Insurance Act (updated by the 2005 TRIEA), which
required insurers to offer terrorism insurance (Kunreuther et al., 2005, pp.
216-17).
As Kunreuther et al. (2005) point out, terrorism risk models have the same
structure as other catastrophe models. They include a ‘hazard model’, an
inventory of elements at risk, an assessment of the vulnerability of these
elements and a loss estimate. Here I will focus in particular on how two
elements of these risk models, the hazard model and the vulnerability
assessment, are adapted to terrorism risk modelling of the type developed
by RMS and taken up in the CTRMP proposal.
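For readers unfamiliar with this architecture, the following sketch illustrates the four-module structure in schematic form. It is a toy illustration under invented assumptions, not a reconstruction of the RMS model or of the methodology in Kunreuther et al. (2005): the scenarios, probabilities, asset values and damage function are all hypothetical.

    # Schematic of the four-module catastrophe model structure described above:
    # hazard, inventory, vulnerability and loss. All values are hypothetical.

    # Hazard module: events with annual occurrence probabilities and intensities.
    hazard_scenarios = [
        {"name": "scenario_1", "annual_probability": 0.002, "intensity": 0.8},
        {"name": "scenario_2", "annual_probability": 0.010, "intensity": 0.3},
    ]

    # Inventory module: elements at risk and their insured values.
    inventory = [
        {"asset": "office_tower", "value": 500_000_000},
        {"asset": "transit_hub", "value": 200_000_000},
    ]

    def vulnerability(intensity):
        # Vulnerability module: fraction of value lost at a given intensity
        # (a hypothetical damage function).
        return min(1.0, 0.6 * intensity)

    def expected_annual_loss():
        # Loss module: probability-weighted loss across scenarios and assets.
        loss = 0.0
        for scenario in hazard_scenarios:
            damage_ratio = vulnerability(scenario["intensity"])
            event_loss = sum(asset["value"] * damage_ratio for asset in inventory)
            loss += scenario["annual_probability"] * event_loss
        return loss

    print(f"Expected annual loss: ${expected_annual_loss():,.0f}")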
[Figure 3 shows three configurations side by side.
Budgetary Rationalization. Mechanism of mitigation: formula-based budgeting. Technique of assessment: statistical. Data: population data.
Catastrophe Insurance. Mechanism of mitigation: insurance. Technique of assessment: enactment through computer models. Data: geographic inventory; hazard model; vulnerability data.
CTRMP Proposal for UASI. Mechanism of mitigation: formula-based budgeting. Technique of assessment: enactment through computer models. Data: geographic inventory; scenario-based hazard model; vulnerability data from tests, engineering assessments, etc.]
Figure 3 Recombinations II: terrorism risk assessment
‘The greatest source of uncertainty’ in a terrorism risk model, the
CTRMP report notes, ‘derives from estimates of threat, which concern
terrorists’ goals, motives, and capabilities’ (Willis et al., 2005, p. 14).
Terrorism risk amplifies the assessment problems presented by natural
hazard risk: even less is known about the likely characteristics of events, and
even less information from past events is available (Kunreuther et al., 2005).
Therefore, terrorism risk models like that produced by RMS depend on a
distinct method for generating a hazard model, namely the use of imaginative
scenarios. The initial selection of attack scenarios and the determination of
frequency estimates are based on ‘elicitation’ of expert understanding. As
Ericson and Doyle (2004a, p. 150) point out, this technique of ‘elicitation’ is
‘subjective’ in the sense that it draws on expert opinion, not on statistical or
quantitative measures. But the methodology of such expert ‘elicitation’ is
systematized, and has a long history that can be traced to approaches such as
the Delphi Method, developed by RAND in the early years of the Cold War
(Kunreuther et al., 2005).
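A toy illustration of what such systematized elicitation might look like computationally is given below. It does not reproduce the Delphi Method or the RMS procedure; the experts, scenarios and frequency figures are hypothetical, and the aggregation rule (taking the median and normalizing) is simply one plausible convention.

    # Toy illustration of combining elicited expert judgements into scenario
    # frequency estimates. Not the Delphi Method or the RMS procedure; all
    # experts, scenarios and numbers are hypothetical.

    from statistics import median

    # Each list holds one expert's elicited annual frequency for a scenario.
    elicited_frequencies = {
        "attack_scenario_A": [0.010, 0.020, 0.005, 0.015],
        "attack_scenario_B": [0.001, 0.002, 0.001, 0.004],
    }

    # Take a robust central estimate per scenario, then normalize across
    # scenarios to express relative likelihoods of different attack types.
    central = {s: median(vals) for s, vals in elicited_frequencies.items()}
    total = sum(central.values())

    for scenario, freq in central.items():
        print(f"{scenario}: elicited frequency {freq:.4f}, "
              f"relative likelihood {freq / total:.2f}")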
If the hazard module of terrorism risk assessments is based on methods
that are formalized but 'subjective', then the 'vulnerability' module in
terrorism risk modelling is 'subject to lower levels of uncertainty'.³³ The
authors of ‘Estimating Terrorism Risk’ note that estimating vulnerability
often takes the form of ‘straightforward engineering and statistical problems’
that can be approached through existing and well-established methodologies.
Terrorism vulnerability assessments can draw on models of natural disasters
'that are directly applicable, or nearly so' (Willis et al., 2005, p. 15). They can
also use ‘methods of engineering risk analysis that have been used
successfully in estimating risks of space flight and operating nuclear
reactors’, as well as on military damage assessments: ‘[T]he military and
other government agencies’, the authors note, ‘have long studied the effects
of weapons on people and structures and this, too, is useful for estimating
consequences’ (Willis et al., 2005, p. 15).
34
It is in the final 'output', the loss estimate, that a model of risk
assessment for government budgetary decisions differs most fundamentally
from an insurance loss estimation model. For government decision-makers,
the question does not concern the dollar value of insured loss and, thus, the
appropriate premiums and risk exposures. Rather, it concerns the total loss
inflicted by an event, in life and dollars. CTRMP, in this light, proposed
'average annual loss' in a given state as a metric to guide government
decision-makers. Average annual loss was, of course, a basic metric in
insurance loss models, used to determine premiums and to manage
catastrophic loss exposures. In the CTRMP proposal, by contrast, average
annual loss would be used to derive a coefficient that expressed the proportion
of total annual loss expected in the US that was accounted for by a given
state. This coefficient, finally, could be plugged into a formula for the ‘risk-
based’ distribution of homeland security funds. Enactment, in this way,
would be linked to budgetary rationalization.
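The step from a modelled loss estimate to a budgetary coefficient can be made concrete with a brief sketch. In contrast to the need-based formula sketched earlier, the coefficient here is derived from modelled average annual loss rather than archival statistical indicators; the loss figures, jurisdictions and budget total are hypothetical and are not taken from the CTRMP report.

    # Sketch of converting modelled 'average annual loss' into a coefficient
    # for risk-based distribution of funds, in the spirit of the CTRMP
    # proposal. Budget and loss figures are hypothetical.

    uasi_budget = 800_000_000  # hypothetical total funds to distribute

    # Hypothetical modelled average annual loss (in dollars) per jurisdiction.
    average_annual_loss = {
        "Jurisdiction A": 120_000_000,
        "Jurisdiction B": 45_000_000,
        "Jurisdiction C": 15_000_000,
    }

    total_expected_loss = sum(average_annual_loss.values())

    # Each jurisdiction's coefficient is its share of total expected annual
    # loss; funds are distributed in proportion to that coefficient.
    for jurisdiction, aal in average_annual_loss.items():
        coefficient = aal / total_expected_loss
        print(f"{jurisdiction}: coefficient {coefficient:.3f}, "
              f"allocation ${uasi_budget * coefficient:,.0f}")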
Conclusion: enactment, security and the politics of truth
Surveying the broad field of these developments, we clearly cannot conclude
that enactment has stabilized in apparatuses of security. The CTRMP proposal
has not been adopted in US domestic security funding, which continues to be a
highly politicized and disputed problem (Lakoff et al., 2007). Terrorism risk
modelling, for its part, remains in the early stages of development, and existing
methodologies are not widely trusted by insurers (Kunreuther et al., 2005).
Natural disaster risk assessment techniques based on enactment are
increasingly accepted, and in other areas, such as emergency preparedness, enactment
is accepted and well institutionalized. But even there we find various
knowledge forms and assessment techniques vying for legitimacy and
institutional force; enactment is only one among these.³⁵
Where does this leave us in relation to risk society and Beck's
insurability thesis? As noted above, Beck has been criticized for his epochal
claim that we have moved from a period of industrial modernity and
governable risks to a risk society characterized by ungovernable risks.
Acknowledging Beck’s recent efforts to qualify and clarify his position, at
least two analytical problems with the overdrawing of the distinction between
industrial modernity and risk society are particularly relevant in light of the
preceding discussion. The first is that it tends to obscure processes of
transformation that take the shape of partial mutations, sectoral redeployments
and recombinations of existing elements. For example, the recombination of
the insurance risk spreading mechanism with the enactment mode of
knowledge and assessment seems difficult to analyse in relation to risk society.
Is it an example of risk society? An exception to it? The second problem is that,
as O’Malley has pointed out, an insistence on the ‘ungovernability’ of the risks
that characterize risk society can divert attention from new forms of knowledge
and assessment.³⁶
As I suggest above, it is crucial to draw a distinction between
the limit of archival statistical knowledge and the limit of rational assessment
in general, and to examine alternative forms of knowledge and assessment with
their own specificity, systematicity and institutional legitimacy.
All that said, this article has suggested that one may nonetheless find in
Beck’s core concepts an essential guide to contemporary re-problematizations
of risk, rationality and security, one that helps us think through the genealogy
of enactment. Across multiple sectors, and over many decades, enactment was
invented and then redeployed in response to problematic situations in which
the archival statistical model proved inadequate to new problems. One can
identify common features of these problems and common responses across
multiple domains without suggesting that they are local manifestations of the
broader logic of risk society.
Part of the challenge, in this light, is to weigh the significance of enactment
as something more than an idiosyncratic and subjective response to situations
in which rational forms of assessment cannot be employed and something less
than the sign of a structural shift. In part, this is a question of tracing the
trajectory of its technical application and institutional acceptance, a task that I
have tried to begin here. But there is another question to be asked, one that
concerns the quintessentially political question of collective security: how
might enactment relate to contemporary transformations of the security pact?
Here, it is helpful to keep in view the development of social insurance.
Techniques of risk spreading taken from other domains, in particular from
long-distance shipping and institutions of mutual aid, were first applied to
pathologies of the social in specific sectors, and in response to specific
problems such as physical incapacity or old age. Over time, these became
more general in their application, covering more domains and a greater share
of national populations. In this process, these knowledge forms emerged as
crucial elements in the modern security pact: technical concepts of social risk
became essential to the definition of political citizenship; the state’s side of
the modern social contract was fulfilled, in part, through insurance (Ewald,
2002). For enactment, we can see the beginning, if not the end, of a parallel
development. Enactment is also a form of knowledge about collective life,
about the vulnerabilities and risks of individuals and groups. But it differs
fundamentally from the archival statistical knowledge of social insurance. It
comes to ‘know’ collective life not through the regular processes of
population or society, but through the uncertain interaction of potential
catastrophes with the existing elements of collective life. I have shown how,
over the last decades, enactment has emerged in multiple domains as a
technical means to relate the present to uncertain future events when the
archival statistical form has encountered a certain limit. A crucial problem
now is how enactment may shape the processes through which collective life
is reflected in political arrangements, a question concerning less rationality
per se and more what Foucault (2007) called the politics of truth.
Acknowledgements
I acknowledge the comments and suggestions of Carlo Caduff, Lyle Fearnley,
Andrew Lakoff, Turo-Kimmo Lehtonen, Paul Rabinow, Antti Silvast,
Grahame Thompson and two anonymous reviewers.
Notes
1 This article is part of a broader collaborative project with Andrew Lakoff on vital
systems security and it draws many concepts and questions from our work together.
2 My choice of both terms, 'archival statistical' and 'enactment', is deliberate. I
refer to 'archival statistical' rather than, for example, 'probabilistic' or 'risk-based'
knowledge because both of the latter may refer to various forms of knowledge and
assessment; they are not confined to archival statistical forms. Indeed, both
'probabilistic' and 'risk' are important concepts in discussions of enactment-based
knowledge. I refer to 'enactment' rather than some obvious alternatives, such as
modelling or simulation, for three reasons: first, because experts often refer to
‘models’ (hazard models, for example) as sub-components of an enactment; second,
because enactment may take forms such as exercises that do not seem adequately
described as simulations or models; third, because these terms (especially simulation)
already carry substantial associations in critical social science discussions, whereas
enactment does not.
3 For a discussion of modes of veridiction, see Rabinow and Bennett (2007).
4 This emphasis on ‘recombinations’ can be found in certain readings of Foucault, in
particular Rabinow’s (2003). This aspect of Foucault’s approach is particularly clear in
the newly published Security, Territory, Population (Foucault, 2007). Beck, too, has
recently suggested a strategy of tracing recombinations: ‘That which has existed up
until now is not simply replaced or dissolved and does not simply appear as a mere
residual leftover; instead, it combines with new elements in different forms’ (Beck &
Lau, 2005, p. 541).
5 Beck’s general theses have appeared in innumerable publications. For a review of
sub-themes that are particularly relevant to the present argument see Beck (1992a).
6 Ericson (2005) points out that the definition of ‘risk’ sometimes slips in this
literature. Here I use ‘risk’ in the most generic sense, as an expression of the likelihood
and consequence of a future event, whether or not it is derived using archival statistical
techniques (see also O’Malley, 2004).
7 It should be clear that in this discussion ‘social insurance’ does not mean only
insurance provided by the state but insurance associated with particular forms of
knowledge about collective life and concerned with events that are ‘pathologies of the
social’.
8 There are legitimate questions to be asked about the extent to which this model
faithfully describes insurance as it has been practised over the last hundred years. Much
has been written about variations in the institutional set-up of insurance in different
countries and about the role of ‘estimation’ as opposed to archival statistical calculation
in the history of insurance (O’Malley, 2000; Pfeffer, 1974). Questions might also be
raised about whether social insurance should be taken as the paradigm for the modern
security pact. Other apparatuses of social modernity such as welfare (O’Malley, 2004)
and infrastructure development (Collier, 2006) work on different forms of knowledge,
assessment and intervention.
9 O’Malley (2003) has noted that it is not clear what Beck means by ‘control’ in this
discussion, since insurance serves only to mitigate financial harm, not to ‘control’ the
event.
10 When Beck was first articulating his work on risk society, his examples of these
‘uninsurable risks’ focused on environmental or technological catastrophes, or genetic
modification of organisms (Beck, 1992b, p. 101). He has more recently applied his
argument to natural disasters and catastrophic terrorism (Beck, 2002).
11 Other criteria of ‘insurability’ are not met by catastrophic risks. See Jaffee and
Russell (1997).
12 As noted above (see note 4), Beck’s more recent formulations qualify these claims
in important ways, although the basic argument concerning the erosion of the boundary
marker of legitimate expertise still seems crucial to his work (Beck & Lau, 2005). My
argument here is that this is best investigated as an empirical question rather than a
structural logic that can be ‘demonstrated’ in domain after domain.
13 Ericson and Doyle also make the interesting point that insurers initially took a
precautionary approach to coverage after 9/11 in the face of great uncertainty (see also
Baker, 2002). As recent critics of the precautionary principle argue, however, precaution
does not necessarily relieve the need to estimate the risk of future events whose
likelihood or consequence is uncertain. One has, in other words, to estimate the risks (or
the costs) of precaution (Baker, 2002; Wiener, 2002).
14 It is not hard to understand why, from a Weberian perspective, insurance might be
exemplary. Because one had a large archive of information about past events, and
because these events could be measured in quantitative terms, very precise information
could be generated about the costs and benefits of a given course of action (Weber,
1978).
15 For a discussion of second strike and its relationship to new knowledge forms, see
Amadae (2003).
16 The link between total war and ‘vulnerability’ assessment is discussed in Collier
and Lakoff (2008b).
17 United States. Federal Civil Defense Administration (2002). Hereafter cited in the
text as CDUA.
18 Kozlowski and Mathewson (1995, p. 83) note that in the decades after World War
II ‘the U.S. was experiencing a period of low frequency and severity of natural
catastrophic events. Damaging hurricanes were scarce, especially in Florida, and a
major earthquake had not occurred since 1906. Modern fire fighting and construction
practices had minimized the threat of conflagration. As a result, the insurance industry
largely lost the discipline of measuring and managing exposures susceptible to
catastrophic loss.’ As a result, as Walker points out, ‘a blanket approach tended to be
adopted based on the perceived risk of occurrence of the hazard regionally or nationally
without respect for individual mitigating or extenuating circumstances’ (1999, p. 12).
19 The 1968 programme called for the federal government to set up an actuarial
framework, thus spurring the development of hazard models.
20 In a recent essay on the development of catastrophe risk modelling Michael Lewis
(2007) has traced the practice to Karen Clark. In fact, the history extends much farther
back than Lewis suggests, though Clark was a major figure. Among other things, Clark
articulated a critique of Friedman’s assumption that time-series data could be used to
estimate the frequency and severity of natural hazards. Clark (1986) argued that
deterministic hazard models had to be replaced with stochastic models that did not rely
on time-series data.
21 The history of OEP is the topic of a collaborative project that I am undertaking
with Andrew Lakoff, Brian Lindseth and Onur Ozgode.
22 The history of natural hazard models deserves more detailed study (but see
Kunreuther, 1978). In pointing to the improvement and increasing acceptance of these
models, I do not mean to suggest that they did not continue to be beset with profound
uncertainties and disputes over their design. In this respect see Ericson and Doyle
(2004b).
23 As Friedman put the point, ‘Property characteristics do not remain constant.
Number, type and geographical distribution [of insured properties] change rapidly with
time. Their susceptibility to damage also changes, because of time related modifications
in building design, materials and methods of construction, building codes, and
insurance coverages’ (Friedman, 1984, p. 70).
24 Friedman summed up the dilemma: ‘If the length of the sample period of past loss
experience is increased, the effect of changing property characteristics is amplified. On
the other hand, if the length of the sample period is decreased there is less chance of
getting a non-biased estimate of frequency and magnitude of the natural hazard’
(Friedman, 1984, p. 70).
25 Pin mapping was unlike the forms of enactment described here in that it did not
model events or assess the ‘vulnerability’ of insured structures. It sought only to
understand the spatial concentration of exposure, so that individual companies could
limit their liabilities to any given event (Mathewson, 2001).
26 Some contemporaries disagreed with Friedman on this point (see note 20).
27 ‘For example,’ he offered in illustration, ‘emphasis should not be placed on what
the 1906 San Francisco earthquake originally cost, but what it would cost if a
comparable earthquake occurred today and affected the present type, number,
vulnerability and value of current properties’ (Friedman, 1984, p. 64).
28 Other more sophisticated approaches to vulnerability that drew on computerized
simulations were being developed in materials engineering. According to Walker (1999),
Friedman played an important role in incorporating these approaches into risk models.
29 First, it could offer an estimate of the average annual loss expected of a specific
insured property and, thus, a basis upon which premiums for that property could be
determined. Second, the model could estimate the 'catastrophe producing potential' of
an event, that is, the potential for an event to produce concentrated claims that could
bankrupt an insurer.
30 ‘World-wide, there was a sudden shortage of reinsurance capacity to meet the new
perception of PMLs [Probable Maximum Losses] making it more difficult to obtain
reinsurance and leading to big increases in reinsurance rates' (Kozlowski &
Mathewson, 1995, p. 84; see also Walker, 1999, p. 16).
31 One of the volume’s co-authors, Howard Kunreuther, was an important figure in
the history of thinking about insurance and natural hazard modelling. Beginning in the
late 1960s, Kunreuther had written assessments of Federal disaster aid and of the new
Federal Flood Insurance programme (Kunreuther, 1968, 1973, 1974).
32 The Federal Emergency Management Agency (FEMA) initiated a project in the
early 1990s to create a standard loss estimation methodology for earthquakes, which
resulted in the open-source HAZUS model.
33 As the CTRMP group notes, ‘Because vulnerability concerns the likelihood that an
attack of a specific type and magnitude will be successful against a target, it concerns
matters that can, in principle, be carefully studied and for which rough estimates may
be reasonably good’ (Willis et al., 2005, p. 14).
34 For a similar analysis that emphasizes the assemblage of elements linked together
in these terrorism risk models, see Ericson and Doyle (2004a).
35 Howard Kunreuther recently made a similar point concerning the relative
acceptance of catastrophe modelling for natural disasters (relatively high) and terrorism
(low) at a conference at the New School University in New York (2 November 2007).
That said, disputes persist in the field of natural disaster risk assessment. He noted that
the major firms have responded very differently to recent catastrophes, some adjusting
their likelihood estimates using scenario-based projections of much more intense future
natural disasters and others adhering to an archival statistical approach.
36 'It is ... curious', O'Malley writes, 'that Beck takes no interest in the innovative
forms that such risk spreading (or other governing technologies) might take and how
they come to "work". ... Perhaps this is because his own thesis relies upon the image
of catastrophes as ungovernable' (2003, p. 276).
References
Amadae, S. M. (2003). Rationalizing
capitalist democracy: The Cold War
origins of rational choice liberalism.
Chicago, IL: University of Chicago
Press.
Anderson, D. R. (1976). All risks rating
within a catastrophe insurance system.
The Journal of Risk and Insurance, 43(4),
629-51.
Baker, T. (2002). Liability and insurance
after September 11: Embracing risk meets
the precautionary principle. University of
Connecticut Law School.
Beck, U. (1992a). Risk society: Towards a
new modernity. London: Sage.
Beck, U. (1992b). From industrial society
to the risk society: Questions of survival,
social structure and ecological enlight-
enment. Theory, Culture and Society, 9,
97-123.
Beck, U. (1999). World risk society. Cam-
bridge: Polity Press.
Beck, U. (2002). The terrorist threat:
World risk society revisited. Theory, Cul-
ture & Society, 19(4), 39-55.
Beck, U. & Lau, C. (2005). Second
modernity as a research agenda: Theore-
tical and empirical explorations in the
meta-change of modern society. The
British Journal of Sociology, 56, 525-57.
Bijker, W. E. (2002). The Oosterschelde
storm surge barrier: A test case for Dutch
water technology, management, and pol-
itics. Technology and Culture, 43(3), 569-84.
Bougen, P. (2003). Catastrophe risk.
Economy and Society, 32, 253-74.
Clark, K. (1986). A formal approach to
catastrophe risk assessment and manage-
ment. Casualty Actuarial Society, 73,
Part 2(140), 69-92.
Collier, S. J. (2005). Budgets and biopo-
litics. In S. J. Collier & A. Ong (Eds),
Global assemblages: Technology, politics,
and ethics as anthropological problems.
Malden, MA: Blackwell.
Collier, S. J. (2006). Infrastructure and
reflexive modernization. Paper presented
at Helsinki University.
Collier, S. J. (2007). Neoliberal reform and
public value: The travels of Buchanan’s
fiscal contractarianism. Paper presented at
meeting of the American Association of
Geographers, San Francisco.
Collier, S. J. & Lakoff, A. (2008a).
Distributed preparedness: Space, security
and citizenship in the United States.
Environment and Planning D: Society and
Space, 26(1).
Collier, S. J. & Lakoff, A. (2008b). How
infrastructure became a security problem.
In M. Dunn & K. S. Kristensen (Eds),
Securing ‘the homeland’: Critical infra-
structure, risk and (in)security. London:
Routledge.
Der Derian, J. (2003). War as game.
Brown Journal of World Affairs, 10(1), 37-48.
The Economist. (2007). Come rain or
shine: Hedge funds find a new way to
profit by taking on the weather gods. 8
February.
Ericson, R. (2005). Governing through
risk and uncertainty. Economy and Society,
34, 659-72.
Ericson, R. & Doyle, A. (2004a). Cata-
strophe risk, insurance and terrorism.
Economy and Society, 33, 135-73.
Ericson, R. & Doyle, A. (2004b). Un-
certain business: Risk, insurance and the
limits of knowledge. Toronto: University of
Toronto Press.
Ewald, F. (1991). Insurance and risk. In
G. Burchell, C. Gordon & P. Miller (Eds),
The Foucault effect: Studies in governmen-
tality. London: Harvester Wheatsheaf.
Ewald, F. (2002). The return of Des-
cartes’ malicious demon: An outline of a
philosophy of precaution. In T. Baker
(Ed.), Embracing risk: The changing culture
of insurance and responsibility. Chicago, IL:
University of Chicago Press.
Fearnley, L. (2005). ‘From chaos to
controlled disorder’: Syndromic surveil-
lance, bioweapons, and the pathological
future (Working paper). Anthropology of
the Contemporary Research Collabora-
tory.
Foucault, M. (2007). Security, territory,
population: Lectures at the Collège de
France, 1977-78. New York: Palgrave
Macmillan.
Freeman, J. R. (1932). Earthquake da-
mage and earthquake insurance: Studies of a
rational basis for earthquake insurance, also
studies of engineering data for earthquake-
resisting construction. New York: McGraw-
Hill.
Friedman, D. G. (1984). Natural hazard
assessment for an insurance program. The
Geneva Papers on Risk and Insurance,
9(30), 57-128.
Grossi, P. & Kunreuther, H. (2005).
Catastrophe modeling: A new approach to
managing risk. New York: Springer.
Grossi, P., Kunreuther, H., & Windeler,
D. (2005). An introduction to cata-
strophe models and insurance. In P.
Grossi & H. Kunreuther (Eds), Cata-
strophe modeling: A new approach to
managing risk. New York: Springer.
Hacking, I. (1990). The taming of chance.
Cambridge: Cambridge University Press.
Hacking, I. (1992). ‘Style’ for historians
and philosophers. Studies in History and
Philosophy of Science, 23(1), 1-20.
Hacking, I. (2003). Risk and dirt. In R.
V. Ericson & A. Doyle (Eds), Risk and
morality. Toronto: University of Toronto
Press.
Jaffee, D. M. & Russell, T. (1997).
Catastrophe insurance, capital markets,
and uninsurable risks. The Journal of Risk
and Insurance, 64(2), 205-30.
Kozlowski, R. T. & Mathewson, S. B.
(1995). Measuring and managing cata-
strophe risk. Arlington, VA: Casualty Ac-
tuarial Society.
Kunreuther, H. (1968). The case for
comprehensive disaster insurance. Journal
of Law and Economics, April.
Kunreuther, H. (1973). Recovery from
natural disasters: Insurance or federal aid?.
Washington, DC: American Enterprise
Institute for Public Policy Research.
Kunreuther, H. (1974). Disaster insur-
ance: A tool for hazard mitigation. The
Journal of Risk and Insurance, 41(2), 287-303.
Kunreuther, H. (1978). Disaster insurance
protection: Public policy lessons. New York:
Wiley.
Kunreuther, H., Michel-Kerjan, E., &
Porter, B. (2005). Extending catastrophe
modeling to terrorism. In Catastrophe
modeling: A new approach to managing risk.
New York: Springer.
Lakoff, A. (2007). Preparing for the next
emergency. Public Culture, 19(2).
Lakoff, A. (forthcoming). The generic
biothreat. Cultural Anthropology.
Lakoff, A., Klinenberg, E., & Treskon,
M. (2007). Of risk and pork: Urban
security and the politics of rationality.
Unpublished manuscript, New York.
Lewis, M. (2007). In nature’s casino. The
New York Times, 26 August.
Mathewson, S. (2001). From pin maps to
the world wide web. Insurance Journal, 5
March.
O’Malley, P. (2000). Introduction: Con-
figurations of risk. Economy and Society,
29, 457-9.
O’Malley, P. (2003). Governable cata-
strophes: A comment on Bougen. Econ-
omy and Society, 32, 275-9.
O’Malley, P. (2004). Risk, uncertainty and
government. London: GlassHouse.
Pfeffer, I. (1974). Residual risks in
Europe. The Journal of Risk and Insurance,
41(1), 41-56.
Rabinow, P. (1989). French modern:
Norms and forms of the social environment.
Cambridge, MA: MIT Press.
Rabinow, P. (2003). Anthropos today:
Reflections on modern equipment. Prince-
ton, NJ: Princeton University Press.
Rabinow, P. & Bennett, G. (2007). A
diagnostic of equipmental platforms.
Berkeley, CA: Anthropology of the
Contemporary Research Collaboratory.
Rose, N. S. (1999). Powers of freedom:
Reframing political thought. Cambridge:
Cambridge University Press.
Rose, N., O’Malley, P.,&Valverde, M.
(2006). Governmentality. Annual Review
of Law and Society, 2,83 104.
United States. Task Force on Federal
Flood Control Policy. (1966). A unified
national program for managing flood losses:
Communication from the President of the
United States transmitting a report.
Washington, DC: US Government
Printing Office.
Walker, G. (1999). Current developments
in catastrophe modeling. Paper given at
1999 General Insurance Convention.
Weber, M. (1978). Economy and society:
An outline of interpretive sociology. Berke-
ley, CA: University of California Press.
Weber, M. (2002). The Protestant ethic
and the ‘spirit’ of capitalism and other
writings. New York: Penguin.
Wiener, J. B. (2002). Precaution in a
multi-risk world. In D. J. Paustenbach
(Ed.), Human and ecological risk assess-
ment: Theory and practice (pp. 1509-31).
New York: Wiley.
Willis, H. H., Morral, A. K., Kelly, T.
K., & Medby, J. J. (2005). Estimating
terrorism risk. Santa Monica, CA: Center
for Terrorism Risk Management Policy.
Stephen J. Collier received his PhD in anthropology at the University of
California, Berkeley. He has taught at Berkeley, at Columbia University and,
currently, at the New School, where he is an Assistant Professor in
International Affairs. He is the editor with Aihwa Ong of Global Assemblages
(Blackwell, 2005) and with Andrew Lakoff he is editing Biosecurity Interventions:
Global Health and Security in Practice (Columbia, 2008). He is
completing a book entitled Post-Soviet Social: Neoliberalism and Biopolitics in
the Russian Mirror. His current research concerns security, emergency and
system vulnerability.