Governing after Crisis
The Politics of Investigation, Accountability and Learning
Edited by Arjen Boin, Allan McConnell, Paul 't Hart
Cambridge University Press
Book DOI: http://dx.doi.org/10.1017/CBO9780511756122
Online ISBN: 9780511756122
Hardback ISBN: 9780521885294
Paperback ISBN: 9780521712446
Chapter 9 - Learning from crisis: NASA and the Challenger disaster, pp. 232-254
Chapter DOI: http://dx.doi.org/10.1017/CBO9780511756122.009
9 Learning from crisis: NASA and the Challenger disaster

Arjen Boin

Introduction: did NASA learn from the Challenger disaster?[1]
On 1 February 2003, the Columbia Space Shuttle disintegrated during
the final stages of its return flight to earth. The drama unfolded live on
television: spectacular pictures of the doomed flight were punctuated
by reactions of devastation and loss. It was in some ways a familiar
drama. Seventeen years before (28 January 1986), Space Shuttle Chal-
lenger had exploded within 2 minutes of its launch. The Challenger
disaster was etched in the minds of an entire generation of American
schoolchildren, who watched the launch in their classes (the teacher
Christa McAuliffe was on board to teach elementary school students
from space).
Both disasters were studied by a presidential commission.[2] Both
commissions were scathingly critical of the National Aeronautics and
Space Agency (NASA). The Rogers Commission, which studied the
causes of the Challenger disaster, criticised the space organisation for
not responding adequately to internal warnings about the impending
disaster. The Columbia Accident Investigation Board (CAIB) found
that little had changed since the Challenger disaster: ‘By the eve of
the Columbia accident, institutional practices that were in effect at
the time of the Challenger accident – such as inadequate concern over
deviations, a silent safety programme, and schedule pressure – had
returned to NASA’ (CAIB 2003: 101). The inescapable conclusion
emerging from the CAIB report is that NASA failed to learn the obvious
1
I wish to thank the following people for their helpful comments on earlier drafts
of this chapter: Chris Ansell, Paul ‘t Hart, Stephen Johnson, Todd LaPorte,
Allan McConnell, Jos
¯
e Olmeda, Paul Schulman, and all the participants of the
ECPR workshop ‘Crisis and Politics’, held in Granada (14–19 April 2005).
2
Both the Challenger and the Columbia disasters have been researched by a large
number of academics as well. Diane Vaughan (1996) has written the best study
on the Challenger disaster. See Starbuck and Farjoun (2005) for a collection of
essays on the Columbia disaster.
232
Downloaded from Cambridge Books Online by IP 131.211.208.19 on Mon Jan 07 22:08:30 WET 2013.
http://dx.doi.org/10.1017/CBO9780511756122.009
Cambridge Books Online © Cambridge University Press, 2013
P1: KNP
9780521885294c09 CUFX266/Boin 978 0 521 88529 4 January 4, 2008 13:58
Learning from crisis: NASA and the Challenger disaster 233
lessons flowing from the Challenger disaster, which caused the demise
of Columbia (see, e.g. Vaughan 2005). The CAIB thus sketches a pic-
ture of a recalcitrant organisation that irresponsibly gambled with the
lives of its astronauts.
This chapter investigates if and what NASA learned in the wake of
the Challenger disaster and explores if and how the Challenger after-
math is related to the Columbia disaster. We begin by briefly outlining
NASA’s history of human space flight. The next section explains why
seemingly ‘hard’ assumptions about causes, risks and organisational
learning rarely hold up to scrutiny. We then revisit the Challenger disas-
ter and its aftermath, offering a reappraisal of NASA’s learning capacity
while reexamining the relation with the Columbia disaster. The chapter
concludes with more generic points about organisational learning after
crisis, with a specific focus on the role of commissions.
A brief history of NASA’s human space flight: from
Apollo to Columbia
The explosion of Space Shuttle Challenger, 73 seconds into flight,
undermined belief in America’s space agency. Unaccustomed to the risk
of disaster (it had been 19 years since the deadly Apollo fire of January
1967), politicians, journalists and the public at large anxiously watched
the hearings held by the Rogers Commission. The Rogers Commission
was deeply critical of NASA’s safety practices. It attacked both the
organisational risk paradigm (which determined how NASA officials
viewed risk) and the organisational procedures to deal with potential
problems. The Rogers Commission concluded that NASA’s risk def-
inition had become too wide and its practices too lenient. Before we
consider if and how NASA learned the lessons offered by the Rogers
Commission, we need to briefly describe the origins of NASA’s safety
culture.[3]

[3] There is an abundant literature on NASA’s history. In addition to the sources
cited in this chapter, the NASA website provides much helpful material.
The Apollo race: reconciling risk, resources and schedules
In 1958, President Eisenhower merged various aerospace and engineer-
ing centres of excellence under the NASA banner. The Russians had
launched Sputnik and the United States could not afford to lose the
space race. In 1961, President Kennedy upped the ante by declaring
that the United States would bring a man to the moon and back before
the decade was over. NASA certainly had a challenge to meet. Three
weeks before Kennedy made his promise to the nation, Alan Shepard
had become the first astronaut in space (his flight in the Mercury air-
craft lasted no longer than 15 minutes). Although NASA had caught
up with the Russians, it was still a long way from landing astronauts on
the moon and bringing them back safely. The Apollo project was con-
strained by three factors: knowledge (nobody had done this before),
time (racing the Russians) and money (nobody knew how much it
would cost and Congress routinely cut the NASA budget).
The Apollo project critically depended on the ability of NASA lead-
ers to make the centres work together (McCurdy 1993). These centres
were notoriously independent. The Langley Research Center (estab-
lished in 1919) had a long history of aeronautics design and a very
peculiar way of working – ‘the Langley way’ (Murray and Cox 1989:
27). Langley personnel played a large role in designing the Apollo
spacecraft. The George C. Marshall Space Flight Center (MSFC) in
Huntsville, Alabama, housed Wernher von Braun’s rocket team. The
Germans had pioneered long-range ballistic missiles during World
War II: they built the V-1 and V-2 rockets that the Germans rained on
England.[4]
They brought to NASA state-of-the-art knowledge of rocket
development. The MSFC designed the rockets (Saturn boosters) that
launched the Apollo and her crew into space. The launch facilities were
at Cape Canaveral, Florida. The Space Task Group oversaw the Apollo
project from its center in Houston (flight control was based there as
well). Jim Webb ‘ran’ NASA from his small Washington, DC, head-
quarters.[5]

[4] See Adams and Balfour (2004) for a very critical discussion of the role played
by von Braun’s team during the war. The authors discuss the ethics of having
these alleged war criminals developing the rockets that would bring Americans
to the moon.

[5] Webb and his colleagues at headquarters also managed the other NASA centres.
It soon became apparent that the decentralised centres were hard
to manage. NASA initially managed the centres ‘by committee’, which
amounted to facilitating and hoping for the best. The culture was infor-
mal and communication was based on sound engineering arguments.
This ‘loose anarchistic approach to project management’ became a
problem when test failures and huge budget overruns threatened the
success of the Apollo project (Johnson 2002: 102). NASA administra-
tor Webb realised that the way of managing the Apollo project was in
need of drastic change if NASA were to maintain political support and
succeed in its lunar mission (Johnson 2002: 130–2). Webb brought in
George Mueller as the new director of the Office of Manned Space-
flight (OMSF) in September 1963. Mueller would become known as
the father of space flight.
Mueller turned NASA around ‘from a loosely organised research
team to a tightly run development organisation’ (Johnson 2002: 142;
cf. Murray and Cox 1989). He introduced two crucial concepts that
continue to mark NASA’s culture to this day. First, he imposed a man-
agement technique known as ‘systems engineering’. Pioneered in the
U.S. Air Force, a set of procedures and project management techniques
was brought in to integrate the design processes of the various centres.
The procedures served to codify good scientific, engineering and man-
agerial practices that were developed in the separate centres. Building
on the shared engineering background, the procedures helped to define
and circumscribe the autonomy of the centres.
The second change was the imposition of the ‘all-up testing’ con-
cept. Both the Langley and German engineers subscribed to a conven-
tional engineering approach, which dictated endless tests of all parts
and the interaction between the parts. They learned through failure:
firing rockets, watching them explode, determining what went wrong,
redesigning the rocket – until the rocket was perfect. This time-proven
practice had two drawbacks. First, it would take a long time to do a
sufficient number of tests to create a statistical base for risk assessment.
Second, the test process could never completely resemble a space envi-
ronment. Once you strap people on top of the rocket, it has to work
the first time around.
The all-up testing principle marked the end of endless testing. The
new mantra was an ultrarational version of engineering logic: ‘design
it right, fabricate it per print, and the component will work’ (Murray
and Cox 1989: 103). Since ‘there was no way to make 0.999999 claims
on the basis of statistical evidence unless the engineers tested the parts
millions of times’, there really was no alternative (Murray and Cox
1989: 101). The all-up testing did imply a very distinct risk philosophy,
described by flight director Chris Kraft (immortalised in the movie
Apollo 13):
We said to ourselves that we have now done everything we know to do. We
feel comfortable with all of the unknowns that we went into this program
with. We know there may be some unknown unknowns, but we don’t know
what else to do to make this thing risk-free, so it is time to go. (cited in
Logsdon 1999: 23)
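A back-of-the-envelope calculation, added here purely for illustration (the 95 per cent confidence level and the rule-of-three bound are my assumptions, not figures taken from NASA or from Murray and Cox), shows why a 0.999999 claim could not be established by testing alone. To demonstrate, with roughly 95 per cent confidence, that a component’s failure probability does not exceed $p = 10^{-6}$ when all of $n$ tests succeed, the classical rule of three requires

$$ p \le \frac{3}{n} \quad\Longrightarrow\quad n \ge \frac{3}{10^{-6}} = 3{,}000{,}000 \text{ failure-free tests per component.} $$

Millions of trials for every part was out of the question, which is precisely the impasse that Mueller’s all-up testing philosophy accepted rather than tried to resolve statistically.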
A philosophy of calculated risk: success and failure
NASA rejected the verisimilitude of quantitative risk analysis and
accepted the hard risk that every space flight can end in disaster. This
philosophy demanded an unwavering commitment to ‘sound engineer-
ing’. The technique of systems engineering offered the procedures to
maintain these high levels of engineering quality. The stunning success
of the moon landing affirmed this philosophy, while failures reinforced
the organisation’s commitment to this way of working.
NASA’s first tragedy arrived on 27 January 1967. Three astronauts
(the original moon crew) died when a fire broke out in the Apollo
capsule during a simulated test run at Cape Canaveral. The accident,
with the astronauts dressed in their space suits, took place in the Apollo
capsule on top of a Saturn rocket. A small spark caused an intense fire
and killed the trapped astronauts within seconds (the capsule was filled
with pure oxygen).
In the turmoil that followed, Apollo engineers were accused of
incompetence and negligence. A memo from General Electric, which
warned of this scenario, surfaced. In reacting, NASA placed more
emphasis on procedures to control individual quirks: ‘Never again
would individuals be allowed to take so much responsibility onto them-
selves, to place so much faith in their own experience and judgment’
(Murray and Cox 1989: 203).
The spectacular success of the 1969 moon landing proved to many
within NASA that the introduction of systems engineering had been the
correct strategy (Johnson 2002). The centres had been curtailed in their
freedom to run endless design-test-redesign cycles, yet they had been
left with enough freedom to design a spacecraft that worked very effec-
tively. The detailed rules were grounded in best practices; the emerging
philosophy therefore facilitated a surprisingly informal culture.
Just how effective and resourceful NASA culture had become was
perhaps demonstrated during the near-disaster that occurred some time
after the successful lunar mission, when Apollo 13 experienced an
explosion in space (Murray and Cox 1989; Kranz 2000). The adher-
ence to procedures enabled the engineers to figure out what had hap-
pened and what was possible. Yet it was the capacity to be flexible and
to depart from enshrined rules that gave rise to the level of improvisa-
tion that in the end saved the day (and the crew). The shared commit-
ment to sound engineering and the institutionalised practice of open
communication made it possible to solve this crisis in the nick of time.
Valued traditions vs. new disasters
After the Rogers Commission criticised NASA’s traditional approach
to safety, it took nearly 3 years before NASA would return to flight.
An impressive string of successes followed: NASA safely flew eighty-
seven shuttle flights, launched the Hubble telescope (and later repaired
it in space), the Mars Pathfinder, the Sojourner Rover and the Lunar
Prospector (McCurdy 2001). It aggressively cut costs through its ambi-
tious ‘Faster, Better, Cheaper’ programme (McCurdy 2001). However,
the many successes did not restore the prestige and admiration that
NASA enjoyed during the Apollo and early shuttle years. By the end
of the 1990s, NASA’s safety practices were scrutinised after a series of
spectacular failures – Mars Climate Orbiter, Mars Polar Lander and
Deep Space 2 rank among the most visible. Several critical reports
described what Farjoun (2005) refers to as a period of ‘safety drift’.
Suffering from serial budget cuts, NASA had begun to erode its safety
margins (SIAT 2000). After shuttle flight STS-93 experienced serious
in-flight anomalies in July 1999, the entire fleet was grounded.
In reaction to the critical report of the Space Shuttle Indepen-
dent Assessment Team (SIAT 2000), NASA administrator Dan Goldin
declared a ‘shuttle crisis’ (Farjoun 2005). The ageing fleet had become
vulnerable. Safety procedures and practices had been eroded as a result
of labour shortage, and the SIAT report unearthed a worrying num-
ber of narrow misses. The agency had successfully brought down the
costs of shuttle launches (partially in order to fund the expensive Inter-
national Space Station), but the administrators now sensed that the
cost cutting had gone too far. Goldin convinced the Clinton adminis-
tration to increase the programme’s funding, which allowed NASA to
address a variety of safety concerns as identified by SIAT. After meet-
ing the short-term concerns of SIAT, the shuttles resumed their flight
schedules.
By 2001, NASA’s political credibility had reached a low point as
a result of ‘failed investments and inadequate cost-control efforts’.
Congress and the White House effectively ‘put NASA on probation’
(Blount et al. 2005: 130; CAIB 2003). The appointment of Sean
O’Keefe (formerly deputy director of the White House Office of Man-
agement and Budget) as the new NASA administrator signalled that
NASA’s problems were viewed as managerial and financial at heart
(Farjoun 2005; McDonald 2005). O’Keefe prioritised the International
Space Station (ISS), which had to be completed before NASA could
move on to other human flight projects. The completion of ISS would
require a series of tightly scheduled shuttle flights.
The Columbia disaster (1 February 2003) thus came at the worst pos-
sible time for NASA, which was politically vulnerable. The Columbia
disaster instantly jeopardised the future of NASA’s human space pro-
gramme. The subsequent findings of the Columbia Accident Investi-
gation Board (CAIB) further eroded the agency’s remaining legitimacy
base (see also Klerkx 2004).[6] Politicians and media representatives
grew increasingly vocal in their concerns about whether NASA still had
the ‘right stuff’ to fulfil its mission (Wolfe 2005).

[6] Many critics doubted whether NASA should continue to fly the space shuttles at
all. After the next flight was plagued yet again by the foam problem, the shuttle
was officially kissed off (see President Bush’s 2005 space plan). Additional
problems – especially with President Bush’s political appointees, who were
accused of muzzling NASA’s climate scientists – have further increased the
criticism of NASA.
The CAIB report wove two story lines into one blasting analysis.
The first line recaptured the findings of recent reports, which described
a severely eroded safety culture and an alleged willingness to comply
with irresponsible deadlines. The second line detailed the similarities
with the pre-Challenger period. The combined outlook suggested a
highly irresponsible organisation that had gambled with the lives of
astronauts in order to please the agency’s stakeholders. NASA, in other
words, had failed to learn from the Challenger disaster – it had, in fact,
made things worse.
Learning from disaster
From an engineering perspective, learning from technological failure is
a fairly straightforward affair. If a bridge collapses or a space shuttle
explodes, it simply means that the original design – the ‘null hypothesis’ –
has been falsified (Petroski 1992). Learning, then, pertains
to the activity of redesigning. Learning has been successful if the
redesigned contraption functions according to plan. A shared belief in
the laws of physics and engineering underpins this notion of learning,
which is prevalent in many if not most organisations that deal with and
depend upon technology. This is not meant to suggest that engineers
cannot disagree. Quite the contrary: the Apollo history is filled with
deep controversies between and within the various centres (Murray and
Cox 1989). However, it does mean that engineers tend to resolve such
controversies on the basis of engineering logic and the laws of physics.
Most engineers – certainly those at NASA – would have a hard time
considering a different way of learning.
The introductory chapter of this book explains why crisis-induced
learning tends to be of a less rational nature. The aftermath of a disas-
ter is dominated by political processes, which affect learning practices.
The outside world imposes itself – through congressional hearings,
media inquiries and investigative committees – upon the organisation
that has ‘produced’ the disaster. Political elites, citizen outcries, victims’
relatives and media representatives create a climate in which organisa-
tional learning is subjugated, at least temporarily, to the lessons learned
of an outside body (a special committee or a standing investigative
body). Most of us may find it hard to trust the self-corrective poten-
tial of an organisation that has just caused a disaster (cf. Sagan 1993;
Perrow 1994).
The dynamics of the crisis aftermath fundamentally alters the learn-
ing process in at least three ways. First, it creates two domains of learn-
ing, each with its own characteristics. Second, it substantially widens
the scope and scale of potential lessons to be learned. Third, it funda-
mentally alters the evaluation of lessons learned. Let us briefly expand
on these notions.
Endogenous vs. exogenous learning
In the postcrisis phase, the venue for learning typically shifts away
from the responsible organisation or network. The accountability pro-
cess, which overrides organisational learning routines, dictates an inde-
pendent investigation. This does not negate intraorganisational learn-
ing processes that may have been triggered by the disaster. But the
organisation that has ‘produced’ the disaster must patiently await the
findings of this investigative body.
The postcrisis phase thus sees at least two separate domains of learn-
ing, each with its own rules, dynamics, interests and time horizons.
There is a voluminous literature that explains why it is hard for organ-
isations to learn from a crisis (Stern 1997; Boin et al. 2005: 117–22;
see also Parker and Dekker, this volume).[7]
Learning processes in public
organisations are typically shaped in unpredictable ways by the pre-
vailing mix of laws, rules, routines, core values, bureaucratic rivalry
and leadership interests.

[7] There may be various domains of learning (think of congressional
subcommittees, academic investigations, media inquiries and interest groups) in
which the crisis at hand is being subjected to learning processes.
Investigative committees try to learn from the same disaster, but the
context in which they attempt to do so is very different from the con-
text in which public organisations operate. Committees operate under
time pressure and must typically produce a report before a certain date.
Moreover, investigative committees are often affected by accountabil-
ity concerns: even if they want to avoid finger pointing altogether, their
report will be perused to find the ‘guilty’ actors. Finally, we should note
that committees formulate recommendations that they will not have to
implement. This simple fact allows – and may even induce – com-
mittees to formulate sweeping recommendations (‘become a learning
organisation’) without taking into account organisational realities.
These differences may be further intensified when the official lessons
come to be perceived within the receiving organisation as a partisan
product. The members of a committee may have less direct knowledge
of the processes leading up to a crisis (they were not there when it
happened and they do not always understand the organisation or the
core processes of that organisation). Critics of an organisation may
have recognised the investigative committee as a promising venue to
push their aims and solutions. Ad hoc committees that are installed to
investigate tightly knit policy sectors (such as the space industry) may
prove especially vulnerable to inside biases. These commissions are
made up of ‘independent outsiders’, but there may be few available
experts without any preconceived notion about the organisation, its
core processes and technologies, or the disaster itself. These experts
often have a history with the organisation (if not, we may wonder
about their expertise). This may have serious effects on the lessons
learned and their prospects for implementation.
This is not meant to imply that ‘outside’ learning can never get to the
bottom of an organisational crisis, but it seems safe to predict that the
lessons learned and the recommendations made in both venues will
not be identical. This creates a source of potential friction between
the external investigator and the investigated organisation. Political
considerations force a public organisation to adopt – grudgingly or
enthusiastically – the lessons and recommendations offered to it by
the external committee. But if and how these lessons find their way into
the rules and routines of the organisation depends on the size of the
gap separating the imposed blueprint from the home-grown lessons.
Single-loop vs. double-loop learning
The potential for flunked crisis learning is heightened by the type of
lessons learned in the different learning venues. Most organisations
appear to be capable of ‘single-loop’ learning: organisational members
try to fix what was broken while preserving the overall structure and
the institutionalised ways of working. This type of learning fits the
postcrisis mood. Hurt and traumatised by a disaster, organisational
members tend to fall back on proven routines and shy away from wild
experiments.
One may expect organisational leaders to formulate lessons that are
known in the literature as ‘double-loop learning’ (Argyris and Schön
1978). These are lessons that address the wider context in which the
single-loop lessons were allowed to occur. They may, for instance, tar-
get policy paradigms or institutional foundations, which few organisa-
tions can alter without entering a very different type of crisis. Organisa-
tional leaders vary in their willingness to adopt double-loop lessons. In
the absence of hard evidence, we may hypothesise that long-incumbent
leaders with a record to defend will prioritise preservation over reform
(Boin and ‘t Hart 2000). Incoming leaders and those who aspire to
move up the ladder are more likely to welcome crises as reform oppor-
tunities (if only to discredit predecessors or incumbents).
It appears that investigative committees have become increasingly
inclined to formulate double-loop lessons. The members of these ad
hoc committees tend to take a wider view, investigating both the imme-
diate causes of a disaster and the organisational context in which the
disaster has taken place. They put up for discussion the institutional
paradigms to which organisation members subscribe. Unless the disas-
ter has convinced the organisational members that their conceptual and
managerial foundations no longer suffice – and this is rarely the case –
the lessons learned in both venues may thus be of a fundamentally
different nature.
The most ambitious form of learning is known in the literature as
‘deuterolearning’ – learning to learn (Argyris and Schön 1978). This is
a typical academic prescription: it is theoretically sound, but never clear
how it should be accomplished in the real and messy life of organisa-
tions. The class of so-called high-reliability organisations is often said
to harbor this learning ability. However, investigative committees often
couch ‘learning to learn’ type recommendations in vague and abstract
language. Such recommendations are easy to make, especially without
having to provide a manual.
Technical vs. political evaluation
From a purely technical perspective, the evaluation of lessons learned
is a relatively simple exercise: if the disaster that gave rise to the lessons
does not reoccur, the organisation has ‘learned its lesson’. But crisis-
induced learning is political, not technical, at heart. If the committee
succeeds in delivering an authoritative report – its status being decided
upon by media, politicians and public opinion – the crisis narrative, the
lessons and the recommendations may come to be seen as a benchmark
for organisational effectiveness.
Most committees produce single-loop recommendations that appear
to be ‘easy fixes’. After the Herald of Free Enterprise sank in sight of
Brugge’s harbour (1987), the investigative committee recommended
various ways to make sure this particular type of ferry would not set
sail with open doors. Investigations of prison riots routinely prescribe
better hardware (such as improved riot gear, impenetrable fences and
unbreakable glass). Even double-loop recommendations tend to appear
deceptively simple: improve training, hire better people, change the
culture – it all makes sense.
As a result, the organisation at the receiving end of such prescriptions
will have to be able to show that it learned the lessons offered by the
committee at any point in the distant future. Some recommendations
are simply imposed on an organisation through legal changes, policy
reformulations and budget amendments. Others must be implemented
in and by (parts of) the organisation. Even if an organisation adopts the
recommendations wholesale, it will have to make some accommoda-
tion with organisational characteristics – if not immediately, certainly
in the future. But this is the ideal scenario. Most public organisations
accept only part or none of the recommendations, even though they
embrace them publicly and promise to uphold them. The complexity
of implementation feeds the temptation of symbolic reform.
Symbolic adherence may shield the organisation from further outside
interference, but in the long run the organisation cannot escape from
it. As the crisis becomes a historic marker for the organisation, future
assessments will take into account how the organisation dealt with
the crisis. Future failures will evoke scrutiny of past behaviour. An
organisation may thus be forced to adopt externally formulated lessons
or face the consequences in the future. The report hangs as a sword of
Damocles above the future of the organisation.
NASA and the Challenger: the politics of learning revisited
On 29 September 1988, NASA resumed its human space programme
with the launch of space shuttle Discovery. Its safe return marked the
beginning of a successful series of nearly one hundred shuttle flights,
which tragically ended on 1 February 2003. This performance would
seem to indicate that NASA learned the lessons from the Challenger
disaster. However, the Columbia Accident Investigation Board (CAIB)
reached a different conclusion. This section addresses the apparent
tension between a successful flight record and the damaging CAIB
findings.
Internal vs. external learning: NASA’s responsive attitude
The Rogers Commission offered two types of findings: it detailed the
technical causes (the faulty O-rings) and the organisational causes (the
failure to detect the technical causes). NASA accepted, adopted and
implemented all recommendations offered by the Rogers Commission.
Nobody within NASA doubted that the O-rings had to be redesigned
before the shuttles could fly again.
Much more daylight separated the findings of the Rogers Commission on
NASA’s organisational functioning from NASA’s self-perception. The
Rogers Commission identified three types of organisational failure:
NASA’s safety culture, NASA’s organisational structure and NASA’s
schedule pressure were at fault. The commission’s findings thus directly
attacked what were widely perceived within NASA as the organisa-
tion’s cultural anchors and valued ways of operating.
Diane Vaughan’s (1996) analysis of the Challenger disaster sug-
gests that much of the Rogers Commission’s findings were misin-
formed. It appears, for instance, that the commission misunderstood
NASA’s safety system, especially the way NASA engineers dealt with
anomalies. Moreover, Vaughan demonstrates that the launch decision
was a tragic misunderstanding rather than a gross management error.
Vaughan shows that NASA routinely delayed flights if technical prob-
lems emerged, thus putting to rest the idea that NASA would prioritise
launch schedules over shuttle safety. From her extensive interviews
with NASA workers, it becomes clear that the Rogers findings did not
resonate with the lessons learned within NASA. To be sure, Vaughan
did not find NASA to be a perfect organisation. Her findings, how-
ever, lacked the ‘clear-cut character’ of the Rogers findings. Vaughan
described a rather effective safety culture, rooted in NASA’s lessons of
the past, which had nevertheless allowed this disaster to occur.
The findings of the Rogers Commission never became a cultural issue
within NASA, because the recommendations of the commission only
addressed the structural features of the organisation (entirely bypass-
ing the cultural problems). The commission recommended a more
centralised management structure (moving the shuttle management to
NASA headquarters), a deeper involvement of astronauts in the shuttle
programme’s management, the establishment of a Shuttle Safety Panel
and the establishment of an Office of Safety, Reliability and Quality
Assurance. It had very little to recommend with regard to NASA’s
safety culture (even though its findings identified cultural factors as the
main culprits).
This explains why NASA, despite very different views on the organ-
isational causes of the Challenger disaster, accepted and adopted the
recommendations put forward by the Rogers Commission. In Decem-
ber 1990, the Augustine Committee (advising on the future of the U.S.
space programme) not only commended NASA’s responsive attitude in
the wake of the Challenger disaster but also observed that NASA had
been ‘burdened by excessive layers of management that are the legacy
of the development era and recovery from the Challenger accident’
(Augustine Committee 1990: 30). In 1995, the Space Shuttle Manage-
ment Independent Review Team (the Kraft Commission) complimented
the ‘remarkable performance’ of the space shuttle programme
and effectively prescribed that NASA roll back the changes adopted
after the Challenger disaster. As the Kraft Commission described it:
The performance of the machine as a space transportation system has been
remarkable given the difficult operating conditions and management envi-
ronment. The preflight operational parts of the program are excellent in
delivering, preparing, assembling, and readying the vehicle for flight. Opti-
mal flight designs and plans are developed and executed for diverse and
complex payload operations. Crew and flight controller readiness for both
nominal and contingency operations are unmatched. Over the last several
years, while performing seven to eight flights per year, the Shuttle Program
has continued its successful performance while incrementally reducing oper-
ating costs by approximately 25 percent. (Kraft Commission 1995: 11)
Single-loop vs. double-loop learning: Rogers opens
Pandora’s box
The recommendations formulated in the Rogers report seem to reflect
what academics refer to as ‘single-loop learning’: the commission pre-
scribed shuttle design fixes, organisational reparations (centralisation,
improved communication) and organisational fortifications (a few new
offices, more ex-astronauts in managerial positions). The Rogers Com-
mission did not recommend a complete overhaul of the way NASA
prepares, launches, flies and returns its shuttles. The latter would be
called ‘double-loop’ learning.
The findings of the Rogers Commission suggest that such double-
loop learning would be in order. In its report, the commission was
highly critical of the way NASA dealt with emerging risk: the organi-
sation irresponsibly broadened its risk definitions and failed to act on
clear warnings of impending danger. One of the commission members,
Nobel laureate Richard Feynman, was especially critical, both in pub-
lic appearances and in a personal appendix to the report. We can only
guess why there was such a disconnect between double-loop lessons
and single-loop recommendations. But even if the Rogers Commission
did not affect the heart of NASA’s safety system, it certainly did ini-
tiate a debate that would come back to haunt NASA years down the
road.
The Rogers Commission, perhaps unintentionally, exposed the
rather particular risk conception that had taken root in NASA’s organ-
isational culture and determined its safety system. In its infant years
(1958–1963), NASA engineers sought to minimise risk by the familiar
design-test-redesign cycle, which was to be run until risk could be all
but ruled out. This time-honoured model of experimenting and testing
was abandoned after President Kennedy imposed a firm deadline on
the Apollo project. If NASA was to fly astronauts to the moon, the old
model of endless testing clearly did not suffice.
In response to this looming Catch-22 situation, NASA revolutionised
the management of human space projects by introducing a new testing
philosophy (all-up testing) and a new management philosophy (systems
management), which brought a heavy reliance on rules and procedures.
The subsequent successes of the Apollo project anchored this approach
into NASA’s organisational culture.
This new approach entailed a new risk philosophy, a development
that remained unnoticed or unappreciated outside NASA for a long
time.[8] This philosophy dictated that once the NASA engineers – the
best in the world – had applied their engineering logic to the design
and fabrication of a rocket, only real-life tests could prove whether
the rocket worked. This always entails a risk, because experimental
technology and the unforgiving conditions of space can and will inter-
act in unforeseen ways. One can only discover these ways by flying. If
progress is to be made, risks have to be taken. After flying the contrap-
tion, anomalies are discovered and fixed and it is flown again. The
more it is flown, the safer it becomes. The reverse is also true in this
conception: if it is not flown, nothing can be learned.

[8] The dominance of this risk philosophy is probably described best by the key
players themselves. For an in-depth conversation between key players, see
Logsdon (1999).
The Rogers Commission took issue with this institutionalised risk
conception, noting that a safe shuttle flight does not ‘prove’ everything
will work the next time. Whereas NASA worked on an experience basis
(the O-rings did not burn through completely, so the design worked
well), the Rogers Commission leaned towards a quantitatively oriented
risk conception (the O-rings clearly did not live up to the design require-
ments, which means NASA and its contractors should go back to the
design table). Whereas NASA viewed its designs as hypotheses to be
tested (cf. Petroski 1992), the Rogers Commission demanded proof
that the shuttle would be safe. The idea that anything could be proven
before a shuttle flight violated NASA’s risk conception (which was a
pillar of NASA’s safety system).
This clash between two ‘risk schools’ would resurface periodically
in the years following the Challenger disaster. In 1995, the Kraft report
declared that the shuttle had become more reliable as a result of more
than thirty safe flights. This embrace of NASA’s risk conception should
come as no surprise: Chris Kraft was a key player during the Apollo
years and a strong believer in this risk philosophy. In 1990, the Advisory
Committee on the Future of the U.S. Space Programme (known as the
Augustine Committee) presented a report that framed NASA’s risk
conception in historical terms:
The space program is analogous to the exploration and settlement of the New
World. In this view, risk and sacrifice are seen to be constant features of the
American experience. There is a national heritage of risk taking handed down
from early explorers, immigrants, settlers, and adventurers. It is this element
of our national character that is the wellspring of the U.S. space program. [...]
If people stop taking chances, nothing great will be accomplished.
The SIAT (2000) report, written by a team of outsiders, echoed the findings
of the Rogers Commission and roundly criticised NASA’s risk percep-
tions. The CAIB (2003) built on the SIAT report and outright rejected
NASA’s risk conception.
It is one thing to recommend double-loop learning, but it is another to
offer an alternative that is both feasible and effective. Even if NASA’s
risk philosophy should be rejected as unacceptable, it is not clear what
the alternative would be. This question becomes especially pressing
in the light of new visionary plans (to the Moon, Mars and beyond)
announced by President Bush in 2005. Human space flight remains
inherently risky. As long as budget and time constraints exist, it is not
clear how NASA’s philosophy should be amended.
Technical vs. political assessment: follow the trail
In its analysis of the Columbia disaster, the CAIB drew a straight line
between the findings of the Rogers Commission (17 years old by then)
and its own findings. The CAIB reported finding the same conditions
that had caused the Challenger explosion still present in NASA. The
implication was obvious and damaging: NASA had failed to learn from
the most deadly disaster in its history. This ‘finding’ is at odds with the
findings of other commissions, which assessed NASA and the shut-
tle programme in the years following Challenger. Before NASA was
allowed to ‘return to flight’ in 1988, both the shuttle and the organisa-
tion were scrutinised. After all, the whole nation was watching as the
Discovery resumed the shuttle programme.
In 1990, the Augustine Committee reported its findings. It remarked
that NASA had suffered much criticism in recent years, but that the
organisation now had its house in order.[9]
After reiterating that NASA
‘has the critical responsibility of doing everything it can to minimize
the human risk involved in meeting the nation’s space goals’, the com-
mittee stated ‘that we believe [NASA] has now firmly embraced [this
responsibility]’. The committee concluded that NASA had made ‘an
intense effort’ to redress the organisational vulnerabilities outlined by
the Rogers Commission. It observed that ‘a process appears to be
in place which surfaces concerns [with regard to launch safety] and
resolves them’. The committee hinted that NASA might have learned
too much from Challenger. It found that ‘the Shuttle launch operation
has evolved into a relatively slow and deliberate process’. Agreeing
that the ‘ultimate goal should be a safe operation’, the committee cau-
tioned that NASA should not be ‘burdened by excessive layers of man-
agement that are the legacy of the [...] recovery from the Challenger
accident’.

[9] The committee remarked that ‘some parts of the media [...] by this time had
turned “NASA bashing” into a journalistic art’.
In 1995, the Space Shuttle Management Independent Review Team
produced its findings in what has become known as the Kraft report.
The team lamented the safety bureaucracy (‘duplicative and expensive’)
that had sprung up in response to the Challenger disaster (Kraft Com-
mission 1995: 16), concluding that the shuttle had become ‘a mature
and reliable system – about as safe as today’s technology will provide’
(Kraft Commission 1995: 1). Kraft found that ‘too many discrepancies
result in detailed analysis and testing’ (Kraft Commission 1995: 13).
The report went as far as to suggest that the post-Challenger reforms had
made the shuttle less safe: ‘indeed, the system used today may make
the vehicle less safe because of the lack of individual responsibility it
brings about’ (Kraft Commission 1995: 17).
The Kraft report prescribed ‘a change to a new mode of management
with considerably less NASA oversight’.[10]
NASA should no longer treat
the shuttle as a developmental programme (which made the shuttle
much too expensive to operate) and should outsource shuttle opera-
tions to external contractors. NASA would then be able to concentrate on
developing new space programmes. All in all, this report had a great
impact: a consortium of space contractors took over parts of the space
programme.

[10] More specifically, the Kraft Commission wanted to roll back the ‘independent
SR&QA element’, which had been instituted in response to the Rogers
recommendations.
In 1999, the Space Shuttle Independent Assessment Team (SIAT)
scrutinised the shuttle programme after flight STS-93 had experienced
two serious in-flight anomalies. NASA grounded the fleet and waited
for the SIAT to produce what would turn out to be a very critical report.
Its findings fall within two categories. The bulk of the findings relate
to the erosion of safety practices, which were the apparent result of the
‘routinising’ and outsourcing that had fundamentally altered NASA
following the Kraft report. In addition, the SIAT criticised NASA’s risk
perception, essentially reopening the debate initiated (but never really
pursued) by the Rogers Commission.
The SIAT report makes clear that NASA’s altered shuttle programme
was suffering safety lapses. SIAT observed that the programme ‘had
undergone a massive change in structure in the last few years with
the transition to a slimmed down, contractor-run operation [...] This
has been accomplished with significant cost savings and without a
major incident’ (SIAT 2000: 1). SIAT also concluded that the safety
programme had been eroded, making a disaster increasingly likely.
Moreover, the report concluded that the shuttles were increasingly suf-
fering age-related problems. It prescribed more resources and reinstat-
ing the safety and quality elements removed in response to Kraft. It also
proposed an overhaul of the ‘primary risk management strategy: more
consideration should be given to risk understanding, minimization, and
avoidance’.
NASA accepted the SIAT report. In response, the Clinton administra-
tion increased NASA’s resources (the ‘safety upgrades initiative’). The
incoming Bush administration, however, imposed a 34 percent reduc-
tion on this programme (Blount et al. 2005: 136; McDonald 2005).
The CAIB report (2003) is a combination of two reports: the SIAT
report and the Rogers report.[11]
It recaps the story of recent erosions
as documented by SIAT. The cause of the Columbia disaster, however,
had little to do with these safety erosions. The ‘foam problem’ had been
one of the oldest problems in the shuttle catalogue. The analogy with
Challenger seemed clear: why had NASA never solved the problem?
The integrated story line – ‘a broken safety culture’ and unacceptable
risk taking – created an image of a highly irresponsible organisation.

[11] This analogy with the Challenger findings appears to have been furthered by
the influential role of Diane Vaughan as an advisor to the committee. See
Vaughan (2006) for her reflections on her time with CAIB. Moreover,
astronaut Sally Ride served on both the Rogers Commission and the CAIB.
On the dangers of historical analogies, see Brändström et al. (2004).
Conclusion: learning and the long shadow of
the Challenger disaster
Did NASA learn from the Challenger disaster? This chapter demon-
strates that the answer to such a seemingly straightforward question
depends on whom you ask and when. From the outset, we expected
postcrisis learning to be affected by the politics and dynamics of the
crisis aftermath. It turns out that the postcrisis phase lasts much longer
than imagined. The shadow cast by the Challenger disaster extends
well into the next century (cf. ‘t Hart and Boin 2001; Rosenthal et al.
2001).
During the first few years after the Challenger disaster, several exter-
nal bodies found that NASA had adopted the recommendations pre-
scribed by the Rogers Commission. These recommendations repre-
sented single-loop lessons: relatively easy fixes that did not require
NASA to change its institutionalised way of operating. NASA did not
adopt the double-loop lessons that could have been derived from the
Rogers report. Crucially, the Rogers Commission failed to translate
its double-loop findings into double-loop recommendations. We may
speculate that within NASA, very little enthusiasm existed to address
the institutional core of the organisation. The Challenger disaster, as
Vaughan (1996) shows, was not viewed as a result of lapses in NASA’s
safety system. On the contrary: it was viewed as a ‘normal accident’ (cf.
Perrow 1999) – a dramatic yet inevitable hump on the road towards
a more reliable space vehicle. The disaster did not shake the belief in
the safety system that had served NASA so well in its proud history.
This is not an indicator of ‘organisational arrogance’, as the Rogers
Commission and CAIB wrote. It is the strong belief in a risk philosophy
for which no feasible alternatives are thought to exist.
It is this deeply entrenched risk philosophy that would serve as a
lightning rod for future commissions. NASA had not changed it after
Challenger, and several external committees had since endorsed it.
Nearly two decades after Challenger, CAIB framed its conclusions
around this philosophy and blasted NASA for not learning its lessons.
Ironically, the CAIB does not translate its double-loop findings into
double-loop lessons, which evokes, once again, the question whether a
feasible alternative exists. It does nevertheless provide us with a clear
lesson: however responsive an organisation may be in the aftermath of
a crisis, that responsiveness does not negate the crisis itself. A disaster marks the history
of an organisation, providing benchmarks for future evaluation.
There is something inherently unfair about this finding. NASA con-
fronts the incredibly hard challenge of using experimental technology
to ferry humans back and forth to the most unforgiving environment
known. It does so with limited (and often shrinking) budgets. Political
and societal scrutiny is harsh. Expectations are high, while successful
performance is met with a yawn. Accidents are simply unacceptable.
In such an environment, perfect safety is an illusion. Perhaps the real
double-loop lesson is that space ambitions and the risk society do not
mix well.[12]
We simply cannot have our cake and eat it.

[12] The Augustine Commission (1990) noted and accepted that space flight is
inherently risky; it predicted (on statistical grounds) another disaster within
thirty flights or so. The CAIB report pays lip service to the notion of inherent
risk in space adventures, but subsequently demands that NASA ask for ‘proof’
that it is safe to fly.
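To make the kind of statistical reasoning behind footnote 12 concrete, a minimal sketch follows; the per-flight loss probability used here is an assumed, illustrative figure, not one reported by the Augustine Commission. If each shuttle flight is lost independently with probability $p$, the chance of at least one loss in $n$ flights is $1 - (1-p)^{n}$. Taking, purely for illustration, $p \approx 1/30$, the expected number of flights to the next loss is $1/p \approx 30$, and

$$ 1 - \left(\tfrac{29}{30}\right)^{30} \approx 0.64, $$

so under such an assumption a further disaster ‘within thirty flights or so’ is more likely than not.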
The findings of this chapter caution against embracing the outcomes
of postcrisis reports without scrutinising the particular ways in which
lessons were reached and recommendations were formulated. Commis-
sions may authoritatively push intuitively acceptable findings, couching
them in selected social science findings. They may take on board aca-
demics and experts who may be tempted to push their (own) favorite
theory. The findings and recommendations may thus incur damag-
ing deficits that can only be revealed by close scrutiny and expert
discussion. In the rush towards political closure, there is not always
room for such debate.
12 The Augustine Commission (1990) noted and accepted that space flight is inherently risky; it predicted (on statistical grounds) another disaster within thirty flights or so. The CAIB report pays lip service to the notion of inherent risk in space adventures, but subsequently demands that NASA ask for ‘proof’ that it is safe to fly.

The results of such a rush to judgement are rather serious for the organisations and sectors involved. Long after the committees have
been dissolved, organisations must work with their heritage. The findings typically become enshrined as the sole accurate account of the crisis and its roots. The recommendations that build on the findings may become symbolic markers to measure progress in organisational compliance. The room for organisational adaptation informed by organisational experts may be limited. When external committees impose their learning trajectory upon the crisis-affected organisation, the end result may be less beneficial than we like to believe.
References
Adams, G., and Balfour, D. 2004. Unmasking administrative evil. 2nd edn. New York: Sharpe.
Argyris, C., and Schön, D. A. 1978. Organizational learning: a theory of action perspective. Amsterdam: Addison-Wesley.
Augustine Committee. 1990. Report of the advisory committee on the future of the U.S. Space Program. http://history.nasa.gov/augustine/racfup1.htm.
Blount, S., Waller, M. J. and Leroy, S. 2005. Coping with temporal uncertainty: when rigid, ambitious deadlines don’t make sense. In Starbuck, W. and Farjoun, M. (eds.) Organization at the limit: lessons from the Columbia disaster. Oxford, UK: Blackwell, pp. 122–39.
Boin, A., and ’t Hart, P. 2000. Institutional crises in policy sectors: an exploration of characteristics, conditions and consequences. In Wagenaar, H. (ed.) Government institutions: effects, changes and normative foundations. Dordrecht: Kluwer Press, pp. 9–31.
Boin, A., ’t Hart, P., Stern, E. K. and Sundelius, B. 2005. The politics of crisis management: public leadership under pressure. Cambridge, UK: Cambridge University Press.
Brändström, A., Bynander, F. and ’t Hart, P. 2004. Governing by looking back: historical analogies and crisis management. Public Administration 82(1): 191–210.
Columbia Accident Investigation Board. 2003. Columbia accident investigation report. Burlington, Ontario: Apogee Books.
Farjoun, M. 2005. Organizational learning and action in the midst of safety drift: revisiting the space shuttle program’s recent history. In Starbuck, W. and Farjoun, M. (eds.) Organization at the limit: lessons from the Columbia disaster. Oxford, UK: Blackwell, pp. 60–80.
’t Hart, P., and Boin, A. 2001. Between crisis and normalcy: the long shadow of post-crisis politics. In Rosenthal, U., Boin, A. and Comfort, L. K. (eds.) Managing crises: threats, dilemmas, opportunities. Springfield, IL: Charles C. Thomas, pp. 28–46.
Johnson, S. B. 2002. The secret of Apollo: systems management in American and European space programs. Baltimore, MD: Johns Hopkins University Press.
Klerkx, G. 2004. Lost in space: the fall of NASA and the dream of a new space age. New York: Pantheon Books.
Kraft Commission. 1995. Report of the space shuttle management independent review team. www.fas.org/spp/kraft.htm.
Kranz, G. 2000. Failure is not an option: mission control from Mercury to Apollo 13 and beyond. New York: Simon & Schuster.
Logsdon, J. M. (moderator). 1999. Managing the moon program: lessons learned from project Apollo. Proceedings of an oral history workshop, conducted July 21, 1999. Monographs in Aerospace History, Number 14. Washington, DC: NASA.
McCurdy, H. E. 1993. Inside NASA: high technology and organizational change in the U.S. space program. Baltimore, MD: Johns Hopkins University Press.
McCurdy, H. E. 2001. Faster, better, cheaper: low-cost innovation in the U.S. space program. Baltimore, MD: Johns Hopkins University Press.
McDonald, H. 2005. Observations on the Columbia accident. In Starbuck, W., and Farjoun, M. (eds.) Organization at the limit: lessons from the Columbia disaster. Oxford, UK: Blackwell, pp. 336–46.
Murray, C., and Cox, C. B. 1989. Apollo: the race to the moon. New York: Simon & Schuster.
Perrow, C. 1994. The limits of safety: the enhancement of a theory of accidents. Journal of Contingencies and Crisis Management 2(4): 212–20.
Perrow, C. 1999. Normal accidents: living with high-risk technologies. Princeton, NJ: Princeton University Press.
Petroski, H. 1992. To engineer is human: the role of failure in successful design. New York: Vintage Books.
Presidential Commission on the Space Shuttle Challenger Accident. 1986. Report to the president by the presidential commission on the space shuttle Challenger accident. Washington, DC: Government Printing Office.
Rosenthal, U., Boin, A. and Bos, C. J. 2001. Shifting identities: the reconstructive mode of the Bijlmer plane crash. In Rosenthal, U., Boin, A. and Comfort, L. K. (eds.) Managing crises: threats, dilemmas, opportunities. Springfield, IL: Charles C. Thomas, pp. 200–215.
Sagan, S. D. 1993. The limits of safety: organizations, accidents, and nuclear weapons. Princeton, NJ: Princeton University Press.
Space Shuttle Independent Assessment Team (SIAT). 2000. Report to associate administrator. Washington, DC: NASA.
Starbuck, W., and Farjoun, M. (eds.) 2005. Organization at the limit: lessons from the Columbia disaster. Oxford, UK: Blackwell.
Stern, E. K. 1997. Crisis and learning: a balance sheet. Journal of Contingencies and Crisis Management 5: 69–86.
Vaughan, D. 1996. The Challenger launch decision: risky technology, culture and deviance at NASA. Chicago: University of Chicago Press.
Vaughan, D. 2005. System effects: on slippery slopes, repeating negative patterns, and learning from mistakes? In Starbuck, W., and Farjoun, M. (eds.) Organization at the limit: lessons from the Columbia disaster. Oxford, UK: Blackwell, pp. 41–59.
Vaughan, D. 2006. NASA revisited: theory, analogy and public sociology. American Journal of Sociology 112(2): 353–393.
Wolfe, T. 2005. The right stuff. New York: Black Dog and Leventhal Publishers.