The fifth age of safety: the adaptive age

David Borys1, Dennis Else1, Susan Leggett1

1VIOSH Australia, University of Ballarat, University Drive, Mt Helen, Victoria, Australia

Correspondence: David Borys, VIOSH Australia, University of Ballarat, University Drive, Mt Helen, Victoria, Australia, d.borys@ballarat.edu.au

Key words: resilience, mindfulness, culture, adaptation

Cite this article as: Borys, D., Else, D. & Leggett, S. (2009). The fifth age of safety: the adaptive age? Journal of Health & Safety Research & Practice, 1(1), 19-27.

Abstract
It has been argued that OHS has developed and evolved through a technical age, a human factors age and a
management systems age or through a technical wave, a systems wave and a culture wave. A fourth age of safety has
been described as the integration age. As the limitations of OHS management systems and safety rules that attempt to
control behaviour are becoming evident, it is proposed that we are moving into a fifth age of safety, the ‘adaptive age’;
an age which transcends rather than replaces the other ages of safety. The adaptive age embraces adaptive cultures
and resilience engineering and requires a change in perspective from human variability as a liability and in need of
control, to human variability as an asset and important for safety. Embracing variability as an asset challenges the
comfort of management. However, the gap between work as imagined and work as performed and the failure of OHS
management systems and safety rules to adequately control risk mean that a new perspective is required.
Introduction
This paper presents a review of existing and emerging
approaches for managing occupational health and
safety (OHS) and puts forward the view that, under
certain circumstances, more adaptive approaches to
managing OHS are required.
Hale and Hovden (1998) have argued that OHS
has developed and evolved through three so-called
‘ages of safety’. The first age was a technical age,
the second a human factors age and the third a
management systems age. A different sequence of
development was put forward by Hudson (2007),
who suggested that safety has evolved through three
waves. The first was a technical wave, the second a
systems wave and the third a culture wave. Both of
these views suggest that the process of development
has been sequential. Glendon et al. (2006) posit an
alternative view: that each period of development
does not leave behind, but rather builds on, what has
gone before. They refer to this process of development
as the fourth age of safety or the ‘integration age’
where previous ways of thinking are not lost, but
remain available to be reflected upon as multiple,
more complex perspectives develop and evolve.
Notwithstanding the suggested integration age
(Glendon et al., 2006), it may be timely to introduce
the possibility that we are moving into a fifth age
of safety or an ‘adaptive age’. The adaptive age
transcends all other ages without discounting them;
by introducing the concept of ‘adaptation’, it goes
beyond simply integrating the past. This notion is
informed by current discussions
around resilience engineering (Hollnagel, 2006)
and ‘efficiency-thoroughness trade-offs’
(ETTO) (Hollnagel, 2009a) that take us
beyond the contemporary ways of thinking
about managing OHS that typically focus
on OHS management systems (OHSMS),
safety culture and safety rules.
Beyond OHS management systems to adaptive cultures
Increasingly, the limitations of an over-
emphasis on documented management
systems have started to emerge. Robson
et al. (2005) in their systematic review of
health and safety management systems
found that “there is insufficient evidence
in the published, peer-reviewed literature
on the effectiveness of OHSMSs to make
recommendations either in favour of or
against OHSMSs” (p. 9). The 1999 Report
of the Longford Royal Commission into
the explosion at Esso’s Longford gas plant
in Victoria found that although Esso had
a world class OHSMS, the system had
taken on a life of its own, “divorced from
operations in the field” and “diverting
attention away from what was actually
happening in the practical functioning
of the plants at Longford” (Dawson &
Brooks, 1999, p. 200).
Similarly, Hopkins (2007), in his analysis
of the 1996 Gretley mine disaster concedes
that “experience is now teaching us that
safety management systems are not enough
to ensure safety” (p. 124). Further, a 2007
report commissioned by the New South
Wales Mines Advisory Council argued
that an OHSMS should be built on the
principles of mindfulness and not be a
“complex, paper-based OHS management
system” (p. xiii).
Reason (2000) contends that managers
believe that OHSMS sit apart from culture.
He suggests that an over-reliance on
systems and insufficient understanding of,
and insufficient emphasis on, workplace
culture, can lead to failure because “it is the
latter that ultimately determines the success
or failure of such systems” (p. 5).
Safety culture has emerged as a major
focus in improving OHS performance.
Hopkins (2005) argues that this stems in
part from recognition of the limitations
of OHSMS. In his analysis of the 1999
Glenbrook train crash involving a
commuter train and the Indian Pacific,
Hopkins identifies the danger of a culture of
rules, a culture of silos, a culture of on-time
running, together with the related dangers
of a culture that is risk-blind or risk-
denying. These are matters that fall outside
the scope of traditional OHSMS, and it may
be that OHSMS mask the emergence of
these cultures, which become all too readily
apparent with hindsight.
Hopkins (2007) views safety culture as
one aspect of organisational culture, or more
particularly an organisational culture that is
focused on safety. Further, culture is viewed
as a group, not an individual, phenomenon;
efforts to change culture should, in the
first instance, focus on changing collective
practices (the practices of both managers and
workers); and the dominant source of culture
is what leaders pay attention to. Much of
Hopkins’ work draws on Reason’s (1997)
notion that a safe culture is an informed
culture and Weick and Sutcliffe’s (2001;
2007) principles of collective mindfulness.
Reason (1997) argues that culture can be
socially engineered by managers and that
a safe culture is an informed culture. He
argues that in navigating the safety space
between increasing vulnerability to risk and
increasing resistance to risk, organisations
should strive for maximum resistance
to risk (as opposed to the unobtainable
goal of ‘zero risk’). He goes on to argue
that there are three cultural drivers that
allow organisations to achieve maximum
resistance to risk: (i) Commitment reflected
in the provision of resources to mitigate
risk, even in tough times; (ii) Cognisance
reflected in an awareness of the dangers
that threaten operations; (iii) Competence
gained from an information system that
provides managers with an understanding
of where they are relative to the edge of
safety without having to fall over it first.
The latter point is achieved through the
engineering of an informed culture and in
Reason’s view, an informed culture is a
safety culture. An informed culture is made
up of the four interlocking sub-cultures of a
reporting culture, a learning culture, a just
culture and a flexible culture.
Hudson (2007) suggests that safety
culture evolves and may be represented
by a five step ladder of distinct stages:
pathological, reactive, calculative,
proactive and generative. Progression up
the ladder is associated with increasing
trust, accountability and informedness
(as in Reason’s informed culture). What
remains unclear is how organisations move
from one step on the ladder to another.
An alternative view suggests that culture
is not homogeneous within organisations
and can be both differentiated and
fragmented (Richter & Koch, 2004).
Much as managers may espouse the safety
values associated with a single corporate
culture, organisations may consist of many
cultures based on professional groupings
(Gherardi et al., 1998; Schein, 1996) or
other communities of practice (Gherardi &
Nicolini, 2000).
The adaptive age requires an acceptance
by organisational leaders that groups of
workers may, through interaction with
one another and the tasks they perform
together, create their own shared meanings
about what it is to work safely. Under this
view that culture is ‘socially constructed’
(Gherardi & Nicolini, 2000), leaders do not
so much hope to engineer a single culture
but attempt to understand and influence
these differentiated and fragmented
cultures such that they are at least aligned
with the corporate culture (Martin, 2002).
Further, Weick and Sutcliffe (2007) argue
that where integrated cultures deny
ambiguity, differentiated and fragmented
cultures handle ambiguity better, a feature
more consistent with High Reliability
Organisations. The implication is that the
adaptive age requires adaptive cultures.
The notions of an adaptive age and
adaptive cultures may also require a
change in perspective in relation to the
causes of fatalities, injuries and disease
and a corresponding implicit awareness of
more than one perspective for preventing
fatalities, injuries and disease. This change
in perspective is captured by Hollnagel
(2008a) who contrasts two perspectives on
safety: theory W and theory Z as shown in
Table 1. He argues that to improve safety,
a change in perspective is required towards
theory Z; a theory that accepts that
humans, because of their capacity to adapt
to demands, are an asset to the proper
functioning of modern organisations.
Table 1: Summarising the key perspective changes required in the adaptive age (Source: Hollnagel, 2008b)

Theory W: managerial perspective (technological optimism)
Things go right because:
- systems are well designed and scrupulously maintained;
- procedures are complete and correct;
- people behave as they are expected to – as they are taught;
- designers can foresee and anticipate every contingency.
Humans are a liability and variability is a threat. The purpose of design is to constrain variability, so that efficiency can be maintained.

Theory Z: systemic perspective (technological realism)
Things go right because people:
- learn to overcome design flaws and functional glitches;
- adapt their performance to meet demands;
- interpret and apply procedures to match conditions;
- can detect and correct when things go wrong.
Humans are an asset without which the proper functioning of modern technological systems would be impossible.
However, the need for adaptation is
contingent upon an understanding of the
complexity of the organisation (socio-
technical system) that is being managed.
In some organisations (systems), adapting
may be a pre-requisite for safe performance
whilst in others it may be disastrous. Dekker
(2001), for example, makes the point that
failing to adapt can be disastrous under
certain circumstances and he cites the case
of an aircraft which crashed into the sea
off the coast of Nova Scotia in 1998. In this
case, following procedures for dealing with
smoke and fire and not descending too fast,
rather than dumping fuel and descending
rapidly, led to the plane becoming
uncontrollable and crashing into the sea.
The dilemma here is that, under certain
circumstances, following procedures may
result in fatalities and injuries. However,
at another time and in a different context,
not following procedures may also lead to
fatalities and injuries. Thus adaptation is a
double-edged sword (Dekker, 2006). This
poses a challenge to how we are to think
about and act on Hollnagel’s Theory Z. In
the adaptive age, Theory Z does not imply
mindless abandonment of procedures, or
a “free for all”, rather it requires a more
demanding standard of attention resulting
in a more subtle, nuanced and refined
appreciation of how OHS is managed that
embodies the capacity to be adaptive rather
than rule bound. To better understand
this dilemma, Hollnagel (2009a) offers a
two dimensional model of performance
variability and risk as shown in Figure 1.
Figure 1: Hollnagel’s dimensions of performance variability and risk. [The figure plots coupling (loose to tight) against manageability (tractable to intractable): the need for performance adjustments rises as systems become more intractable, and the risk of adverse outcomes rises as coupling tightens. Example systems positioned on the figure include a university, a manufacturing plant, a chemical plant and nuclear power.]
The first dimension in Hollnagel’s model
(Hollnagel, 2009a) is system ‘manageability’
or controllability. Within tractable systems
(simple, stable systems that are easy
to control) the need for adaptability is
low. By comparison, within intractable systems
(complex systems subject to change) the
need for adaptability is high. The second
dimension is coupling (or the degree of
inter-dependence between parts of the
system). Tightly coupled systems are
characterised by more time dependent
processes, invariant sequences, little slack
and only one way to reach production
goals (Perrow, 1999). In tightly coupled
systems the risk of adverse outcomes is
high. Within loosely coupled systems it is
low. This results in four possible ways to
characterise an organisation (Hollnagel,
2009a): (i) a loosely coupled tractable
system where the work is routine, requires
little in the way of performance variability
and any performance variability that
is present will have negligible impact
upon performance; (ii) a loosely coupled
intractable system is less predictable and
the need for performance adjustments
will be higher, however, any performance
variability will have negligible impact
upon performance; (iii) a tightly coupled
tractable system also requires little in the
way of performance adjustments; however,
performance adaptations that are made
and that fail (Dekker, 2003) may quickly
result in unwanted consequences because
of tight coupling; and (iv) a tightly coupled
intractable system may require constant
performance adjustments to operate safely.
Therefore the ways of thinking about and
approaches to managing OHS must be at
least equal to the demands and complexity
of the socio-technical system associated
with the organisation’s activities. If it is
decided that the organisation is a tightly
coupled, intractable system, for example
nuclear power, then a more adaptive
response will be necessary. Alternatively,
if it is decided that the organisation is
a loosely coupled, tractable system, for
example, a manufacturing plant, then
fewer adaptive responses will be necessary.
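By way of illustration only, and not as part of Hollnagel’s own work, the four-way characterisation described above can be sketched as a simple classification: given judgements about coupling and tractability, it returns the corresponding need for performance adjustments, risk of adverse outcomes and implication for adaptive capacity. The function name, labels and example below are hypothetical, introduced purely to make the reasoning concrete.

```python
# Illustrative sketch only: maps judgements about coupling and tractability onto
# the four system characterisations discussed above (after Hollnagel, 2009a).
# All names and labels are assumptions made for the purpose of illustration.

def characterise_system(coupling: str, tractability: str) -> dict:
    """coupling: 'loose' or 'tight'; tractability: 'tractable' or 'intractable'."""
    need_for_adjustments = "high" if tractability == "intractable" else "low"
    risk_of_adverse_outcomes = "high" if coupling == "tight" else "low"

    if coupling == "tight" and tractability == "intractable":
        implication = "constant performance adjustments may be needed to operate safely"
    elif coupling == "tight":
        implication = "few adjustments needed, but failed adaptations propagate quickly"
    elif tractability == "intractable":
        implication = "adjustments needed more often, but consequences are buffered"
    else:
        implication = "routine work; performance variability has negligible impact"

    return {
        "need_for_performance_adjustments": need_for_adjustments,
        "risk_of_adverse_outcomes": risk_of_adverse_outcomes,
        "implication": implication,
    }

# Example: the nuclear power case mentioned in the text (tightly coupled, intractable)
print(characterise_system("tight", "intractable"))
```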
Beyond safety rules to collective mindfulness
Safety rules are often written on the
basis that greater control of workers’
behaviour will not only lead to a safer
workplace, but also act as a buffer
against prosecution in the case of
an accident. However, opinions are
emerging that more safety rules and
less variability in worker behaviour
do not necessarily equate with
improved safety performance. In some
cases, writing more rules following an
incident may lead to conflict between
the rule and the actions required to
undertake a task (Reason, 1997).
Hopkins (2005) prefers to complement
safety rules with a strategy of risk-
awareness which invites workers “to
attend to the risks they face and not
simply comply with rules in a mindless
fashion” (p. 18). This is supported
by examples from industry (Hale et
al., 2003; Jeffcott et al., 2006) and by
Dekker (2003) who argues that “rather
than simply increasing pressure to
comply, organisations should invest in
their understanding of the gap between
procedures and practice, and help
develop operators’ skill at adapting”
(p. 233). He goes on to propose that
organisations need to:
“(a) Monitor the gap between procedure and practice
and try to understand why it exists (and resist trying
to close it by simply telling people to comply).
(b) Help people to develop skills to judge when and
how to adapt (and resist telling people only that they
should follow procedures)” (p. 236).
This is captured by the term “Collective
Mindfulness” that is based on the premise
that “unvarying procedures can’t handle
what they didn’t anticipate” (Weick et al.,
1999, p. 86). Or to put it another way,
variability in human performance enhances
safety whilst unvarying performance can
undermine safety, particularly in complex
socio-technical systems.
In his analyses of the Esso Longford
gas plant in Victoria (Hopkins, 2001)
and the Gretley mine disaster (Hopkins,
2007) Hopkins is critical of the absence of
mindfulness among managers and identifies
the need for mindful leadership as one
strategy for averting disaster. In his analysis
of the BP Texas City explosion Hopkins
(2008) discusses how BP had embarked
upon a quest to become a High Reliability
Organisation (HRO) (to exhibit the
characteristics of collective mindfulness)
but was largely unsuccessful because they
focused on educating front line workers
to think differently without instituting
the organisational practices necessary to
support collective mindfulness.
Effective HROs organise themselves to
learn from failure rather than celebrating
success (Weick et al., 1999) and give strong
responses to weak signals (Weick & Sutcliffe,
2001, p. 4). In short, they are “complex
adaptive systems” (Weick et al., 1999, p.
117). HROs are adaptive because: they are
‘preoccupied with failure’ and treat “any lapse
as something wrong with the system” (p. 9);
they are ‘reluctant to simplify’ and strive to
simplify less and see more; they are ‘sensitive
to operations’ and encourage situation
awareness among front line workers; they
have a ‘commitment to resilience’ and do not
allow errors to disable them; and they exhibit
‘deference to expertise’ and move decision
making to those people on the front line with
the most expertise.
More recently, Reason (2008) has argued
that both individual mindfulness and
collective mindfulness are necessary for
“maintaining a state of intelligent wariness”
(p. 241). This view represents a departure
from the view expressed by Weick and
Hopkins, a view that emphasises collective
mindfulness over individual mindfulness.
Reason (2008, p. 31) defends the need
for individual mindfulness by posing the
question: “If we cannot make systems
immune to organisational accidents, what
can we do to improve the reliability and
error wisdom of those at the sharp end?”
The ‘sharp end’ refers to any person who
is directly interacting with the hazards in a
particular context and at a particular time. In
essence, it is these people that are the last line
of defence between safe and unsafe outcomes.
Therefore, people at the sharp end need
the skills to judge when adapting is good
for safety and when it could be life-
threatening. It may mean complementing
safety rules and procedures with what
Iszatt-White (2007, p. 452) refers to as
“heedfulness”. However, workers will need to
trust in the “efficacy and applicability” of the
safety rules if the rules are to over-ride workers
propensity to think that they can work safely
without following the safety rules (Iszatt-
White, 2007, p. 461). To enhance heedfulness,
Iszatt-White (2007, p. 463) argues that “the
HRO notions of heedfulness, mutual checking
and initiative offer a useful lens through
which to consider the shortcomings of rule-
based safety approaches”. This approach to
managing OHS is again indicative that we are
entering an adaptive age.
Provided that interventions designed
to encourage individual mindfulness
or heedfulness are complemented
with mindfulness or heedfulness at the
organisational level, they represent a
worthwhile step forward, particularly if one
is to adopt the perspective that variability in
performance is better for safety. Individual
mindfulness requires workers at the sharp
end to have the skills and knowledge to be
able to judge when and how to adapt to local
circumstances, and when not to adapt, and
is consistent with the third HRO principle
of being ‘sensitive to operations’. Some
organisations attempt to achieve this through
programs that encourage mindfulness or
what Hopkins refers to as “risk-awareness”
(Hopkins, 2005) in individual workers.
However, Borys (2009) in a study of one
program, found that the program was little
more than a ritual that focused on completing
paperwork rather than an incentive to think
carefully about risks. All that it managed
to achieve was a culture of completing
the paperwork, highlighting the need for
organisational practices to work in support
of individual mindfulness.
From collective mindfulness to resilience engineering
Contemporary approaches to safety have
attempted to establish safe systems and
ensure that managers and workers work
inside the boundaries of those safety
systems (Woods & Hollnagel, 2006). Thus
it is assumed that constraining human
performance is essential for safety. An
alternative paradigm that is emerging is that
safety is achieved by managers and workers
adapting to changing circumstances. In
this case, it is the variability in human
performance, relative to the situation,
that is essential for safety. Although this
paradigm emphasises adaptive practices,
these practices are designed to complement
not replace good safe design principles
whilst acknowledging that complex socio-
technical systems will always present
opportunities for surprise. Therefore,
under this alternative paradigm, safety is
understood as a “characteristic of how a
system performs” (Woods & Hollnagel,
2006, p. 347), and resilience as a quality
that emerges from the functioning of the
system. Resilience engineering subscribes
to this alternative paradigm and in doing
so, is similar to collective mindfulness
and heedfulness as all three concepts
focus on the importance of performance
variability for safety. However, what sets
resilience engineering apart from collective
mindfulness is the focus on learning
from successful performance as well as
unsuccessful performance (Hollnagel,
2008c, 2009b) i.e. why things go right
as well as why things go wrong.
The rationale for this perspective is that
failures and successes result from the
same underlying processes (Hollnagel,
2009b). Hollnagel (2008b) argues that “it
is necessary to study both successes and
failures and to find ways to reinforce the
variability that lead to successes as well as
dampen the variability that leads to adverse
outcomes” ( p. xii). Thus Hollnagel (2009b,
p. 117) states:
A resilient system is able effectively to adjust its
functioning prior to, during, or following changes
and disturbances, so that it can continue to perform
as required after a disruption or a major mishap, and
in the presence of continuous stresses.
Resilience engineering research has
focussed on intractable and tightly coupled
systems such as air traffic control centres
and hospital emergency departments
and led researchers to identify a range
of markers of resilience. While there is
no agreement on these, one marker that
has been referred to repeatedly in the
resilience engineering literature is the gap
between work as imagined and work as
actually done (Dekker, 2006; Dekker &
Suparamaniam, 2005). One reason for the
widening of this ‘gap’ is a phenomenon
known as “practical drift” (Snook, 2000).
Practical drift refers to a situation where,
over time, local work practices ‘drift’ away
from the original intent at the time of system
design, to more locally efficient work
practices. However, local practices may
drift unnoticed while the degree of coupling
in the system switches from loose to tight,
for example because circumstances change
and functions become more time dependent
(Perrow, 1999). If local practices do not make
a corresponding shift from a task focus to a
rule focus, the results can be catastrophic. Such was
the case in the friendly fire shoot down of
a Blackhawk helicopter over northern Iraq
in 1994 (Snook, 2000). In this case, crews
were struggling to make sense of their
situation and in the time available, failed to
do so. Each level of the system, individual,
group and organisational, failed to identify
that local practice had uncoupled from the
written procedures. When there is slack in
the system, this drift is seen as efficient, but
when circumstances change and the system
reverts to being tightly coupled and time dependent,
such as when attempting to identify whether
the helicopters below you are friend or foe,
the resultant decisions can be deadly.
The adaptive age demands that people at
all levels of the organisation be able
to distinguish between drift that is adaptive
and improves organisational performance
and drift that becomes dangerous.
The solution to drift is not attempting to
further restrict performance variability as
this simply sets up a new cycle of practical
drift. Rather, it is more appropriate to
monitor and detect drift toward failure
and attempt to estimate the distance
“between operations as they really go on,
and operations as they are imagined in
the minds of managers and rule-makers”
(Dekker, 2006, p. 78).
Therefore “drift into failure” can be used
as a metaphor for organisations wishing to
become more resilient. For organisations
this may mean making the gap between
work as imagined and work as actually
performed visible because the more the gap
remains hidden, the more likely it is that the
organisation will drift into failure. In fact
Dekker and Suparamaniam (2005) go so far
as to say that the larger the gap “the less likely
that people in decision-making positions
are well calibrated to the actual risks and
problems facing their operation” (p. 3).
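To make the idea of monitoring the gap more concrete, the following minimal sketch (not a method prescribed by Dekker, Snook or the present authors) shows one hypothetical way an organisation might record observations of work as imagined versus work as actually done and surface the widest gaps for discussion. The data structures, field names and scoring are assumptions introduced for illustration only.

```python
# Hypothetical sketch: recording the gap between work-as-imagined (the procedure)
# and work-as-done (observed practice) so that the gap can be made visible.
from dataclasses import dataclass, field

@dataclass
class GapObservation:
    task: str
    prescribed_steps: list   # work as imagined (the written procedure)
    observed_steps: list     # work as actually performed
    reported_reason: str     # why the adaptation was made, in the workers' own words

    def divergence(self) -> float:
        """Crude indicator: fraction of prescribed steps not seen in practice."""
        if not self.prescribed_steps:
            return 0.0
        missing = [s for s in self.prescribed_steps if s not in self.observed_steps]
        return len(missing) / len(self.prescribed_steps)

@dataclass
class GapRegister:
    observations: list = field(default_factory=list)

    def add(self, obs: GapObservation) -> None:
        self.observations.append(obs)

    def widest_gaps(self, n: int = 5) -> list:
        """Surface the tasks whose practice has drifted furthest from procedure,
        so the reasons can be understood rather than the gap closed by decree."""
        return sorted(self.observations, key=lambda o: o.divergence(), reverse=True)[:n]

# Example usage with an invented maintenance task
register = GapRegister()
register.add(GapObservation(
    task="isolate pump before maintenance",
    prescribed_steps=["obtain permit", "lock out", "tag out", "verify zero energy"],
    observed_steps=["lock out", "tag out"],
    reported_reason="permit office unattended on night shift",
))
for obs in register.widest_gaps():
    print(obs.task, round(obs.divergence(), 2), obs.reported_reason)
```

The point of such a register would not be the number it produces but the conversation it prompts: consistent with Dekker’s (2003) advice, the reported reasons matter more than the divergence score.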
Conclusion
As the limitations of OHSMS and safety
rules that attempt to control behaviour are
becoming evident, it is time to consider that
we are moving into a fifth age of safety, the
‘adaptive age’; an age which transcends
rather than replaces the other ages of safety,
ages which include the dominant safety
paradigm that assumes that safety is achieved
by establishing safe systems and ensuring
that managers and workers work inside the
boundaries of those safety systems.
The adaptive age challenges the view
of an organisational safety culture and
instead recognises the existence of socially
constructed sub-cultures. The adaptive age
embraces adaptive cultures and resilience
engineering and requires a change in
perspective from human variability as a
liability and in need of control, to human
variability as an asset and important for
safety. In the adaptive age learning from
successful performance variability is as
important as learning from failure.
References
Borys, D. (2009). Exploring risk-awareness
as a cultural approach to safety: Exposing
the gap between work as imagined
and work as actually performed. Safety
Science Monitor, 13(2), 1-11.
Dawson, D. M., & Brooks, B. J. (1999).
The Esso Longford gas plant accident:
Report of the Longford Royal Commission.
Melbourne, Vic: Parliament of Victoria.
Dekker, S. (2001). Follow the procedure
or survive. Human Factors and
Aerospace Safety, 1(4), 381-285.
Dekker, S. (2003). Failure to adapt
or adaptations that fail: Contrasting
models on procedures and safety.
Applied Ergonomics, 34, 233-238.
Dekker, S. (2006). Resilience engineering:
Chronicling the emergence of confused
consensus. In E. Hollnagel, D. D. Woods &
N. Leveson (Eds.), Resilience engineering:
Concepts and precepts. Hampshire: Ashgate.
Dekker, S., & Suparamaniam, N. (2005).
Divergent images of decision making
in international disaster relief work
(No. 2005-01). Ljungbyhed, Sweden:
Lund University School of Aviation.
Gherardi, S., & Nicolini, D. (2000).
The organizational learning of safety
in communities of practice. Journal of
Management Inquiry, 9(1), 7-19.
Gherardi, S., Nicolini, D., & Odella, F.
(1998). What do you mean by safety?
Conflicting perspectives on accident and
safety management in a construction
firm. Journal of Contingencies & Crisis
Management, 6(4), 202-213.
Glendon, A. I., Clarke, S. G., & McKenna,
E. F. (2006). Human safety and risk
management (2nd ed.). Boca Raton, FL: CRC Press.
Hale, A. R., Heijer, T., & Koornneef, F. (2003).
Management of safety rules: The case of
railways. Safety Science Monitor, 7(1), 1-11.
Hale, A. R., & Hovden, J. (1998). Management
and culture: the third age of safety. A
review of approaches to organizational
aspects of safety, health and environment.
In A. M. Feyer & A. Williamson (Eds.),
VOLUME 1 ISSUE 1 OCTOBER 2009 JOURNAL OF HEALTH & SAFETY RESEARCH & PRACTICE 27
THE FIFTH AGE OF SAFETY: THE ADAPTIVE AGE
Occupational injury: Risk prevention and
intervention. London: Taylor and Francis.
Hollnagel, E. (2006). Resilience: The
challenge of the unstable. In E. Hollnagel,
D. D. Woods & N. Leveson (Eds.),
Resilience engineering: Concepts and
precepts. Hampshire, England: Ashgate.
Hollnagel, E. (2008a). Human factors -
understanding why normal actions sometimes
fail. Paper presented at the Railway Safety
in Europe: Towards Sustainable Harmonised
Regulation 18th November, Lille, France.
Hollnagel, E. (2008b). Resilience
engineering in a nutshell. In E. Hollnagel, C.
P. Nemeth & S. Dekker (Eds.), Resilience
engineering perspectives, volume 1:
Remaining sensitive to the possibility of
failure. Hampshire, England: Ashgate.
Hollnagel, E. (2008c). Safety management:
Looking back or looking forward. In E.
Hollnagel, C. P. Nemeth & S. Dekker (Eds.),
Resilience engineering perspectives, volume
1: Remaining sensitive to the possibility of
failure. Hampshire, England: Ashgate.
Hollnagel, E. (2009a). The ETTO
principle. Surrey, England: Ashgate.
Hollnagel, E. (2009b). The four
cornerstones of resilience engineering.
In C. P. Nemeth, E. Hollnagel & S.
Dekker (Eds.), Resilience engineering
perspectives, volume 2: Preparation and
restoration. Surrey, England: Ashgate.
Hopkins, A. (2001). Lessons from
Longford: The ESSO gas plant explosion.
Sydney: CCH Australia Ltd.
Hopkins, A. (2005). Safety, culture
and risk. Sydney: CCH Australia.
Hopkins, A. (2007). Lessons from
Gretley: Mindful leadership and the
law. Sydney: CCH Australia.
Hopkins, A. (2008). Failure to learn: The BP
Texas City refinery disaster. Sydney: CCH.
Hudson, P. (2007). Implementing
safety culture in a major multi-national.
Safety Science, 45, 697-722.
Iszatt-White, M. (2007). An ethnography of
rule violation. Ethnography, 8(4), 445-465.
Jeffcott, S., Pidgeon, N., Weyman, A., &
Walls, J. (2006). Risk, trust and safety
culture in U.K. train operating companies.
Risk Analysis, 26(5), 1105-1121.
Martin, J. (2002). Organizational culture:
Mapping the terrain. Thousand Oaks CA: Sage.
Perrow, C. (1999). Normal accidents: Living
with high-risk technologies. Princeton,
New Jersey: Princeton University Press.
Reason, J. (1997). Managing the risks of
organizational accidents. Aldershot: Ashgate.
Reason, J. (2000). Beyond the
limitations of safety systems. Australian
Safety News, April, 54-55.
Reason, J. (2008). The human contribution:
Unsafe acts, accidents and heroic
recoveries. Surrey, England: Ashgate.
Richter, A., & Koch, C. (2004). Integration,
differentiation and ambiguity in safety
cultures. Safety Science, 42, 703-722.
Robson, L., Clarke, J., Cullen, K., Bielecky,
A., Severin, C., Bigelow, P., et al. (2005). The
effectiveness of occupational health and safety
management systems: A systematic review.
Toronto, Ontario: Institute for Work & Health.
Schein, E. H. (1996). Three
cultures of management: The key
to organizational learning. Sloan
Management Review, Fall, 9-29.
Snook, S. A. (2000). Friendly fire: The
accidental shootdown of U.S. Black
Hawks over northern Iraq. Princeton,
NJ: Princeton University Press.
Weick, K. E., & Sutcliffe, K. M.
(2001). Managing the unexpected.
San Francisco: Jossey-Bass.
Weick, K. E., & Sutcliffe, K. M. (2007).
Managing the unexpected (2nd ed.). San
Francisco CA: John Wiley & Sons.
Weick, K. E., Sutcliffe, K. M., & Obstfeld,
D. (1999). Organizing for high reliability:
Processes of collective mindfulness. Research
in Organizational Behaviour, 21, 81-123.
Woods, D. D., & Hollnagel, E. (2006).
Prologue: Resilience engineering concepts.
In E. Hollnagel, D. D. Woods & N. Leveson
(Eds.), Resilience engineering: Concepts and
precepts. Hampshire, England: Ashgate.