A Comprehensive Model for Human Behaviour
in Industrial Environments
A. Sneddon, P.T.W. Hudson, D. Parker, M. Lawrie, M. Vuijk & R. Bryden
University of Aberdeen
Leiden University
Manchester University
Shell International Exploration and Production
Abstract
The lack of an integrated approach to theories of human behaviour in the industrial
environment forms an unaddressed problem for Psychology as a discipline. People
behave as a totality whereas we only have local micro-models to explain specific
behaviours in restricted environments. In this paper we present a comprehensive
framework that makes explicit the relationships between a wide variety of existing
and well-validated models in different areas of specialisation within the study of
psychology. The model integrates the range from perceptual processing, through
decision-making and its biases, to the modelling of selection of action and the
implementation of intentions. Lower-level theories of perception and decision-making
are typically the preserve of Cognitive Psychology, while later elements are usually
seen as part of Organisational, Social and Health Psychology. The lack of an
integrated model makes it difficult for non-psychologists, typically those who wish to
make use of the insights provided after more than 100 years of study, to understand
how human behaviour is explained and might be directed in an industrial setting.
The model was originally developed to understand how people behave safely, or
unsafely, in hazardous industrial settings, leaning upon work investigating high
reliability industries and advanced safety cultures. It appears to provide a robust
framework that allows extension to areas such as effective leadership behaviours,
situation awareness, real time decision-making under stress, as well as driving
behaviour.
A Comprehensive Model for Human Behaviour
in Industrial Environments
Introduction
High reliability organisations operate in work environments that have the capacity to
be very hazardous. Managing these hazards to reduce the risk of accidents
dictates the way in which most jobs are performed. Supporting the individual worker
there are usually engineering safeguards, management systems, competence
programmes as well as rules and procedures. However, despite these preventative
mechanisms accidents still tragically occur, and the question remains: why? Research
into a number of high-hazard, high reliability industries has indicated that around 80%
of accidents can be attributed to human factors causes (Hoyos, 1995). In
particular, research into human factors causes of accidents in the oil and gas industry
(e.g. Rundmo, 1995; Rundmo, Hestad & Ulleberg, 1998) has indicated that lack of
care and attention is often reported as one of the main immediate human factors
causes (Mearns, Flin, Fleming & Gordon, 1997). Usually behind these immediate
causes are several underlying causes such as managerial decisions and other
leadership failures with their roots much deeper in the organisation.
One reason for the prevalence of human factors causes in incidents is that human error
is inevitable, because of the fundamental limitations of human cognitive architecture.
In high-reliability industries such as nuclear power, fire-fighting, aviation, and oil and
gas exploration errors can have potentially catastrophic repercussions, which is why
there are precautions to minimise the impact of errors. Lawton (1998) showed
that accidents happen when unintentional acts (errors) combine with intentional acts
such as rule breaking.
Consider, for example, the work of a drilling crew on board an offshore rig. Much of
the overall drilling process is performed by heavy machinery and monitored by
computers, yet human crews are still required for the manual processes of, for
example, looking after the drill pipe – adding or removing drill strings, etc. An
incident will usually involve an error by one person whilst someone is taking a
calculated risk. The motivations for behaving unsafely vary, but tend to
fall into one of two categories – unsafe behaviours of which people are aware, and
unsafe behaviours of which they are not aware. People may knowingly participate in
behaviours that are unsafe for a number of different reasons (this will be explained in
more detail later in the paper) – for example, they have developed a bad working habit
which has been reinforced by other workers (or possibly even supervisors), or has
gone unobserved or unchallenged, and so has developed into an acceptable practice.
However, as mentioned previously, people also behave in an unsafe manner, but are
unaware of the fact that they are doing so. It is therefore imperative to distinguish the
reasoning behind why people are behaving in this manner (i.e. unsafely, both knowingly
and unknowingly) in order to attempt to modify the unsafe behaviour and rectify the
situation.
Most approaches to unsafe or at-risk behaviours have concentrated upon listing the
reasons why people perform unsafe acts, such as poor attitudes to safety, laziness,
corner cutting or, frequently, deliberate taking of risks by individuals. This
information is typically used to justify remedial measures intended to prevent such at-
risk behaviours in the workplace. Such models, however, do not explain why people
might have what appear to be such poor attitudes, or trade off safe but lengthy
activities with more dangerous short cuts. The approach taken in this model is quite
different, starting from asking what it would take to have people acting safely, as the
standard, and then ascertaining why they might deviate from normal safe behaviour,
thereby treating unsafe behaviours as pathologies of normal safe behaviour patterns.
The assumption made here is that people, even thrill seekers, are usually interested in
their own safety, as well as that of others; this assumption is not apparent in the usual
unsafe behaviour models.
The Safe Behaviour Model
‘Safe behaviour’ may be considered as the end product of a number of different steps
or stages, all of which must be followed successfully if a person is to behave safely
and avoid an incident. In conjunction with
Leiden and Manchester Universities, Shell Exploration and Production developed the
‘Safe Behaviour Model™’, which outlines the four main steps necessary for safe
behaviour. Figure 1 details this model with four main stages, from Sensing to Action.
Figure 1. The Safe Behaviour Model
Stage 1 Sense – the perception of hazard and hazardous situations
The first stage that is necessary involves perception of information about the
environment. A person needs to be able to see, hear or otherwise receive information
that there is a hazard that requires a response. This perception may be direct, as when
one sees a snake, smells a noxious product or feels an unusual vibration that may
presage a disastrous event. If there is no such perception then it is, at best, difficult to
react to something that does not appear to exist as it is essentially invisible. In
addition sensory processes will be subject to habituation, making the visible invisible
again.
There is a second level of perception, which applies to what we can learn to recognise as
hazardous. In some cases this is possible. For instance, a vehicle may be immediately
recognisable as dangerous, because it is travelling too fast or is clearly likely to fall
apart, but experienced drivers also learn to recognise other, often more critical
dangers. An inexperienced driver is sometimes recognisable from the way they drive,
drivers with poor attitudes (far more dangerous in terms of the traffic statistics) often
show visibly recognisable behaviours such as close following. Notions of perceptual
‘tuning’ apply here, where the perceptual system becomes sensitised to configurations
that have become meaningful. Remedial measures at this level would consist of
sensitising perceptual systems, inducing perceptual learning, or undoing the effects
of habituation (Gigerenzer & Selten, 2001).
Stage 2 Know – understanding which risks deserve attention
The raw ability to perceive a hazard or hazardous situation does not, on its own,
provide a sufficient impetus to behaviour. While specific hazards might, conceivably,
have the potential to cause substantial harm, in a great many cases the potential for
harm may (appear to) be negligible and therefore no special action would be merited.
What is perceived needs to be regarded as sufficiently important to warrant action and
in many cases there may appear to be more pressing needs. This means that hazards
have to be understood in terms of their likelihood to cause harm; the hazardous
object, event or situation has to be reframed as a risk, by adding in information about
the frequency with which harm may actually be expected.
This information can change the weights people assign to possible outcomes, which
they can use to determine where to set their priorities. Thus, if a situation is judged to
be relatively risk free, in an environment of competing priorities for action, an at-risk
behaviour will be more likely to be chosen if it meets more pressing goals. As a
matter of experience people are unlikely to be influenced by one or two statements
that an action is dangerous if this conflicts with their personal experience over many
years that it is usually not. For instance, most people’s personal experience of years of
accident free driving often makes adhering to speed limits a losing competitor against
their certain expectation of a later arrival. Seen as an intuitive risk judgement, they
multiply the potential consequences with their personal (and statistically reasonable)
expectation that they, as an individual, will not have an accident; this contrasts with
the certain outcome of increased journey time. The product of potential outcome and
probability, the risk, drives behavioural choices. There is a substantial literature on
this area, and it is clear that individual judgements of both potential outcome and
perceived probability are open to significant cognitive biases (Tversky & Kahneman,
1974; Kahneman, Slovic, & Tversky, 1982).
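To make this trade-off concrete, the following minimal Python sketch illustrates the
intuitive calculation described above; all numbers are invented purely for illustration
and are not drawn from any of the studies cited here.

# Illustrative sketch of the intuitive risk judgement described above.
# All numbers are hypothetical; the point is the structure of the comparison.

def expected_cost(probability, consequence):
    # Subjective risk = perceived probability of harm x perceived severity.
    return probability * consequence

# A driver's personal estimate of an accident while speeding, shaped by years of
# accident-free driving, versus the certain 'cost' of arriving later.
perceived_cost_of_speeding = expected_cost(probability=0.00001, consequence=100000)
certain_cost_of_keeping_limit = expected_cost(probability=1.0, consequence=10)

print(perceived_cost_of_speeding)       # ~1.0 (feels negligible)
print(certain_cost_of_keeping_limit)    # 10.0 (feels certain and immediate)
# With weights like these the risky choice 'wins', which is the bias that the
# Know stage has to counteract.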
Stage 3 Plan – deciding which action to implement
Just as it is necessary to sense hazards, albeit indirectly, and essential to treat hazards
once perceived as sufficiently important to merit taking some action, it is also
necessary for safe behaviour that people know what they can do to manage the risks
they appear to face. An individual’s behavioural repertoire may include possible
behaviours to cope with identified risks, but may also be limited to a restricted set of
actions. What a person confronted with a risk regarded as significant has to do is to
decide what behaviours are appropriate to manage that risk. They will have to make a
plan of action to be performed.
Failures at this level constitute the Rule-Based mistakes identified by Reason (1990),
when possible choices of action compete, or Knowledge-Based mistakes when people,
confronted with a clear risk, have no specific behaviours in their repertoire and have to
generate a fresh plan of action to solve the problem. Wagenaar and Hudson
(1997) distinguished two types of planning failure. In the first case an individual,
faced with a known problem, may only have a single possible plan of action, which is
a source of problems if the plan is inappropriate (“To a man with a hammer,
everything looks like a nail”). Alternatively people may generate more than one plan
but fail to think the consequences through adequately. Certainly, people under time
stress will be more likely to select the first plan that looks as if it might offer a
solution (The politician’s syllogism from the popular series Yes Minister goes: ”I
must do something, this is something, so I must do this”).
Stage 4 Act – acting and maintaining the chosen behaviour
The final stage of the Safe Behaviour model involves actually implementing the
chosen behaviour, together with the requirement to continue to perform the chosen
action in the face of already identified risks. This can prove problematical if there are
pre-existing competing behaviours, possibly including non-behaviour (do nothing),
because those alternative choices, which are typically seen as at-risk behaviours, are
well established. This is the level at which most Behaviour-Based Safety Methods
(Geller, 2001; Daniels & Rosen, 2004) are aimed, when people can see the hazards,
know the risks, have the necessary plans of action in their repertoires but still perform
other, usually riskier, behaviours.
This final stage can prove to be problematical even when all the other stages have
been passed successfully. One reason may be that frequent reward of at-risk
behaviours, without negative consequences, leads to the formation of skill-based
behaviour (Rasmussen, 1983) which can by-pass the Know and Plan stages, becoming
a learned reflex triggered by environmental conditions, and only monitored
occasionally. This analysis suggests that people act with at-risk behaviours because
they are no longer driven by rational thought, but by direct response to the
environment within which they find themselves. This is the same conclusion that was
reached earlier for different reasons (Wagenaar, Hudson & Reason, 1990) in the
development of the Organisational Accident Model (Reason, 1997).
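Read as a process, the four stages behave like a simple pipeline in which a failure at
any single stage is enough to block safe behaviour. The minimal Python sketch below
only illustrates that logical structure; the stage names come from the model, but the
boolean checks are hypothetical placeholders rather than an implementation of the
model itself.

from enum import Enum

class Stage(Enum):
    SENSE = "perceive the hazard"
    KNOW = "judge the hazard as a risk worth acting on"
    PLAN = "select an appropriate course of action"
    ACT = "carry out and maintain the chosen behaviour"

def first_failure(perceived, judged_risky, has_plan, carried_out):
    # Return the first stage at which the process breaks down, or None if all
    # four stages succeed and the behaviour is safe.
    outcomes = [
        (Stage.SENSE, perceived),
        (Stage.KNOW, judged_risky),
        (Stage.PLAN, has_plan),
        (Stage.ACT, carried_out),
    ]
    for stage, succeeded in outcomes:
        if not succeeded:
            return stage
    return None

# Example: the hazard is seen and understood, but no adequate plan exists -
# roughly the Knowledge-Based mistake discussed above.
print(first_failure(perceived=True, judged_risky=True, has_plan=False, carried_out=False))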
The model as presented is based upon a logical analysis of what it might take to
behave safely in the face of hazards. Is there any empirical evidence that such an
approach forms a good and fruitful basis for ensuring people behave safely?
Wagenaar and Groeneweg (1987) analysed 100 major maritime accidents, examining
57 of those in terms of the notion that people cause accidents by taking deliberate
risks, similar to the alternative models described above. Their results are shown in
Table I, where they came to the conclusion that at most 1% of all the accidents studied
were primarily caused by deliberate and explicit risk-taking behaviour. While their
categories are not immediately related to ours here, it is clear that 27% of problems
were at the Sense level, that finding options – plans of possible action - was a
significant factor (15%) and that calculations of risk or of successful outcomes of
plans were also substantial issues (36%) that prevented people from doing what, in
hindsight, would have appeared to be the ‘right’ thing. These data suggest strongly
that, in real life scenarios, people confronting hazards can fail at any point along the
route.
Table I. Where decisions about risky actions go wrong in 57 accidents at sea,
reported by the Raad voor de Scheepvaart (Netherlands Maritime Accident Board) in
1984 and 1985 (some accidents are represented in more than one line; from Wagenaar
& Groeneweg, 1987)
_____________________________________________________________________
Decision phase                          Number of accident scenarios in which
                                        the decision went wrong
                                        Absolute number          (%)
_____________________________________________________________________
1 Receipt of information                       17                21
2 Detection of problem                         21                27
3 Finding options                              12                15
4 Evaluation of consequences                   19                36
5 Conscious acceptance of risk                  1                 1
_____________________________________________________________________
All of these stages are necessary for a person, faced with a hazardous situation, to
perform the appropriate actions. The scientific understanding is not drawn from a
single part of Psychology: the earliest stages reflect Perceptual Psychology, the
inspiration then comes from work on biases in Cognitive Psychology and Economics,
followed by Social and Health Psychology when it comes to the implementation of
intentions. At the same time many of the issues covered are also to be found in the
area of Organisational Psychology, although there is usually a very different literature
in such analyses of the problems encountered in industrial safety, such as we are
discussing here.
One point that should be stressed is the complexity of the intellectual problem faced
by ordinary people in hazardous situations, especially when they are already so well
defended that a safe outcome is usually guaranteed. Shipping operations are typically
simple, and the safety measures rudimentary. Oil and gas exploration and production,
in contrast, is usually complex and highly defended with a wide range of overlapping
safety measures. Wagenaar and Hudson (1997) showed how many unsafe acts were
required to have a major accident in the two different industries; the results are shown
in Table II. While it took 2-3 unsafe acts (mean = 3.4) to lead to a major shipping
accident, more than 50% of the oil and gas accidents required in excess of 7
contemporaneous unsafe acts (mean = 8.0).

Table II. The number of unsafe acts necessary to cause an accident.
_________________________________________________________________
Number of            100 shipping          21 oil & gas exploration
unsafe acts          accidents             & production accidents
_________________________________________________________________
 0                        4                           0
 1                        3                           0
 2                       46                           1
 3                       22                           2
 4                       14                           2
 5                        5                           1
 6                        2                           1
 7                        2                           1
>7                        2                          13
_________________________________________________________________

In the oil industry, therefore, an individual unsafe act will
almost always not lead to an undesirable outcome, as an unforeseeable combination is
usually required for a major accident actually to occur. This is the message of
Reason’s (1990; 1997) Swiss Cheese model. Under these conditions people are likely
to downplay the potential negative consequences, actually reflecting their personal
history of managing the risks effectively, so that 20 years of personal experience that
a hazard has not become a problem for an individual weighs more heavily than
someone else’s advice that it is very dangerous. It is, of course, very different, seen
from the vantage point of a corporate centre, when millions of man hours are
aggregated, as opposed to the viewpoint of an individual who will, in a well defended
high risk industry, probably never personally experience a major accident such as a
fatality. The view from the centre is that an incident that occurs once in several
million working hours is actually quite frequent when the total world-wide workforce
may accumulate a million man hours in a single day. Nevertheless any one individual
only works about 80,000 hours in a lifetime, so it takes 13 people to accumulate in
their lifetimes the million hours the company achieves in a day. In short an individual
may never actually experience a serious accident at first or second hand, but even a
very safe but large company will be confronted with the negative consequences of
unsafe behaviours on a daily, or at least weekly, basis.
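The contrast between individual and corporate exposure follows directly from the
rough figures quoted above, as the short Python sketch below reproduces; the numbers
are the order-of-magnitude estimates used in this paragraph, not measured data.

# Order-of-magnitude exposure arithmetic from the paragraph above.
hours_per_working_lifetime = 80_000    # one individual's working career
corporate_hours_per_day = 1_000_000    # a large world-wide workforce, per day

lifetimes_per_corporate_day = corporate_hours_per_day / hours_per_working_lifetime
print(lifetimes_per_corporate_day)     # 12.5, i.e. roughly 13 working lifetimes per day
# An event occurring once in a few million hours is therefore practically invisible
# to an individual, yet shows up weekly or even daily when seen from the centre.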
Interventions - Using the model to generate safer behaviours
A scientific model can be constructed simply to explain a particular phenomenon. The
Safe Behaviour model makes the different stages explicit and allows us to understand
why people might behave in ways observers, often blessed with hindsight, would find
odd and even dangerous. This model was, however, constructed primarily to provide a
framework for intervention, to prevent people from harming themselves and others. It
underlies two intervention tools in the suite of nine Hearts and Minds tools, Working
Safely (Shell, 2003) and Driving for Excellence (Shell, 2004), designed to eliminate
industrial accidents in two specific areas: unsafe working practices and behaviour on the road.
These two tools operate by inducing biases in favour of safe behaviour. To support
individuals the model stresses that each of the stages can be supported by processes of
looking, telling and listening involving (i) learning to look at a hazardous world and
recognise danger, (ii) learning how to tell people that what they are doing is unsafe,
and (iii) teaching people to accept such information without being offended or feeling
patronised. Another set of processes has to support the continuation of each stage, so
that people do not habituate perceptually, continue to rate risks highly despite their
actual personal daily experience, maintain and develop their repertoire of possible
actions and, finally, continue to behave in ways appropriate to the hazard. As it is in
everyone’s interest to ensure that people are supported in their efforts to act safely, it
is vital to ensure that the personal pay-offs for safe behaviour outweigh those for
unsafe behaviours.
The next section examines how a particular problem identified in many incident
investigations, a lack of situation awareness, can be understood in the context of the
Safe Behaviour model.
Situation Awareness
Numerous theories from the domain of psychology play their role in particular
stages of the model. However, one psychological construct that maps
neatly onto the Safe Behaviour model is situation awareness. In order for humans to
function effectively in daily life without mishap, they must have at least some sort of
rudimentary understanding of their surroundings in order to act accordingly and
successfully avoid any potential accidents. It is obvious that one critical factor in
preventing accidents should be maintaining an adequate understanding of the current
situation. This is needed in order to perceive the conditions of the environment, and
judge the consequences of any actions taken in relation to the safety of the work, in
order to avoid adverse events. By having full and correct understanding of the
situation, the potential risk involved in an action can more effectively be gauged and
in turn minimised, thus reducing the risk of an accident. However, if the
understanding of the situation is impaired, then the ability to predict the outcomes of
actions is more flawed as it is based on less accurate information, and due to this the
risks of an accident occurring are increased. This is supported by Orasanu, Dismukes
& Fischer (1993) who found that pilots who had experienced an accident had a
tendency to interpret cues wrongly, and also in many cases underestimated the risk
that was associated with a particular challenge, thus over-estimating their perceived
ability to deal with the situation. Consequently, the question is: what is this
mindfulness that we all appear (to a greater or lesser extent) to possess which allows
us to have successful passage through our day, avoiding most potential accidents?
How is it that people garner and use information from their constantly changing
environments to keep operations safe? In short, the answer to these questions is
‘Situation Awareness’ (SA). Definitions of situation awareness vary greatly, but the
most cited definition is provided by Endsley (1988, p97), who explains it as “...the
perception of the elements in the environment within a volume of space and time, the
comprehension of their meaning, and the projection of their status in the near future”.
In simple terms, SA is the ability to successfully pay attention to and monitor the
environment, and essentially ‘think ahead of the game’ to evaluate the risk of
accidents and ensure a safe working environment.
Sarter & Woods (1991, p50) state that situation awareness “…is based on the
integration of knowledge resulting from recurrent situation assessments”, i.e. by
continually appraising the situation and incorporating facts from it, situation
awareness is derived. These so-called ‘situation assessments’ comprise three
levels: perception, comprehension/information integration, and projection (Endsley,
1995). Vuijk (2004) mentions that when people are unaware of their unsafe working
practice, there is a problem with the interpretation of information, or indeed with the
senses themselves, whereas if the issue regards bad habits, then the stage requiring
attention is the planning/acting step. Let us consider how SA can help to explain why
people behave unsafely.
Referring back to the Safe Behaviour model, the three levels of SA clearly correspond
to the sense, know and plan stages: ‘sense’ equating to ‘perception’, ‘know’ referring
to ‘comprehension’, and ‘plan’ being associated with ‘projection’. The parallels
between these stages and the SA levels are perhaps more clearly understood by
examining them in more detail.
Level 1 SA: Perception
This is the basal constituent of SA: in order for
SA to be achieved, objects and information in the surrounding environment (i.e.
stimuli) must be noticed. The stimuli must be mentally processed to determine to
which ones attention should be paid in a work task, and the surroundings should be
continually monitored to attend to changing stimuli. This is very closely aligned to
the Safe Behaviour stage of ‘sense’ which regards the perception of the hazard. This
stage does not involve merely seeing the hazard with our eyes, but we should also
employ our other senses, for example smell (for the perception of dangerous gases in
the atmosphere), touch (to feel unusual vibrations) and hearing (to avoid heavy
machinery which we cannot see). Attentional processing (La Berge, 1995; Shiffrin
& Schneider, 1977) is intrinsically linked to the theory of SA, but attention is bound
by the limits of the working memory (Baddeley & Hitch, 1974). The fact that
attention is limited can be a problem, as a person is unable to attend closely to every
single detail of his/her environment. This is an important issue to consider in safety
critical environments, given that it is often the case that factors outside the
central focus of attention are those that have the potential to be harmful.
Consequently, critical elements may be missed in the observation/perception stage,
leading to an incorrect mental model (the representation of objects, people and tasks
that people hold in their minds, covering the various roles and relevance of the items
concerned) being formed. Possession of a poor or incorrect mental model
can increase accident risk as there is no ‘template’ to guide actions. This is further
supported by Klimoski and Mohammed (1994, p405) who state that mental models
“…provide a conceptual framework for describing, explaining, and predicting future
system states”, and so if this were impaired, then the resultant SA
would similarly be impaired.
In summary, sense and perception relate to the same psychological construct, as they
both involve paying attention to the environment, and ‘picking up’ on the information
relevant to the task. The situation should be monitored in order to keep track of and
perceive any changes to the situation (should they occur), as this will affect the
workers’ SA.
Level 2 SA: Comprehension
This involves the combination, interpretation,
storage and retention of the aforementioned information (Endsley, 2000b) to form a
picture of the situation whereby the significance of objects/events is understood,
essentially deriving meaning from the elements perceived. The degree of
comprehension that is achieved will vary from person to person, and Endsley (1995)
maintains that the level attained is an indication of the skill and expertise held by the
operator. Applying this level specifically to the Safe Behaviour model stage of
‘know’, it means understanding how the information that you’ve perceived (be it, for
example, visual information from seeing something, or information from other senses
such as smelling gas) relates to you and what you are doing, and how the information
can help to influence your actions towards behaving in a safe manner. Success at the
‘know’ stage can only occur if the meaning assigned to the information is correct.
For example, the information can correctly be perceived (i.e. success has been
achieved at the ‘sense’ stage), but its importance or meaning to the overall situation or
goal can be misunderstood (Endsley, 1999). This is often the case when a situation is
encountered for the first time, and a worker’s expectation of the state of affairs is not
known. For example, a worker may see a reading on a dial in the control room of an
offshore platform, thus has perceived the information, but does not understand how it
connects with the task he is performing, and so has not achieved the correct
comprehension level relevant to his work. Also, it may be that certain objects or
events can be classed as hazards in one situation, but harmless in another, and so this
also requires correct comprehension (or knowing) in order to make the distinction
between them and act accordingly.
Level 3 SA: Projection
The final level of SA is projection, which occurs
as a result of the combination of levels one and two. This stage is extremely
important, as it means possessing the ability to use information from the environment
to predict possible future states and events (Endsley, 1988, 1995; Sarter & Woods,
1991). Having the ability to correctly forecast possible future circumstances is vital,
as time is made available to resolve potential conflicts and formulate a suitable course
of action to meet goals (Endsley, 1995, 2000; Stanton et al, 2001). Endsley (2000b, p7)
states that “…experienced operators rely heavily on future projections” as this
“allows for timely decision making”, adding that “It is the mark of a skilled expert”.
Just as level 1 SA (perception) mirrors the Safe Behaviour model’s stage of sense, and
level 2 SA (comprehension) corresponds with stage 2 in the model (know), the third
level of SA (projection) maps perfectly onto the third stage of the Safe Behaviour
model – plan. Accordingly, whilst projection involves using the information you
have assimilated during the previous stages of SA to prepare an appropriate plan of
action, ‘plan’ entails doing likewise. By understanding how the information is
associated (both directly and indirectly) with the work being performed, workers
can use the data to plan not only for their immediate work, but also for future work,
and can develop appropriate strategies and working patterns. In addition, workers can
use the information to aid in the planning of colleagues’ work as well. In other words,
as long as the information is shared, we can warn others about the hazards and
dangers associated with particular objects, situations or processes of which other
workers were possibly not aware.
The fourth stage of the model is act and maintain. This stage is essentially about
keeping doing the safe behaviours that were planned in the previous stages in order to
manage the hazards in the workplace. This can be considered in conjunction with the
processes occurring throughout the achievement of safe behaviour – look, speak
and listen (sharing information with others, and listening to the information they have
which can help your situation) and maintaining all the stages. When taken as a
whole, this sums up the modus operandi of situation awareness, as the
understanding of a situation is being repeatedly updated through continual situation
assessments (Endsley, 2000b). These situation assessments are the cognitive
processes of gaining and interpreting new knowledge, be it knowledge gained from
the worker's own senses, or information shared by colleagues. The new knowledge
can then be applied to feed into the representation that workers have of the current
situation, continually updating their awareness. This feedback loop is essential to
maintain accurate SA, as new information must be incorporated, as workplaces in
safety critical environments rarely remain static.
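One way to picture this feedback loop is as a monitoring cycle in which every new
situation assessment folds fresh cues into the worker's current representation. The
Python sketch below is purely illustrative; the cue names and the naive dictionary
update are invented for this example and are not a model of SA.

# Illustrative monitoring loop: each assessment merges new cues - from the
# worker's own senses or shared by colleagues - into the current picture.
# Cue names and values are invented.

def assess(current_picture, new_cues):
    # Newer information overrides stale entries in the representation.
    updated = dict(current_picture)
    updated.update(new_cues)
    return updated

picture = {"annulus pressure": "normal", "crane": "idle"}
incoming_assessments = [
    {"annulus pressure": "rising"},      # own instrument reading
    {"crane": "lifting over the deck"},  # warning shouted by a colleague
]

for cues in incoming_assessments:
    picture = assess(picture, cues)

print(picture)  # the continually refreshed representation on which SA depends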
The importance of good quality SA in maintaining a safe working environment can be
seen when we look at the human element root causes of accidents in safety critical
environments. As stated previously, lack of care and attention has been found as one
of the main reported human factors causes of accidents in the oil and gas industry
(Mearns, Flin, Fleming & Gordon, 1997; Flin, Mearns, O’Connor & Bryden, 2000).
Since attention is central to SA, this may provide a focus area for addressing some of
the main immediate causes of incidents; this should then be followed by a deeper
investigation of the reasons “why”, as discussed so far in the paper, in order to address
the underlying causes.
Errors in SA can occur at any of the three levels:
At the base level of SA (perception, or ‘sense’ in the Safe Behaviour model), errors
can easily occur, most notably arising from the reality that some element of the
environment has been discerned incorrectly, or perhaps even not at all. This may be
the fault of the individual concerned (i.e. he/she simply did not pay enough attention,
and so did not notice the event/action/object), or it may have been a characteristic of
the environment (e.g. an object was obscuring the particular feature that was missed
from view) (Endsley, 1995; 1999). While this means that there will be enhanced SA
for those elements of the environment to which attention was allocated, the resulting
SA for the entire situation will be poorer, as the representation that the worker holds
will not have incorporated information from elements which received little/no
attention. Prior expectations about how a particular situation should be can also have
an impact here. It may be that workers are so used to working in a particular
environment, doing a particular task, that the work is so routine that they become
complacent, feeling that they already know exactly how things should ‘turn out’. This
can easily lead to information being missed, as full attention is not being paid to the
situation.
Errors in Level 2 SA involve the failure of certain aspects of the situation to be
correctly understood in relation to the targeted goals, i.e. there is an error at the
‘know’ stage. For example, the information can be perceived accurately by those
concerned, but its importance or meaning to the overall goals desired is not
adequately appreciated (Endsley, 1999). This is often the case when a situation is
encountered for the first time, and an expectation of the state of affairs is not known.
The worker may therefore perceive all of the information available, but not recognise
the relevance of it, and so not integrate it properly into his/her awareness.
The final stage at which an error in SA can occur is at the level of projection (plan).
At this level, future occurrences need to be anticipated in order to decide upon the best
and most effective course of action to achieve the goals, deal with hazards, and
maintain a safe environment. The most common cause of failure of SA at this
level is simply that some people appear to be very poor at mental
projection (Klein, 1989b; Endsley, 1999). The prediction of future events and actions
is a task that places a great deal of demand on both the attentional and memory
systems of the brain, and so those with a poor mental projection will inevitably have
more trouble in this task. It may be that workers do not think ahead of their tasks and
only concern themselves with what is happening currently, thus do not project at all,
or alternatively do attempt to forecast situations, but the schemes that they develop do
not account for all possible eventualities.
Previous analyses of the causes of accidents have shed some light on the role of SA in
incidents. Recent accident analyses by Jones and Endsley (1995, 1996) support the
notion that SA is a major issue in aviation accidents. They investigated aviation
incidents, analysing data contained within the Aviation Safety Reporting System
(ASRS) database from January 1986 to May 1992, searching for the term ‘situational
awareness’, and found 143 incidents that met the criteria for the study (i.e. contained
enough information, cited problems with SA, and were submitted to the database by
either the pilot or controller involved). It was discovered that of the accidents
attributed to situation awareness error, 76.3% were categorised as perception errors,
20.3% were comprehension errors, and 3.4% were projection errors. Of the perception
errors, the majority were found to be due to the operator not attending to all the
required information, and thus not perceiving (or sensing) the information in the first
instance. A recent study by Sneddon, Mearns, Flin and Bryden (2004) also found a
very similar pattern in oil and gas exploration and production. They investigated a
multinational oil and gas exploration company’s accident database of drilling
incidents (all well operations, onshore and offshore) occurring within a ten-month
period, and found that where SA was the root cause (135 incidents), the incidents
were largely due to problems with perception. In parallel with Jones and Endsley’s
research, 66.7% of incidents were due to errors in perception (sense), and 20% were
found to be triggered by problems with comprehension (know). This research shows
that the vast majority of incidents appear to have their cause rooted in errors occurring
at the perception and comprehension levels, or the sense and know stages.
These results corroborate Wagenaar and Groeneweg’s (1987) study reported in Table
I, but there may be a difference related to the differences found in Table II. Workers
in the oil and gas industry may be much better trained to start with, having a better
knowledge and planning ability, so that their failures load much more heavily on the
initial sense component, as found by Sneddon et al (2004). It therefore appears that
people are essentially not even picking up on the information in the first instance, or
perhaps doing so but then misunderstanding its importance, or immediately
disregarding it, thus not taking it into account in their work practices. In order to
increase safe behaviour, the areas of sense and know must be focused upon to
facilitate improved perception and understanding of the hazard in the first instance.
Conclusion
While the SA framework provides a fundamental base for the Safe Behaviour model,
it is not the only theory that has been drawn upon in order to understand why people
behave safely or unsafely. Many other theories and models found in psychology play
an important role in explaining human behaviour in the workplace; nevertheless, the
model described here represents a much more unified approach, synthesising
knowledge garnered from the whole gamut of Psychology. There appears to be some
empirical evidence that the model is valid for more than just theoretical reasons, and
it also makes clear what kind of interventions can be used to increase safe
behaviours in hazardous but well defended systems.
References
Baddeley, A. D., & Hitch, G. J. (1974). Working memory. In G. H. Bower (Ed.), The
psychology of learning and motivation (Vol. 8). London: Academic Press.
Daniels, A. C., & Rosen, T. A. (2004). Performance management: Changing
behaviour that drives organizational effectiveness. Performance Management
Publishers.
Endsley, M. R. (1988). Situation Awareness Global Assessment Technique (SAGAT).
Paper presented at the Proceedings of the National Aerospace and Electronics
Conference (NAECON), New York.
Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems.
Human Factors, 37(1), 32-64.
Endsley, M. R. (1999). Situation awareness and human error: Designing to support
human performance, Proceedings of the High Consequence Systems Surety
Conference. Albuquerque, NM: Sandia National Laboratory.
Endsley, M. R. (2000a). Situation models: An avenue to the modeling of mental
models, Proceedings of the 14th Triennial Congress of the International Ergonomics
Association and the 44th Annual Meeting of the Human Factors and Ergonomics
Society. Santa Monica, CA: HFES.
Endsley, M. R. (2000b). Theoretical underpinnings of situation awareness: A critical
review. In M. R. Endsley & D. J. Garland (Eds.), Situation Awareness Analysis and
Measurement (pp. 3-33). Mahwah, NJ: Lawrence Erlbaum Associates.
Flin, R., Mearns, K., O'Connor, P., & Bryden, R. (2000). Measuring safety climate:
Identifying the common features. Safety Science, 34, 177-192.
Geller, E. S. (2001). The psychology of safety handbook. CRC Press.
Gigerenzer, G., & Selten, R. (Eds.) (2001). Bounded rationality: The adaptive toolbox.
Cambridge, MA: MIT Press.
Hoyos, C., (1995). Occupational safety: Progress in understanding basic aspects of
safe and unsafe behaviour. Applied Psychology: An International Review, 44, 233-250
Jones, D. G., & Endsley, M. R. (1995). Investigation of situation awareness errors.
Paper presented at the Eighth International Symposium on Aviation Psychology,
Columbus, OH.
Jones, D. G., & Endsley, M. R. (1996). Sources of situation awareness errors in
aviation. Aviation, Space and Environmental Medicine, 67(6), 507-512.
Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty:
Heuristics and biases. New York: Cambridge University Press.
Klimoski, R., & Mohammed, S. (1994). Team mental model: Construct or metaphor?
Journal of Management, 20(2), 403-437.
La Berge, D. (1995). Attentional processing: The brain's art of mindfulness.
Cambridge, MA: Harvard University Press.
Lawton, R. (1998). Not working to rule: Understanding procedural violations at work.
Safety Science, 28, 77-95.
Mearns, K., Flin, R., Fleming, M., Gordon, R. (1997). Human and organisational
factors in offshore safety ( OTH 543). Suffolk: HSE Books.
Orasanu, J., Dismukes, R. K., & Fischer, U. (1993). Decision errors in the cockpit.
Paper presented at the Proceedings of the Human Factors and Ergonomics Society
37th Annual Meeting, Santa Monica, CA.
Rasmussen, J. (1983). Skills, rules and knowledge: Signals, signs and symbols, and
other distinctions in human performance models. IEEE Transactions on Systems,
Man and Cybernetics, 3, 257-268.
Rundmo, T. (1995). Perceived risk, safety status, and job stress among injured and
non-injured employees on offshore petroleum installations. Journal of Safety
Research, 26(2), 87-97.
Rundmo, T., Hestad, H., & Ulleberg, P. (1998). Organisational factors, safety
attitudes and workload among offshore oil personnel. Safety Science, 29, 75-87.
Sarter, N. B., & Woods, D. D. (1991). Situation awareness: A critical but ill-defined
phenomenon. The International Journal of Aviation Psychology, 1(1), 45-57.
Shell International Exploration and Production (2003) Working Safely. Energy
Institute. London.
http://www.energyinst.org.uk/heartsandminds/worksafe.cfm
Shell International Exploration and Production (2005) Driving for Excellence. Energy
Institute. London
http://www.energyinst.org.uk/heartsandminds/driving.cfm
Shiffrin, R. M., & Schneider, W. (1977). Controlled and automatic human
information processing: II. Perceptual learning, automatic attending, and a general
theory. Psychological Review, 84, 127-190.
Sneddon, A., Mearns, K., Flin, R., & Bryden, R. (2004). Safety and situation
awareness in offshore crews, Proceedings of The Seventh SPE International
Conference on Health, Safety, and Environment in Oil and Gas Exploration and
Production. Richardson, TX: SPE.
Stanton, N. A., Chambers, P. R. G., & Piggott, J. (2001). Situational awareness and
safety. Safety Science, 39(3), 189-204.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and
biases. Science, 185, 1124-1131.
Vuijk, M. (2004) Safe behaviour at work. Leiden University.
Wagenaar, W.A., & Groeneweg, J. (1987). Accidents at sea: Multiple causes and
impossible consequences. International Journal of Man-Machine Studies, 27, 587-598.
Wagenaar, W.A. & Hudson, P.T.W. (1998) Industrial Safety. In P.J.D.Drenth,
H.Thierry & C.J.de Wolff (Eds.) Handbook of Work and Organizational Psychology.
Psychology Press, Hove, England. pp 65-88
Wagenaar, W.A., Hudson, P.T.W., & Reason, J.T. (1990). Cognitive failures and
accidents. Applied Cognitive Psychology, 4, 273-294