MITIGATING COGNITIVE BIASES IN FIRE ENGINEERING
Michael J. Kinsey*, Arup, China
Max Kinateder, National Research Council Canada, Canada
Steven M.V Gwynne, Movement Strategies, UK
Danny Hopkin, OFR Consultants, UK
Fire engineering has developed into a mainstream engineering discipline within the building design
process. Building fire codes are increasingly complex, comprising thousands of requirements regarding
a wide range of topics that must be considered. Fire engineers are required to possess increasingly
complex knowledge about a variety of subjects, along with expertise in their application. This has been
magnified with the proliferation of performance-based methods using a range of computational tools.
This, coupled with increased project performance pressures, raises the potential for errors in judgement.
Errors in judgement may be caused by limitations in a given resource (e.g. time, information available,
knowledge, etc) and/or neglect/over-focus on specific information (at the expense of other and more
relevant information) through cognitive biases. This paper initially provides a broad overview of general
decision-making including the use of heuristics and cognitive biases. Examples of cognitive biases are
presented which may be linked to errors in fire engineer decision-making. The study considers several
fire engineering decision contexts where cognitive biases may exist which are associated with fire code
application, modelling/calculations, probabilistic risk assessments, general fire engineering practice,
and perceptions based on experience. Potential measures to mitigate some of these biases and prompt
better decision-making are discussed. Those that may benefit from awareness of such biases and
mitigation measures include not only practicing fire engineers, but also building developers, fire code
committees, evacuation/fire/structural fire modelling developers, approving authorities, and fire
1. INTRODUCTION
The principal role of a fire engineer is to use scientific and engineering principles, fire codes and
expert judgment, based on an understanding of fire, building, and people to ensure a building design
provides a sufficient level of fire safety. Fire engineers may be involved in a broad set of tasks requiring
a range of knowledge and highly specialised expertise. Although a fire engineer may possess
prerequisite expertise, there are many possible constraints one might need to consider that might affect
their performance. For instance, a fire engineer may be required to develop the necessary
skills/knowledge or be supported by other fire engineers (who already have the necessary
skills/knowledge) on a project in order to complete a given task. Indeed, where a fire engineer is lacking
in-depth technical expertise or experience about a given subject, they may rely on the fire codes for
guidance, potentially without a complete awareness of any underlying basis or assumptions for the
guidance. There may also be limited time in which a fire engineer is required to conduct a task in order
to meet project deadlines or limited available people to work on a given task. Typically, many of the
tasks have some level of associated uncertainty that may interact in non-additive ways. There may be
multiple ways to complete a given task and potentially multiple suitable outcomes, all of which a fire
engineer must contend with when assessing suitability. Assuming the possession of the necessary
knowledge and technical expertise to conduct such tasks, fire engineers may still make mistakes, have lapses
in judgement, or make inconsistent assumptions, all of which can be caused by cognitive biases. This may
also be compounded by the fire engineer being subject to routine personal issues including boredom,
fatigue, illness, situational and interpersonal distractions, all of which can affect their ability to
consistently perform and make suitable decisions. Furthermore, with fires being rare events (which
may in part be attributed to preventative measures defined by fire engineers), there is a lack of feedback
in the building design process regarding failures in methods or decision-making within the fire
engineering field. Most designed buildings will never have a fire, or if they do, it will not be serious
enough to truly test the full range of fire safety features. Such feedback is critical to enhancing the
decision-making process within fire engineering. Indeed, Van Coile et al. note that the “collective
experience of the profession” (i.e. guidance developed via consensus through code/standards
committees which represent the views of the profession etc) forms the foundation upon which most
traditional fire engineering building designs are accepted. This is contrary to many other types of
building engineering which are routinely put to the test in normal usage, demonstrating the level of their performance.
The paper initially provides an overview of general decision-making and cognitive biases theory.
Potential cognitive biases which may exist within fire engineering are then posited. A range of potential
cognitive bias mitigation measures are then proposed. This paper is intended to identify some means of
reducing potential sources of mistakes and lapses in decision-making which fire engineers may exhibit
through cognitive biases within the building design process. Due to the multi-disciplinary nature of the
subject matter, the authors represent research and commercial engineering practice, including two
practising fire engineering consultants and researchers in psychology and human behaviour in fire.
2. DECISION-MAKING AND COGNITIVE BIASES
Research in decision science proposes that decision-making can be thought of as if employing two
separate, though interacting, systems of thinking: an automatic system and a reflective system [3-7].
The automatic system occurs without a person being directly aware of it. It is used for most decisions,
is relatively quick, and is based on a person recognising given situations or types of situation and
associating it with a stored response – effectively pattern matching the scenario with stored examples
and then identifying associated stored responses. Commonly it is adopted in familiar situations or where
a quick response is required; i.e. where conscious decision-making processes are overly complex and/or
would not be quick enough. This system is used to generate impressions, feelings, and inclinations. It
is possible to train the automatic system to identify suitable responses when given patterns in a
context/situation are identified (e.g. in the form of heuristics, see below). Such training may be received
formally (e.g., in the classroom), or may be developed over time through repeated experience (e.g.,
seasoned firefighting knowledge through repeatedly attending fire incidents). The automatic system
distinguishes the surprising from the normal, and influences expectation in a given context. It allows
experts (e.g., firefighters, airline pilots, etc.), to competently operate in complex situations under time
pressure. Where information is lacking, the automatic system extrapolates based on plausible options
without factual input, potentially giving the illusion of validity to a decision.
The reflective system operates consciously and allows people to address situations in a more
deliberative manner. It is more effortful to use and relatively slow, and its capacity to process
information is limited by working memory. In comparison to the automatic system, it requires more
information, cognitive resources, and focus to complete the decision-making process.
Commonly, it is adopted in unfamiliar situations (e.g.,
wayfinding in an unfamiliar building, becoming aware of unexpected changes in an environment, etc.),
where a task requires reflection or requires a systematic process (e.g., solving/approximating a long
mathematical calculation, identifying the appropriate strategy to attack a fire in a complex scenario,
etc.), and/or where time is not considered a limiting factor in a decision [3-7]. Neither the automatic
nor the reflective system is guaranteed to produce correct answers. The key difference between
the automatic and reflective systems is that the latter occurs consciously and requires the use of working
memory, whereas the former occurs nonconsciously and does not require working memory [6,7].
Irrespective of which system of decision-making is primarily used, the process is bounded in terms of
information available, time available, and an individual’s mental resources to process such information.
To compensate for such limitations and manage uncertainty or complexity with a decision, people may
employ heuristics in the decision-making process.
Heuristics are defined as relatively simple rules of thumb which can be applied to complex decisions
that can result in a suitable response. Such rules of thumb may be developed by creating associations
between a given situation and past experience of a learned response, which results in the adoption of a
given action based on the association, shortcutting the reflective aspects of decision-making. The
development of a rule of thumb involves substituting the decision being made for a similar simpler
decision which can be used to produce an outcome. It may also be possible to deliberately prime an
individual to adopt heuristics through training to enable speedy responses to a situation; however, the
heuristics may not be appropriate if used outside of the intended set of scenarios, placing greater
importance on the pattern matching capability of the individual and the quality of the information
available. Heuristics are extremely common and can be used in a variety of situations. For instance,
experienced fire incident commanders are capable of making quick and relevant decisions during
a fire incident under time pressure. They do this through the use of a class of heuristics called
recognition-primed decision-making. During a fire incident, a fire commander's memory of previously
experienced incidents is automatically triggered, matching a given set of criteria of the current fire
incident with the learned appropriate response pattern. This type of decision-making process may be
labelled as ‘expert intuition’, which in part demonstrates the nonconscious manner in which it occurs. However,
if heuristics are employed inappropriately (e.g. due to incomplete information) systematic errors can
inadvertently be produced. This can be made worse through the existence of cognitive biases, where
information is inappropriately processed or overly focused upon at the expense of more relevant
information in the decision-making process. The adoption of heuristics based on partial or incomplete
information (leading to mismatches between scenario and response) may occur in the following
situations: where a person has no previous experience of making a decision in a particular situation;
there is no learned response (e.g. a person has not learned or been trained what to do in a given situation);
or in novel situations with limited resources at hand (e.g., time, mental capacity, information, etc.).
Cognitive biases can occur in a wide variety of scenarios, ranging from economic/political forecasting
to fire evacuations. The extent to which a cognitive bias may negatively impact a decision
(whether it is automatic, reflective, and/or based on a heuristic) can also vary widely from having no
consequential impact to having life threatening implications.
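The pattern-matching mechanism described above, and the way a partial match can produce a plausible but unsuitable response, can be sketched in a few lines of Python. The scenario features and stored responses below are purely illustrative assumptions, not drawn from any real incident command doctrine:

```python
# Hypothetical sketch: heuristic decision-making as pattern matching.
# Scenario features are matched against stored experience; when the match
# is only partial, a "learned" response may be applied inappropriately.

STORED_RESPONSES = {
    # (occupancy, smoke_visible): response learned from past incidents
    ("office", True): "initiate phased evacuation",
    ("office", False): "investigate alarm source",
    ("warehouse", True): "initiate full evacuation",
}

def heuristic_response(occupancy: str, smoke_visible: bool) -> str:
    """Return the stored response for a recognised pattern, or fall back
    to the closest partial match: the point at which systematic errors
    can creep in when the current scenario differs from past experience."""
    key = (occupancy, smoke_visible)
    if key in STORED_RESPONSES:
        return STORED_RESPONSES[key]
    # Partial match on occupancy only: a plausible-looking but potentially
    # unsuitable shortcut (cf. the "illusion of validity" noted above).
    for (stored_occupancy, _), response in STORED_RESPONSES.items():
        if stored_occupancy == occupancy:
            return response
    return "escalate to reflective assessment"  # no pattern recognised

print(heuristic_response("office", True))      # exact pattern match
print(heuristic_response("warehouse", False))  # partial match, may be wrong
print(heuristic_response("atrium", True))      # no match, defer
```

The final fallback stands in for handing the decision over to the slower reflective system when no stored pattern applies.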
3. FIRE ENGINEERING COGNITIVE BIASES
The following section includes potential cognitive biases with examples which may exist within
fire engineering during the building design process. The list is derived from biases identified in general
decision science literature and applied to fire engineering activities through identification of different
cognitive bias types which are listed in Table 1.
Table 1: Cognitive Bias Types
Cognitive Bias Type
Action bias 
The tendency to think that value can only be realized
through action as opposed to practicing restraint.
Anchoring bias
The tendency to rely too heavily, or "anchor", on a past
reference or on one trait or piece of information when
making a decision.
Authority bias [7,12]
The tendency to do (or believe) things that a person in an
authoritative position states, believes, or does.
Availability bias [7,12]
The tendency to focus on the most salient or most easily
recalled information when making a decision.
Clustering illusion 
The tendency to erroneously consider the inevitable
"streaks" or "clusters" arising in small samples from
random distributions to be non-random; e.g. mistaking
correlation for causation.
Confirmation bias 
The tendency to search for or interpret information in a
way that confirms one's preconceptions or desires.
Default bias 
The tendency to follow a default option.
Familiarity bias [12,13]
The tendency to favour something which is familiar over
something which is not.
Halo effect
The tendency for another person's/organization's perceived
positive traits to "spill over" from one area to another.
Hindsight bias 
The tendency to perceive events that have already
occurred as having been more predictable than they
actually were before the events took place.
Incentive bias 
The tendency to make a given decision based on an
incentive derived from an individual’s own interests.
Optimism bias 
The tendency to underestimate the likelihood of
experiencing a negative event.
Probability neglect 
The tendency to disregard the likelihood/probability of
an outcome when making a decision.
Risk aversion bias 
The tendency to favour a more certain outcome over one
which involves a higher level of uncertainty.
Status quo bias 
The tendency to favour the current state of affairs or agree
with the accepted decision of a group.
Sunk cost bias 
The tendency to continue a behaviour as a result of
previously invested time, money, or effort.
Survivorship bias [16,17]
The tendency to concentrate on the things which have
survived or are successful in achieving a given outcome
whilst overlooking those which did not.
Based on the cognitive bias types listed above, examples of cognitive biases which may exist within
fire engineering are proposed in Table 2. Unless explicitly stated, all cognitive bias examples are those
exhibited by a fire engineer in a given decision-making context. The titles of the cognitive biases (which
are specific to fire engineering) are proposed for each example in bold. These titles were chosen to
closely follow the associated cognitive bias type to assist with identification. However, in some
instances the titles do deviate slightly from the cognitive bias type to be more descriptive and fire
engineering focussed. The cognitive biases are categorised according to the elements of the fire
engineering process in which they might affect decision-making, namely:
• Fire codes (i.e. how fire codes are interpreted and applied)
• Modelling/calculations (i.e. how fire/evacuation/structural models are used and results are
interpreted)
• Probabilistic risk assessments (i.e. how probabilistic fire risk assessments are conducted and
interpreted)
• General fire engineering practice (i.e. how decisions within the general fire engineering
processes may be influenced)
• Perceptions based on experience (i.e. how perceptions of a fire engineer may be influenced
by past experience)
The list of cognitive bias examples is not exhaustive and not based on empirical data specific
to fire engineering but is instead based on a broader context; i.e. deriving parallels between
general biases identified within decision science literature and experience within fire safety
engineering practice by the authors. The example biases have been identified and refined based on
communication with practicing fire engineers and other professionals in the field (predominantly
colleagues of the authors) to provide additional confidence in their existence and associated
definitions. The extent to which the cognitive biases may influence an eventual building design
may vary significantly and is likely to depend on the nature of the project and the fire engineer
involved. As such, the extent to which the cognitive biases exist, how common they are, or their
influencing factors, is unclear. However, even the potential for their existence warrants
attention given the safety-critical nature of a fire engineer's work.
Table 2: Postulated Fire Engineering Cognitive Biases
Specific Bias Example
Application to Fire Engineering
Fire Code Biases
Code Familiarity Bias: Believing that a building designed in accordance
with a familiar prescriptive fire code is safer than one designed using less
familiar performance-based methods.
Code Authority Bias: Believing that a building designed in accordance
with fire codes is safe due to being written by an authoritative governing
body (“The building is safe because it is code compliant.”) without
questioning the suitability of the underlying assumptions for a given building or context.
Risk aversion bias
Code Risk Aversion Bias: Choosing to follow fire codes due to a belief
that it will reduce financial/ legal/ approval risks.
Default Value Bias: Using default values within a model / calculation
without questioning the suitability or underlying assumptions.
Default Tool/Model/Method Selection Bias: Using a given
tool/model/method because an organisation, individual, or approving
authority has an existing tradition/affiliation/experience/obligation to use
it, irrespective of the suitability of the tool/model/method for a specific task or project.
Model Familiarity Bias: Conducting a modelling analysis using a
familiar model (e.g. have used on a past project, was taught at university,
etc.) without consideration of using an alternate model which may be more suitable.
Probabilistic Risk Assessment Biases
Score Anchoring Bias: Selecting/Revising the frequency/severity scores
within a probabilistic risk assessment which is similar/the same as a past
assessment/scenario/original value without suitable consideration of the
specific differences between each assessment/scenario.
Past Incident Score Bias: Overestimation of the likelihood and/or
severity of a fire scenario due to a recent major fire incident being salient
in memory (i.e. more easily ‘available’ in memory).
Probability Neglect Score Bias: Overly focusing an analysis or
mitigating measures based on the consequence of an event, and giving
no/little consideration to the probability of it occurring.
General Fire Engineering Practice Biases
Physical System Bias: Focusing on the specification/design of physical
fire systems in a building design whilst paying little/no attention to non-
physical systems e.g. management procedures, evacuee behaviour, etc.
Past Incident Availability Bias: Overly focusing on assessing a given
aspect of the building design whilst neglecting other parts of the building
design due to it being a major contributing factor in a recent major fire
incident (i.e. it being more easily ‘available’ in memory).
Engineer Financial/Time Incentive Bias: Conducting an assessment of
a building fire safety design with less consideration of negative fire safety
issues given project financial/time incentives or constraints.
Fire Safety Confirmation Bias: Favouring information which supports
one's assumptions about a building’s level of fire safety whilst paying less
attention to the reasons contradicting these assumptions.
Practitioner Halo Effect: Being inclined to trust the work of a given fire
practitioner due to having past positive experience in an unrelated
project, leading to less scrutiny of their future work.
Chartered/Registered Fire Engineer Halo Effect: An approving
authority being more likely to trust the opinion of a fire engineer who is
chartered/registered, giving little/no consideration to the specific
experience or capability of the engineer to perform a given item of work
and/or leading to less scrutiny of their work.
Staff Authority Bias: Being more likely to trust information from a
senior member of staff due to them being in a position of authority, without
scrutinising the validity of the information.
Status quo bias
Meeting/Group Status Quo Bias: Tendency to adhere to the current or
accepted views of others in a meeting or group despite having a differing
opinion and/or giving little/no consideration to the technical basis of the accepted view.
Project Planning Optimism Bias: Being inclined to think a future fire
engineering project which is being bid for will not overrun in cost/time
while giving insufficient consideration to aspects of the project which
may cause increased cost/time; e.g. decreased performance, unexpected
events which cause delays, etc.
Probability Neglect Assessment Bias: Disregarding the likelihood of a
fire occurring and overly focusing on the outcome of the fire.
Sunk cost bias
Engineering Sunk Cost Bias: Continuing to conduct an item of work or
following the same methodology which is clearly not working/failing
due to having already heavily invested/committed in terms of time, money, or effort.
Perceptions Based on Experience Biases
Experience/ Research Bias: Overly focusing on the fire safety aspects
of a design which an individual is familiar with, has done past research on, or has
significant experience of (i.e. it being more easily ‘available’ in memory).
Incident Hindsight Bias: Believing a past fire incident was foreseeable
and preventable based on post-incident fire analysis, neglecting
information which was not known or unlikely to be known by the
relevant stakeholders prior to the incident.
Incident Clustering Illusion: Identifying a false pattern of failure from
a series of past fires given that the incidents share a particular attribute
(e.g. building type), ignoring the impact and interaction of other factors
present in each building, unduly simplifying the assessment of the
incidents and exaggerating the impact of the attribute highlighted.
Incident Action Bias: Believing that action must be taken following a
major fire incident (e.g. updates to the fire code, changes in legislation,
etc.), giving little/no consideration to practicing restraint or appreciating
that the incident was extremely rare and very unlikely to occur again.
4. COGNITIVE BIAS MITIGATION MEASURES
Based on the fire engineering biases identified in the previous section, a series of mitigation
measures are now proposed to reduce the likelihood of such biases occurring in fire engineering and
to reduce their impact. These have been grouped into the following categories: (1) promoting bias
awareness and conscious decision-making; (2) providing incentives; (3) developing nudges, and (4)
developing procedural methods. These are derived from cognitive bias mitigation measures identified
in psychological literature with applications outside of fire engineering. The measures are discussed in
terms of potential effectiveness, applicability (which biases it can potentially mitigate), method of
implementation, and cost (time/money/resources involved with setting them up). No studies or
empirical data have been collected regarding the extent to which such measures may reduce the likelihood
or impact of the cognitive biases specifically within fire engineering. As such, the extent to which these
measures may influence cognitive biases within fire engineering requires further investigation. It should also be
highlighted the proposed methods are not exhaustive, may not be appropriate/possible to apply in
certain situations, and may only be possible to implement by select people/organisations. Furthermore, it
may be possible to classify certain methods into more than one category.
4.1 PROMOTING BIAS AWARENESS AND CONSCIOUS DECISION-MAKING
Being aware of cognitive biases and promoting conscious awareness of the steps involved in an
explicit thought process when making a decision (i.e., shifting the decision-making process from the
automatic system to the reflective system) has been shown to mitigate cognitive biases and poor
decision-making in general [21,22]. Klein broadly discusses the features of improving conscious
decision-making as including:
⚫ Awareness of the requirements of the learning process
⚫ Recognition of limitations of memory
⚫ Ability to appreciate perspective
⚫ Capacity for self-critique
⚫ Ability to select strategies
Promoting awareness of cognitive biases within fire engineering could provide several benefits within
the decision-making process. Fire engineers with effective decision-making skills would be aware of
common mistakes caused by cognitive biases and can also learn from past decisions where cognitive
biases may have occurred. They could recognize cognitive biases, for example in the design process
and appreciate different perspectives which may provide insight for a given decision. They would be
able to critique their own decisions to identify cognitive biases and adopt appropriate strategies for
complex decisions in order to mitigate them. The following are potential measures to promote
awareness and conscious decision-making in order to mitigate cognitive biases within fire engineering:
⚫ Training: Providing training to communicate the existence of cognitive biases in fire
engineering and associated mitigation measures for making key decisions within the fire
engineering process. There are a large number of cognitive biases which might impact
a wide range of fire engineering decisions as highlighted in the previous section. It is
currently difficult to estimate the effect size of an individual bias or the combined
impact of several interacting biases. As such, it is perhaps unreasonable to require an
individual to remember all cognitive biases as part of any training. Instead, the training
should focus on promoting broad awareness of cognitive biases and a reflective
approach to decision-making to mitigate against cognitive biases occurring.
⚫ Delayed response: Leaving more time to think about a decision reduces reliance on the
automatic system of thinking, which can result in overly quick decisions being made that
may later need to be reversed or justified post hoc. For example, defer providing feedback
regarding a fire engineering query/complex decision until after a meeting rather than giving
a response in the meeting itself.
⚫ Verbalizing: Talking out loud or with others about why a given decision is being made may
promote conscious decision-making through the use of the reflective system.
⚫ Cognitive bias checklist: Adopting the use of cognitive bias checklists within a decision-
making process, e.g. evacuation modelling, fire strategy report development, etc., to ensure
they are not overlooked.
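As an illustration, a minimal checklist of this kind could be represented in software and run before a deliverable is signed off. The bias names echo Table 2, but the prompts, function name, and structure below are hypothetical:

```python
# Hypothetical cognitive bias checklist, run before sign-off of a fire
# engineering deliverable (e.g. an evacuation modelling report).
# Bias names follow Table 2; the prompts are illustrative only.

CHECKLIST = [
    ("Default Value Bias",
     "Have all model default values been reviewed and justified?"),
    ("Model Familiarity Bias",
     "Were alternative tools/models considered for this task?"),
    ("Fire Safety Confirmation Bias",
     "Has evidence contradicting the design assumptions been sought?"),
    ("Probability Neglect Assessment Bias",
     "Have both likelihood and consequence been considered?"),
]

def run_checklist(answers):
    """Return the biases whose prompts were not confirmed (answered False
    or left unanswered), flagging them for explicit reflective review."""
    return [bias for bias, _prompt in CHECKLIST
            if not answers.get(bias, False)]

# Example: two prompts confirmed, two outstanding and flagged for review.
outstanding = run_checklist({
    "Default Value Bias": True,
    "Probability Neglect Assessment Bias": True,
})
print(outstanding)
```

The point of the sketch is that unanswered items default to "not confirmed", so a bias cannot be silently skipped.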
4.2 PROVIDING INCENTIVES
Incentives refer to methods encouraging active thinking about a given problem through reward or
punishment. Providing incentives has been shown to promote better decision-making and potentially
reduce cognitive biases in certain settings. Generally, the value of the incentive increases with the
associated value of identifying/correcting a given decision. Potential incentives to promote better
decision-making and mitigate cognitive biases within the fire engineering process may include:
⚫ Rewards: Providing financial rewards (e.g. bonus) or annual leave (e.g. leave early on a Friday)
when significant errors are identified in a project (e.g. “good catch reward”).
⚫ Token incentives for groups: Grouping people into teams and providing points to people in
each team when errors are identified in projects. The team with the most points at the end of a
given time period is rewarded.
⚫ Recognition: Celebrating projects which went well and also highlighting projects which
did not go well, along with lessons learnt, through internal company newsletters.
In effect, these activities are intended to modify the working culture in given contexts such that
problems should be highlighted and seen as a learning process, rather than hidden or seen as a failure.
A key aspect of any incentive system within fire engineering is to clearly define what constitutes errors
in judgement in a given decision-making process. For fire engineering the incentives could be classified
according to the biases associated with given tasks shown in Table 2. It is also important to define a
standard relative measure of incentives in terms of how much incentive is provided for identification of
a given bias: the larger the bias and subsequent consequence, the larger the incentive should be. The
specification of the incentive should be provided before or at the beginning of a project/item of
work/working contract so that all stakeholders are informed and, ideally, included in formally agreeing
to the incentives.
4.3 DEVELOPING NUDGES
Nudges represent small, simple, nonconscious and typically low-cost methods to (a) guide (increase
the likelihood of) a positive decision, rather than dictate the response; and (b) mitigate potential biases
[24, 30]. Nudges would typically be adopted where fire engineers make decisions using the automatic
system for relatively small or routine tasks, though they can also be applied to decisions using the
reflective system or to larger tasks. Potential nudges which may be adopted within fire engineering to promote better
decision-making and mitigate cognitive biases are listed below:
⚫ Changing modelling defaults: Changing the default values in fire/evacuation/structural fire
models (which would be used each time the modelling software is opened unless previously
edited) to being highly conservative to encourage users to change the values to be more realistic
and subsequently justify their selection – and consciously be aware of the new settings adopted.
If users do not change the defaults, then the simulation will be more conservative so less likely
to present a potential safety risk. This nudge would only be possible by model developers
or where a model allows users to change default values.
⚫ Automated checkpoints in software tools: Whenever a software user is required to make a
decision, these should be made explicit and made salient (e.g. pop-up window asking: “Do you
want to use the default values?”). This nudge would only be possible by model developers or
where an organisation has access to software tool source code.
⚫ Commitment devices: A commitment device is a way to oblige an individual into a given
action that they might not want to do but is good for them. Potential commitment devices
which may be adopted within fire engineering to promote better decision-making and mitigate
cognitive biases are listed below:
o Create larger obstacles to temptations than the temptation itself:
▪ Pay fire engineering fees upfront before a project commences in order to
mitigate against financial bias.
▪ Enforce minimum times to answer queries or to complete a project to
mitigate against not having enough time to fully assess or complete a detailed
analysis for a query/project.
o Have the fire engineer make a publicized commitment to an action so reputation
could be affected:
▪ Conduct internal ‘third party’ reviews within an organization of a fire
engineering analysis with any mistakes anonymously shared with the wider organization.
▪ Make all fire strategy documentation (documentation which fire engineers
prepare as part of the building design process which describe the fire safety
features of the building and how they are deemed to provide a sufficient level
of fire safety) subject to public scrutiny.
▪ Publishing thought leadership / good practice articles in fire engineering trade
magazines/journals for scrutiny or endorsement. These can be used as
evidence in a review if a fire engineer does not uphold the published good practice.
▪ Create a structure whereby professional registration is mandatory and provide
a mechanism for poor practice / incidents to be anonymously reported. The
reporting could then have implications for the renewing of licensing if found
to be valid. This would need to be orchestrated by societies/institutions which
authorize/issue professional fire engineer registrations in a given country e.g.
Society of Fire Protection Engineers (SFPE), Institution of Fire Engineers (IFE), etc.
▪ Develop and encourage participation in fire engineering community groups
where common practice (good and bad) is better recognized.
o Make a monetary contract to increase the benefit of keeping a promise:
▪ Include a bonus fee for a fire engineering project in a holding account which
is only released a given number of years after the building is built and shown
to present no fire safety issue inherent in the design.
⚫ Reframing Information: Reframing refers to changing the method by which information is
communicated between people to promote better decision-making and mitigate against
cognitive biases/misinterpretation. Potential methods of reframing within fire engineering
which could promote better decision-making and mitigate cognitive biases are listed below:
o Presenting all high fire risk aspects of a fire strategy nearer the beginning of a report to
ensure they are noticed by the relevant stakeholders. The development of standard
industry-wide reporting guidance for such information by fire engineering
societies/institutions within each country would be advantageous.
o Considering severe but unlikely fire/evacuation scenarios which could cause the fire
strategy to fail (i.e. cause loss of life), in order to present the limitations of a given fire
strategy.
Since nudges are voluntary, the cost involved in their use is typically low. They can also be applied to
a wide variety of decision types. Once a specific nudge has been developed, it is imperative that it be
tested first on a sample of individuals to ensure it has the desired effect before being implemented on
a large scale; i.e. across projects or organizations.
Procedural measures are those which are adopted as part of standard practice within an organisation to
carry out specific processes within fire engineering in order to mitigate cognitive biases. These can
potentially represent a wide range of activities involved in checking, communicating with peers,
providing feedback, or consistently performing a well-defined task given a decision under specified
conditions. Indeed, a number of the measures previously categorized in the preceding sections could
also be classified as procedural measures. Potential other procedural measures which may be adopted
within fire engineering to promote better decision-making and mitigate cognitive biases are listed below:
⚫ Modelling checklists: Adopting the use of quality assurance checklists within
evacuation/fire/structural fire modelling to ensure aspects of the modelling process are not
overlooked. As part of this it is important that the user establishes how each requirement
has been met, rather than simply signifying that it has. The checks might therefore be
posed in the form of questions which a user must answer, as opposed to simple Boolean
checkboxes.
⚫ Encourage the questioning of senior staff: After a senior member of staff has stated a
decision, requesting junior members of staff to test the approach, suggest alternatives, and
‘game’ how the approaches could go wrong. This creates a situation that prompts an
individual to think about an issue, makes it socially acceptable for criticism to be given, and
(hopefully) builds confidence in senior staff, who may otherwise veer towards a more
standard approach without such input.
⚫ Peer review input/checking: Requiring all fire engineering documentation to be initially
checked by a peer (e.g. a project manager), with specific feedback required during the process
– as opposed to only at the end of the process.
⚫ Fire code justification: Requiring a fire engineer to justify why adopting a prescriptive fire
code compliant aspect of design is suitable and under what conditions it may not be suitable.
⚫ Third party fire strategy reviews: Requiring anonymised third-party reviews of fire strategy
documentation which is submitted to approving authorities. It should be noted that this already
exists within the fire engineering field but is not common practice for all projects, is typically
limited to projects where a certain aspect of the fire strategy contains specialised analysis e.g.
fire/evacuation/structural fire modelling, and is typically not anonymous.
⚫ Mandatory fire/evacuation modelling for large projects: Requiring all projects which
involve a large number of people to include mandatory fire/evacuation modelling assessments
to demonstrate all people can evacuate before untenable conditions occur rather than it be
assumed this is the case through compliance with prescriptive fire codes.
⚫ Mandatory probabilistic fire risk assessment for uncommon buildings: Where a building
is defined as being ‘uncommon’ (i.e. an aspect(s) of the design is such that standard fire
design guidance is not fundamentally appropriate), requiring demonstration that the
probability and severity of a fire are both tolerable and as low as reasonably practicable.
⚫ Pre-mortem meetings: The concept of pre-mortem meetings was developed by Klein.
Its broad principle is that when an important decision is being made, a meeting is held with
the relevant stakeholders. Within the meeting, a hypothetical situation is proposed where the
desired decision was selected, but after some time was found to be a total disaster. The people
in the meeting must then discuss the potential reasons why the choice was a disaster. After
which they then decide how the disaster would occur by identifying the potential causes and
how they might be mitigated. The group then decide if the choice should still be taken. Posing
such a hypothetical situation makes it more acceptable to freely discuss negative aspects of a
decision by reducing whereby everyone just states positive views to support the group. Pre-
mortem meetings could be adopted in a wide range of important fire engineering decisions
including the following:
o Deciding to bid on a project (e.g. identifying the most likely tasks that may involve
costs or time overruns).
o Deciding to use performance-based analysis to support a noncompliance with
prescriptive fire codes.
o Deciding whether to use a given material in the building design which could
significantly influence a fire.
⚫ Initial assessment: Independently deriving potential approaches to a fire engineering problem
before consulting others to mitigate against anchoring to their views. Similarly, when
consulting the opinion of others about a fire engineering decision, avoiding discussing your
proposed solution or asking leading questions until they have given their thoughts, in order to
mitigate against anchoring their views.
⚫ ‘What if’ assessment: A systematic consideration of the failure of each fire safety system(s) /
provision(s) and assessment of the consequences, to demonstrate a design is not unduly
sensitive to a single/small number of system failures before unacceptable levels of severity
occur during a fire. This is to mitigate against unlikely but high-consequence fire scenarios
being given little/no consideration.
⚫ Devil’s advocate project review: Conduct project reviews whereby a fire strategy or analysis
is presented to colleagues who are not involved in the project to get general feedback. Some
of the colleagues are tasked with deliberately trying to find problems with the fire strategy (i.e.
they play Devil’s Advocate) to mitigate against status quo bias.
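Several of the measures above lend themselves to simple tool support. For instance, the question-based modelling checklist could be sketched as below; the questions, and the minimum answer length used to distinguish a substantive justification from a tick, are illustrative assumptions only, not drawn from any existing checklist:

```python
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    """A check posed as a question requiring a written justification,
    rather than a Boolean tick-box."""
    question: str
    answer: str = ""

    def is_complete(self) -> bool:
        # Require a non-trivial written answer (the length threshold is
        # an arbitrary illustrative choice).
        return len(self.answer.strip()) >= 20

# Illustrative questions only; a real checklist would be model-specific.
checklist = [
    ChecklistItem("How was the grid/mesh resolution shown to be adequate?"),
    ChecklistItem("Which default input values were changed, and why?"),
    ChecklistItem("How were the design fire/evacuation scenarios justified?"),
]

def outstanding_items(items):
    """Return the questions that still lack a substantive answer."""
    return [i.question for i in items if not i.is_complete()]

checklist[0].answer = "A sensitivity study halving the cell size changed results by <5%."
for question in outstanding_items(checklist):
    print("outstanding:", question)
```

Forcing a written answer rather than a checkbox makes the reviewer's reasoning auditable, which is the point of the measure.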
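The ‘what if’ assessment above could likewise be made systematic in software. The sketch below enumerates failures of each fire safety provision, singly and in pairs, and flags combinations that drop the design below an acceptance threshold; the per-system contribution scores and the threshold are hypothetical assumptions for illustration only:

```python
from itertools import combinations

# Hypothetical contribution of each fire safety provision to keeping
# conditions tenable (the values and threshold are illustrative only).
systems = {"sprinklers": 0.5, "detection/alarm": 0.2,
           "smoke control": 0.2, "compartmentation": 0.1}

ACCEPTABLE = 0.5  # minimum combined contribution deemed acceptable here

def residual(failed):
    """Combined contribution of the systems that have not failed."""
    return sum(v for k, v in systems.items() if k not in failed)

def what_if(max_failures=2):
    """Enumerate failures of up to max_failures systems and return the
    combinations that drop the design below the acceptance threshold."""
    unacceptable = []
    for n in range(1, max_failures + 1):
        for combo in combinations(systems, n):
            if residual(set(combo)) < ACCEPTABLE:
                unacceptable.append(combo)
    return unacceptable

for combo in what_if():
    print("design unacceptable if failed:", combo)
```

With these assumed numbers the screen reveals a design that survives any single failure but is sensitive to any pair of failures involving sprinklers, which is exactly the kind of dependence the measure is intended to expose.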
All of these procedural measures can be clearly defined/documented and effectively disseminated
within an organisation. However, implementing and complying with them would incur some cost in
terms of the time taken to perform them, which would vary depending on the measure.
5. LIMITATIONS AND PROPOSED RESEARCH
Several limitations of the present work help outline the path for future research. First and foremost,
the biases identified here have been drawn from basic and applied research from a number of different
disciplines but have not specifically been studied in fire engineering. While there are good reasons to
assume that these might be applicable to fire engineering, these would ideally be tested empirically to
confirm their existence, measure the extent to which they occur, and provide guidance as to which
biases are most prevalent. Such empirical validation would also allow the development of targeted
mitigation measures and a more efficient allocation of resources. Furthermore, the nature of certain
biases means they would be challenging to identify/quantify in hypothetical scenarios within surveys,
or to measure within experimental conditions, due to the difficulty of isolating influencing factors.
Such data collection methods would nevertheless be consistent with a number of previous empirical
studies that have used hypothetical scenarios within surveys or controlled experimental conditions to
identify cognitive biases. However, this raises the question of the validity of findings derived from
such studies, i.e. whether the behaviour can reliably be observed in real world conditions.
To illustrate an avenue towards evidence-based empirical studies on cognitive bias, an example is
presented in which first evidence for a specific bias is collected and then a potential remedy is tested.
Default Value Bias suggests that fire engineers may tend to use default values of a model without
questioning the suitability or underlying assumptions. To test this hypothesis, experienced fire engineers
could be tasked to simulate a range of scenarios under varying conditions (e.g., time pressure,
appropriateness of default values). If, for example, fire engineers are more likely to use default values
within a model under time pressure, even in scenarios when this is not appropriate, this could be seen
as evidence for the Default Value Bias. The next step would then be to introduce a mitigation measure
to one or several subgroups of the participants and test whether the measure reduces the bias
(case-control design).
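Assuming hypothetical counts from such a case-control study (all numbers below are invented for illustration), the comparison of default-value use between a time-pressured group and a control group could be analysed with a standard two-proportion z-test, sketched here using only the Python standard library:

```python
from math import sqrt, erf

def two_proportion_z(used_a, n_a, used_b, n_b):
    """Two-sided two-proportion z-test: do groups A and B retain the
    model's default values at different rates?"""
    p_a, p_b = used_a / n_a, used_b / n_b
    p_pool = (used_a + used_b) / (n_a + n_b)           # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented counts: engineers keeping inappropriate default values under
# time pressure (group A) vs. without time pressure (group B).
z, p = two_proportion_z(used_a=28, n_a=40, used_b=14, n_b=40)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p would support the hypothesis
```

A significant difference between the groups would constitute the kind of empirical evidence for the Default Value Bias described above, and the same test could then be repeated with a mitigation-measure subgroup.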
Finally, a refinement of the proposed list of biases is desirable. For instance, it is conceivable that some
of the biases discussed here are conceptually overlapping and/or significantly more prevalent/dominant
than other biases. In turn, although the number of biases proposed for a variety of fire engineering
decision-making contexts is already quite extensive, it is by no means exhaustive and is intended to be
illustrative. Future research efforts should therefore try to identify redundancies, as well as biases that
may be unique to fire engineering. Such validation may not only be of purely academic interest but
might also provide insights into the potential costs associated with biases within commercial practice.
6. CONCLUSION
The paper presents a general description of decision-making and cognitive biases. Potential fire
engineering cognitive biases are presented along with a range of potential mitigating methods. The
biases and mitigation measures discussed in the paper are focused on technical work within the field of
fire engineering rather than associated with general biases within the associated workplace environment
e.g. gender/age biases etc. It should be highlighted that the insights presented within the paper come
from a review of general decision science literature, observations within the general fire engineering
field, and of users of fire engineering calculations/models. As such, many aspects of the paper are
speculative/theoretical. This limited empirical basis highlights the need for data to be collected
regarding identification and prioritisation of the most common/influential fire engineering cognitive
biases, along with associated mitigation measures. Readers are therefore cautioned that, whilst such
fire engineering biases may exist within the field, the extent to which they occur is uncertain, and it
should not be assumed that they always occur throughout the fire engineering field. The key objective
of this article is to promote awareness of the existence of potential fire engineering cognitive biases and
opportunities of associated mitigation measures. Those that may benefit from awareness of such biases
and mitigation measures include not only practicing fire engineers, but also building developers, fire
code committees, evacuation/fire/structural fire modelling developers, approving authorities, and fire
services.
ACKNOWLEDGEMENTS
The authors would like to thank Peter Johnson (Arup, Australia), Dr. Claire Cooper (Emergency
Management Victoria, Australia), and Prof. Norman Groner (John Jay College, US) for their invaluable
insights and suggestions regarding the fire engineering biases within the paper.
The research and contents expressed in this paper represent those of the authors and not necessarily
those of the organisation which they are employed/associated with. The study was conducted entirely
voluntarily without any funding to promote impartiality and mitigate associated biases.
REFERENCES
1. Lewis, M. (2016). The undoing project: A friendship that changed the world. Penguin UK.
2. Van Coile, R., Hopkin, D., Lange, D., Jomaas, G., Bisby, L. (2019). The Need for
Hierarchies of Acceptance Criteria for Probabilistic Risk Assessments in Fire Engineering.
Fire Technology, 55(4).
3. Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: a failure to disagree.
American psychologist, 64(6), 515.
4. Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In
Communication and persuasion (pp. 1-24). Springer, New York, NY.
5. Sun, R. (2001). Duality of the mind: A bottom-up approach toward cognition. Psychology
Press.
6. Dolan, P., Hallsworth, M., Halpern, D., King, D., Vlaev, I. (2010). MINDSPACE:
Influencing Behaviour Through Public Policy, Institute for Government, Cabinet Office.
7. Kahneman, D. (2011). Thinking, fast and slow (Vol. 1). New York: Farrar, Straus and
Giroux.
8. Bar-Eli, M., Azar, O. H., Ritov, I., Keidar-Levin, Y., & Schein, G. (2007). Action bias
among elite soccer goalkeepers: The case of penalty kicks. Journal of Economic
Psychology, 28(5), 606-621.
9. Kinsey, M. J., Gwynne, S. M. V., Kuligowski, E. D., & Kinateder, M. (2019). Cognitive
biases within decision making during fire evacuations. Fire Technology, 55(2), 465-485.
10. Wood, T., Porter, E. (2018). The Elusive Backfire Effect: Mass Attitudes’ Steadfast
Factual Adherence. Political Behavior pp. 1-29. doi:10.1007/s11109-018-9443-y.
11. Heath, C. (1999). On the social psychology of agency relationships: Lay theories of
motivation overemphasize extrinsic incentives. Organizational behavior and human
decision processes, 78(1), 25-62.
12. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases.
Science, 185(4157), 1124-1131.
13. Park, C. W., & Lessig, V. P. (1981). Familiarity and its impact on consumer decision biases
and heuristics. Journal of consumer research, 8(2), 223-230.
14. Oswald, M.E., Grosjean, S. (2004). Confirmation Bias. In Pohl, Rüdiger F. (Eds.),
Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and
Memory (pp. 79–96). Hove, UK: Psychology Press.
15. Colman, A. (2003). Oxford Dictionary of Psychology. p. 77. New York: Oxford University
Press.
16. Taleb, N. N. (2007). The black swan: The impact of the highly improbable (Vol. 2).
Random House.
17. Elton, E. J., Gruber, M. J., & Blake, C. R. (1996). Survivor bias and mutual fund
performance. The review of financial studies, 9(4), 1097-1120.
18. Huh, Y. E., Vosgerau, J., & Morewedge, C. K. (2014). Social defaults: Observed choices
become choice defaults. Journal of Consumer Research, 41(3), 746-760.
19. Bernstein, D. M., Erdfelder, E., Meltzoff, A. N., Peria, W., & Loftus, G. R. (2011).
Hindsight bias from 3 to 95 years of age. Journal of Experimental Psychology: Learning,
Memory, and Cognition, 37(2), 378.
20. Helweg-Larsen, M., Shepperd, J. A. (2001). Do moderators of the optimistic bias affect
personal or target risk estimates? A review of the literature. Personality and Social
Psychology Review, 5(1), 74–95.
21. Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 1: origins of bias
and theory of debiasing. BMJ Qual Saf, 22(Suppl 2), ii58-ii64.
22. Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–
developmental inquiry. American psychologist, 34(10), 906.
23. Klein, G. A. (2017). Sources of power: How people make decisions. MIT press.
24. Sunstein, C. R., Thaler, R. H. (2008). Nudge: improving decisions about health,
wealth, and happiness (Revised and expanded ed.). New Haven, Conn.: Yale University
Press.
25. Gwynne, S. M. V., & Kuligowski, E. (2010). The faults with default. In Proceedings of
the Conference Interflam2010, Interscience Communications Ltd: London (pp. 1473-
26. Bryan, G., Karlan, D., Nelson, S. (2010). Commitment Devices. Annual Review of
Economics.
27. British Standards Institute. (2019). PD 7974-7 Application of the fire safety engineering
principles to the design of buildings. Probabilistic risk assessment.
28. Klein, G. (2007). Performing a project premortem. Harvard business review, 85(9), 18-19.
29. Gigerenzer, G. (1991). How to Make Cognitive Illusions Disappear: Beyond “Heuristics
and Biases”. European Review of Social Psychology. Vol 2, pp 83-115.
30. Sunstein, C. (2014). Nudging: A Very Short Guide. Journal of Consumer Policy, 37(4), 583–588.