Burning Biases: Mitigating Cognitive Biases In Fire Engineering


Michael J. Kinsey*, Arup, China
Max Kinateder, National Research Council Canada, Canada
Steven M. V. Gwynne, Movement Strategies, UK
Danny Hopkin, OFR Consultants, UK
Fire engineering has developed into a mainstream engineering discipline within the building design
process. Building fire codes are increasingly complex, comprising thousands of requirements regarding
a wide range of topics that must be considered. Fire engineers are required to possess increasingly
complex knowledge about a variety of subjects, along with expertise in their application. This has been
magnified with the proliferation of performance-based methods using a range of computational tools.
This, coupled with increased project performance pressures, raises the potential for errors in judgement.
Errors in judgement may be caused by limitations in a given resource (e.g. time, information available,
knowledge, etc) and/or neglect/over-focus on specific information (at the expense of other and more
relevant information) through cognitive biases. This paper initially provides a broad overview of general
decision-making including the use of heuristics and cognitive biases. Examples of cognitive biases are
presented which may be linked to errors in fire engineer decision-making. The study considers several
fire engineering decision contexts where cognitive biases may exist which are associated with fire code
application, modelling/calculations, probabilistic risk assessments, general fire engineering practice,
and perceptions based on experience. Potential measures to mitigate some of these biases and prompt
better decision-making are discussed. Those that may benefit from awareness of such biases and
mitigation measures include not only practicing fire engineers, but also building developers, fire code
committees, evacuation/fire/structural fire modelling developers, approving authorities, and fire
engineering researchers/students.
The principal role of a fire engineer is to use scientific and engineering principles, fire codes and
expert judgment, based on an understanding of fire, building, and people to ensure a building design
provides a sufficient level of fire safety. Fire engineers may be involved in a broad set of tasks requiring
a range of knowledge and highly specialised expertise. Although a fire engineer may possess
prerequisite expertise, there are many possible constraints one might need to consider that might affect
their performance. For instance, a fire engineer may be required to develop the necessary
skills/knowledge or be supported by other fire engineers (who already have the necessary
skills/knowledge) on a project in order to complete a given task. Indeed, where a fire engineer is lacking
in-depth technical expertise or experience about a given subject, they may rely on the fire codes for
guidance, potentially without a complete awareness of any underlying basis or assumptions for the
guidance. There may also be limited time in which a fire engineer is required to conduct a task in order
to meet project deadlines or limited available people to work on a given task. Typically, many of the
tasks have some level of associated uncertainty, and these uncertainties may interact in non-additive ways. There may be
multiple ways to complete a given task and potentially multiple suitable outcomes, all of which a fire
engineer must contend with when assessing suitability. Assuming the possession of the necessary
knowledge and technical expertise to conduct such tasks, fire engineers may still make mistakes, suffer
lapses in judgement, or make inconsistent assumptions, all of which can be caused by cognitive biases. This may
also be compounded by the fire engineer being subject to routine personal issues including boredom,
fatigue, illness, situational and interpersonal distractions, all of which can affect their ability to
consistently perform and make suitable decisions [1]. Furthermore, with fires being rare events (which
may in part be attributed to preventative measures defined by fire engineers), there is a lack of feedback
in the building design process regarding failures in methods or decision-making within the fire
engineering field. Most designed buildings will never have a fire or if they do, it will not be serious
enough to truly test the full range of fire safety features. Such feedback is critical to enhancing the
decision-making process within fire engineering. Indeed, Van Coile et al. [2] note that the “collective
experience of the profession” (i.e. guidance developed via consensus through code/standards
committees which represent the views of the profession etc) forms the foundation upon which most
traditional fire engineering building designs are accepted. This is contrary to many other types of
building engineering, which are routinely put to the test in normal usage, demonstrating the level of their performance.
The paper initially provides an overview of general decision-making and cognitive biases theory.
Potential cognitive biases which may exist within fire engineering are then posited. A range of potential
cognitive bias mitigation measures are then proposed. This paper is intended to identify some means of
reducing potential sources of mistakes and lapses in decision-making which fire engineers may exhibit
through cognitive biases within the building design process. Due to the multi-disciplinary nature of the
subject matter, the authors represent research and commercial engineering practice, including two
practising fire engineering consultants and researchers in psychology and human behaviour in fire.
Research in decision science proposes that decision-making can be thought of as if employing two
separate, though interacting, systems of thinking: an automatic system and a reflective system [3-7].
The automatic system occurs without a person being directly aware of it. It is used for most decisions,
is relatively quick, and is based on a person recognising given situations or types of situation and
associating them with a stored response, effectively pattern-matching the scenario against stored examples
and then identifying associated stored responses. Commonly, it is adopted in familiar situations or where
a quick response is required; i.e. where conscious decision-making processes are overly complex and/or
would not be quick enough. This system is used to generate impressions, feelings, and inclinations. It
is possible to train the automatic system to identify suitable responses when given patterns in a
context/situation are identified (e.g. in the form of heuristics, see below). Such training may be received
formally (e.g., in the classroom), or may be developed over time through repeated experience (e.g.,
seasoned firefighting knowledge through repeatedly attending fire incidents). The automatic system
distinguishes the surprising from the normal, and influences expectation in a given context. It allows
experts (e.g., firefighters, airline pilots, etc.), to competently operate in complex situations under time
pressure. Where information is lacking, the automatic system extrapolates based on plausible options
without factual input, potentially giving the illusion of validity to a decision.
The reflective system operates consciously and allows people to address situations in a more
deliberative manner. It is more consciously effortful to use, relatively slow and limited by working
memory. In comparison to the automatic system, it requires more information, cognitive resources, and
focus to complete the decision-making process. Commonly, it is adopted in unfamiliar situations (e.g.,
wayfinding in an unfamiliar building, becoming aware of unexpected changes in an environment, etc.),
where a task requires reflection or requires a systematic process (e.g., solving/approximating a long
mathematical calculation, identifying the appropriate strategy to attack a fire in a complex scenario,
etc.), and/or where time is not considered a limiting factor in a decision [3-7]. Neither the automatic
nor the reflective system is guaranteed to produce correct or incorrect answers. The key difference between
the automatic and reflective systems is that the latter occurs consciously and requires the use of working
memory, whereas the former occurs nonconsciously and does not require working memory [6,7].
Irrespective of which system of decision-making is primarily used, the process is bounded in terms of
information available, time available, and an individual’s mental resources to process such information.
To compensate for such limitations and manage uncertainty or complexity with a decision, people may
employ heuristics in the decision-making process.
Heuristics are defined as relatively simple rules of thumb [7] which can be applied to complex decisions
that can result in a suitable response. Such rules of thumb may be developed by creating associations
between a given situation and past experience of a learned response, which results in the adoption of a
given action based on the association, shortcutting the reflective aspects of decision-making. The
development of a rule of thumb involves substituting the decision being made for a similar simpler
decision which can be used to produce an outcome. It may also be possible to deliberately prime an
individual to adopt heuristics through training to enable speedy responses to a situation; however, the
heuristics may not be appropriate if used outside of the intended set of scenarios, placing greater
importance on the pattern matching capability of the individual and the quality of the information
available. Heuristics are extremely common and can be used in a variety of situations. For instance,
experienced fire incident commanders are capable of making quick and relevant decisions during
a fire incident under time pressure. They do this through the use of certain heuristics called
recognition-primed decision-making [2]. During a fire incident, automatic triggers of a fire commander’s memory
(of previously experienced incidents) match a given set of criteria of the current fire incident together
with the learned appropriate response pattern. This type of decision-making process may be labelled as
‘expert intuition’, which in part demonstrates the nonconscious nature in which it occurs [2]. However,
if heuristics are employed inappropriately (e.g. due to incomplete information) systematic errors can
inadvertently be produced. This can be made worse through the existence of cognitive biases, where
information is inappropriately processed or overly focused upon at the expense of more relevant
information in the decision-making process. The adoption of heuristics based on partial or incomplete
information (leading to mismatches between scenario and response) may occur in the following
situations: where a person has no previous experience of making a decision in a particular situation;
there is no learned response (e.g. a person has not learned or been trained what to do in a given situation);
or in novel situations with limited resources at hand (e.g., time, mental capacity, information, etc.).
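As a highly simplified illustration of the recognition-primed idea described above (a sketch only; all scenario features, stored cases, and responses below are invented for illustration and are not drawn from the paper), pattern matching against a library of stored cases might look like:

```python
# Hypothetical sketch: recognition-primed decision-making modelled as
# nearest-case matching. Case library and features are invented examples.

CASE_LIBRARY = [
    # (stored scenario features, learned response)
    ({"smoke": "heavy", "occupants": "present", "spread": "fast"},
     "defensive attack, prioritise rescue"),
    ({"smoke": "light", "occupants": "absent", "spread": "slow"},
     "offensive interior attack"),
    ({"smoke": "heavy", "occupants": "absent", "spread": "fast"},
     "defensive exterior attack"),
]

def recognise(current, library=CASE_LIBRARY, threshold=2):
    """Return the learned response of the best-matching stored case, or
    None when no case matches well enough -- i.e. a novel situation where
    the slower, reflective system should take over."""
    def score(stored):
        return sum(1 for k, v in stored.items() if current.get(k) == v)
    best_features, best_response = max(library, key=lambda c: score(c[0]))
    return best_response if score(best_features) >= threshold else None

response = recognise({"smoke": "heavy", "occupants": "present", "spread": "fast"})
```

Note how a poor match falls through to `None`: this is the point made above that heuristics applied outside their intended set of scenarios (weak pattern match, incomplete information) risk producing systematic errors.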
Cognitive biases can occur in a wide variety of scenarios ranging from economic/political forecasting
[7] to fire evacuations [9]. The extent to which a cognitive bias may negatively impact a decision
(whether it is automatic, reflective, and/or based on a heuristic) can also vary widely from having no
consequential impact to having life threatening implications.
The following section includes potential cognitive biases with examples which may exist within
fire engineering during the building design process. The list is derived from biases identified in general
decision science literature and applied to fire engineering activities through identification of different
cognitive bias types which are listed in Table 1.
Table 1: Cognitive Bias Types
Cognitive Bias Type: Description
Action bias [8]: The tendency to think that value can only be realized through action as opposed to practicing restraint.
Anchoring [7,12]: The tendency to rely too heavily, or "anchor", on a past reference or on one trait or piece of information when making decisions.
Authority bias [7,12]: The tendency to do (or believe) things that a person in an authoritative position states, believes, or does.
Availability bias [7,12]: The tendency to focus on the most salient or emotionally charged outcome.
Clustering illusion [12]: The tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random; e.g. taking correlation for causation.
Confirmation bias [14]: The tendency to search for or interpret information in a way that confirms one's preconceptions or desires.
Default bias [18]: The tendency to follow a default option.
Familiarity bias [12,13]: The tendency to favour something which is familiar over something which is not.
Halo effect [12]: The tendency for another person's/organization's perceived positive traits to "spill over" from one area to another.
Hindsight bias [19]: The tendency to perceive events that have already occurred as having been more predictable than they actually were before the events took place.
Incentive bias [11]: The tendency to make a given decision based on an incentive derived from an individual's own interests.
Optimism bias [20]: The tendency to underestimate the likelihood of experiencing a negative event.
Probability neglect [12]: The tendency to disregard the likelihood/probability of an outcome when making a decision.
Risk aversion bias [12]: The tendency to favour a more certain outcome over one which involves a higher level of uncertainty.
Status quo bias [15]: The tendency to agree with the decision of a group.
Sunk cost bias [7]: The tendency to continue a behaviour as a result of previously invested time, money, or effort.
Survivorship bias [16,17]: The tendency to concentrate on the things which have survived or are successful in achieving a given outcome whilst overlooking those which did not.
Based on the cognitive bias types listed above, examples of cognitive biases which may exist within
fire engineering are proposed in Table 2. Unless explicitly stated, all cognitive bias examples are those
exhibited by a fire engineer in a given decision-making context. The titles of the cognitive biases (which
are specific to fire engineering) are proposed for each example in bold. These titles were chosen to
closely follow the associated cognitive bias type to assist with identification. However, in some
instances the titles do deviate slightly from the cognitive bias type to be more descriptive and fire
engineering focussed. The cognitive biases are categorised according to the elements of the fire
engineering process that they might affect decision-making, namely:
Fire codes (i.e. how fire codes are interpreted and applied)
Modelling/calculations (i.e. how fire/evacuation/structural models are used and results assessed)
Probabilistic risk assessments (i.e. how probabilistic fire risk assessments are conducted and
results assessed)
General fire engineering practice (i.e. how decisions within the general fire engineering
processes may be influenced)
Perceptions based on experience (i.e. how perceptions of a fire engineer may be influenced
by past experience)
The list of cognitive bias examples is not exhaustive and not based on empirical data specific
to fire engineering but is instead based on a broader context; i.e. deriving parallels between
general biases identified within decision science literature and experience within fire safety
engineering practice by the authors. The example biases have been identified and refined based on
communication with practicing fire engineers and other professionals in the field (predominantly
colleagues of the authors) to provide additional confidence in their existence and associated
definitions. The extent to which the cognitive biases may influence an eventual building design
may vary significantly and are likely to depend on the nature of the project and the fire engineer
involved. As such, the extent to which the cognitive biases exist, how common they are, or their
influencing factors, is unclear. However, even the potential for their existence warrants
attention given the safety-critical nature of a fire engineer's work.
Table 2: Postulated Fire Engineering Cognitive Biases
Fire Code Biases: Familiarity bias, Authority bias, Risk aversion bias
Modelling/Calculation Biases: Default bias, Familiarity bias
Probabilistic Risk Assessment Biases: Availability bias, Probability neglect
General Fire Engineering Practice Biases: Availability bias, Incentive bias, Confirmation bias, Halo effect, Authority bias, Status quo bias, Optimism bias, Probability neglect, Sunk cost bias
Perceptions Based on Experience Biases: Availability bias, Hindsight bias, Clustering illusion, Action bias
Based on the fire engineering biases identified in the previous section, a series of mitigation
measures are now proposed to reduce the likelihood of such biases occurring in fire engineering and
to reduce their impact. These have been grouped into the following categories: (1) promoting bias
awareness and conscious decision-making; (2) providing incentives; (3) developing nudges, and (4)
developing procedural methods. These are derived from cognitive bias mitigation measures identified
in psychological literature with applications outside of fire engineering. The measures are discussed in
terms of potential effectiveness, applicability (which biases it can potentially mitigate), method of
implementation, and cost (time/money/resources involved with setting them up). No studies or
empirical data have been collected regarding the extent to which such measures may reduce the likelihood
or impact of the cognitive biases specifically within fire engineering. As such, the extent to which these
measures may influence cognitive biases within fire engineering requires further investigation. It should
also be highlighted that the proposed methods are not exhaustive, may not be appropriate or possible to apply
in certain situations, and may only be possible to implement by select people/organisations. Furthermore, it
may be possible to classify certain methods into more than one category.
Being aware of cognitive biases and promoting conscious awareness of the steps involved in an
explicit thought process when making a decision (i.e., shifting the decision-making process from the
automatic system to the reflective system) has been shown to mitigate cognitive biases and poor
decision-making in general [21,22]. Klein [23] broadly discusses the features of improving conscious
decision-making as including:
Awareness of requirements of learning process
Recognition of limitations of memory
Ability to appreciate perspective
Capacity for self-critique
Ability to select strategies
Promoting awareness of cognitive biases within fire engineering could provide several benefits within
the decision-making process. Fire engineers with effective decision-making skills would be aware of
common mistakes caused by cognitive biases and can also learn from past decisions where cognitive
biases may have occurred. They could recognize cognitive biases, for example in the design process
and appreciate different perspectives which may provide insight for a given decision. They would be
able to critique their own decisions to identify cognitive biases and adopt appropriate strategies for
complex decisions in order to mitigate them. The following are potential measures to promote
awareness and conscious decision-making in order to mitigate cognitive biases within fire engineering:
Training: Providing training to communicate the existence of cognitive biases in fire
engineering and associated mitigation measures for making key decisions within the fire
engineering process. There are a large number of cognitive biases which might impact
a wide range of fire engineering decisions as highlighted in the previous section. It is
currently difficult to estimate the effect size of an individual bias or the combined
impact of several interacting biases. As such, it is perhaps unreasonable to require an
individual to remember all cognitive biases as part of any training. Instead, the training
should focus on promoting broad awareness of cognitive biases and a reflective
approach to decision-making to mitigate against cognitive biases occurring.
Delayed response: Leaving more time to think about a decision reduces reliance on the
automatic system of thinking, which can result in overly quick decisions that may later need
to be reversed or retrospectively justified. For example, defer providing feedback regarding a fire
engineering query/complex decision until after a meeting rather than giving a response in the meeting itself.
Verbalizing: Talking out loud or with others about why a given decision is being made may
promote conscious decision-making through the use of the reflective system.
Cognitive bias checklist: Adopting the use of cognitive bias checklists within a decision-
making process (e.g. evacuation modelling, fire strategy report development, etc.) to ensure
potential biases are not overlooked.
Incentives refer to methods encouraging active thinking about a given problem through reward or
punishment. Providing incentives has been shown to promote better decision-making and potentially
reduce cognitive biases in certain settings [24]. Generally, the value of the incentive increases with the
associated value of identifying/correcting a given decision. Potential incentives to promote better
decision-making and mitigate cognitive biases within the fire engineering process may include:
Rewards: Providing financial rewards (e.g. bonus) or annual leave (e.g. leave early on a Friday)
when significant errors are identified in a project (e.g. “good catch reward”).
Token incentives for groups: Grouping people into teams and providing points to people in
each team when errors are identified in projects. The team with the most points at the end of a
given time period is rewarded.
Recognition: Celebrating projects which went well and also highlighting projects which
did not go well, sharing lessons learnt through internal company newsletters.
In effect, these activities are intended to modify the working culture in given contexts such that
problems should be highlighted and seen as a learning process, rather than hidden or seen as a failure.
A key aspect of any incentive system within fire engineering is to clearly define what constitutes errors
in judgement in a given decision-making process. For fire engineering the incentives could be classified
according to the biases associated with given tasks shown in Table 2. It is also important to define a
standard relative measure of incentives in terms of how much incentive is provided for identification of
a given bias: the larger the bias and subsequent consequence, the larger the incentive should be. The
specification of the incentive should be provided before or at the beginning of a project/item of
work/working contract so that all stakeholders are informed and, ideally, included in formally agreeing
to the incentives.
Nudges represent small, simple, nonconscious and typically low-cost methods to (a) guide (increase
the likelihood of) a positive decision, rather than dictate the response; and (b) mitigate potential biases
[24, 30]. Nudges would typically be adopted where fire engineers make decisions using the automatic
system for relatively small or routine tasks, though they can also be used for decisions made using the
reflective system or for larger tasks. Potential nudges which may be adopted within fire engineering to promote better
decision-making and mitigate cognitive biases are listed below:
Changing modelling defaults: Changing the default values in fire/evacuation/structural fire
models (which would be used each time the modelling software is opened unless previously
edited) to being highly conservative to encourage users to change the values to be more realistic
and subsequently justify their selection and consciously be aware of the new settings adopted.
If users do not change the defaults, then the simulation will be more conservative so less likely
to present a potential safety risk [25]. This nudge would only be possible by model developers
or where a model allows users to change default values.
Automated checkpoints in software tools: Whenever a software user is required to make a
decision, these should be made explicit and made salient (e.g. pop-up window asking: “Do you
want to use the default values?”). This nudge would only be possible by model developers or
where an organisation has access to software tool source code.
Commitment devices: A commitment device is a way to oblige an individual into a given
action that they might not want to do but is good for them [26]. Potential commitments devices
which may be adopted within fire engineering to promote better decision-making and mitigate
cognitive biases are listed below:
o Create larger obstacles to temptations than the temptation itself:
Pay fire engineering fees upfront before a project commences in order to
mitigate against financial bias.
Enforce minimum times to answer queries or to complete a project to
mitigate against not having enough time to fully assess or complete a detailed
analysis for a query/project.
o Have the fire engineer make a publicized commitment to an action so reputation
could be affected:
Conduct internal ‘third party’ reviews within an organization of a fire
engineering analysis, with any mistakes anonymously shared with the wider organization.
Make all fire strategy documentation (documentation which fire engineers
prepare as part of the building design process which describe the fire safety
features of the building and how they are deemed to provide a sufficient level
of fire safety) subject to public scrutiny.
Publishing thought leadership / good practice articles in fire engineering trade
magazines/journals for scrutiny or endorsement. These can be used as
evidence in a review if a fire engineer does not uphold the published
proposed ideas.
Create a structure whereby professional registration is mandatory and provide
a mechanism for poor practice / incidents to be anonymously reported. The
reporting could then have implications for the renewing of licensing if found
to be valid. This would need to be orchestrated by societies/institutions which
authorize/issue professional fire engineer registrations in a given country e.g.
Society of Fire Protection Engineers (SFPE), Institution of Fire Engineers
(IFE), etc.
Develop and encourage participation in fire engineering community groups
where common practice (good and bad) is better recognized.
o Make a monetary contract to increase the benefit of keeping a promise:
Include a bonus fee for a fire engineering project in a holding account which
is only released a given number of years after the building is built and shown
to present no fire safety issue inherent in the design.
Reframing Information: Reframing refers to changing the method by which information is
communicated between people to promote better decision-making and mitigate against
cognitive biases/misinterpretation. Potential methods of reframing within fire engineering
which could promote better decision-making and mitigate cognitive biases are listed below:
o Presenting all high fire risk aspects of a fire strategy nearer the beginning of a report to
ensure they are noticed by the relevant stakeholders. It would be advantageous for fire
engineering societies/institutions within each country to develop standard industry-wide
reporting guidance for such information.
o Considering severe but unlikely fire/evacuation scenarios which could cause the fire
strategy to fail (i.e. cause loss of life), in order to present the limitations of a given fire strategy.
Nudges are voluntary and the cost involved in their use is typically low. They can also be applied
to a wide variety of decision types. Once a specific nudge has been developed, it is imperative that it be
tested first on a sample of individuals to ensure it has the desired effect before being implemented on a
large scale, i.e. across projects or organizations.
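The "changing modelling defaults" and "automated checkpoints" nudges above could be sketched as follows, assuming a hypothetical evacuation-modelling tool; all parameter names and values are invented for illustration and are not taken from any real software:

```python
# Hypothetical sketch: conservative defaults plus an explicit checkpoint.
# Any override of a default must be consciously confirmed and justified,
# shifting the decision from the automatic to the reflective system.

CONSERVATIVE_DEFAULTS = {
    "pre_evacuation_time_s": 600.0,   # deliberately pessimistic default
    "walking_speed_m_s": 0.6,         # slow default; faster values need justification
}

def build_inputs(overrides=None, confirm=input):
    """Start from conservative defaults; apply an override only when the
    user supplies a non-empty justification (the checkpoint), and log it."""
    inputs, log = dict(CONSERVATIVE_DEFAULTS), []
    for key, value in (overrides or {}).items():
        justification = confirm(
            f"Override conservative default for {key}? "
            f"{inputs[key]} -> {value}. Justification: ")
        if justification.strip():        # no justification, no override
            inputs[key] = value
            log.append((key, value, justification.strip()))
    return inputs, log

# Non-interactive usage: inject a canned 'confirm' in place of input()
inputs, log = build_inputs(
    {"walking_speed_m_s": 1.2},
    confirm=lambda prompt: "measured in trial evacuation")
```

If the user declines to justify a change, the simulation simply runs with the more conservative values, which is the failure-safe behaviour the nudge is intended to produce.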
Procedural measures are those which are adopted as part of standard practice within an organisation to
carry out specific processes within fire engineering in order to mitigate cognitive biases. These can
potentially represent a wide range of activities involved in checking, communicating with peers,
providing feedback, or consistently performing a well-defined task given a decision under specified
conditions. Indeed, a number of the measures previously categorized in the preceding sections could
also be classified as procedural measures. Potential other procedural measures which may be adopted
within fire engineering to promote better decision-making and mitigate cognitive biases are listed below:
Modelling checklists: Adopting the use of quality assurance checklists within
evacuation/fire/structural fire modelling to ensure aspects of the modelling process are not
overlooked. As part of this it is important that the user establishes how they have met the
requirement as part of the check, rather than just signify that they have met the requirement.
The checks might therefore be posed in the form of questions which a user must provide
answers to as opposed to a simple Boolean checkbox.
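The question-based (rather than Boolean) checklist described above could be sketched as follows; the checklist questions are hypothetical examples, not taken from any published standard:

```python
# Hypothetical sketch: a modelling QA checklist whose items demand a
# written answer rather than a tick, forcing the user to state how each
# requirement was met.

CHECKLIST = [
    "How was the mesh/grid resolution shown to be adequate?",
    "Which default model parameters were changed, and why?",
    "How were occupant pre-evacuation times justified?",
]

def review(answers, min_length=20):
    """Return the questions still lacking a substantive answer.
    An empty or trivially short answer does not count as 'checked'."""
    return [q for q in CHECKLIST
            if len(answers.get(q, "").strip()) < min_length]

outstanding = review({
    CHECKLIST[0]: "Halved cell size; results changed by less than 2 percent.",
    CHECKLIST[1]: "None changed.",   # too short: forces a fuller justification
})
```

The minimum-length threshold is a crude proxy for substance, but it illustrates the design choice: the check fails open-ended questions that received only token answers, whereas a Boolean checkbox would have let them pass.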
Encourage the questioning of senior staff: After a senior member of staff has stated a
decision, requesting junior members of staff to test the approach, suggest alternatives, and
‘game’ how the approaches could go wrong. This creates a situation that promotes an
individual to think about an issue, makes it socially acceptable for criticism to be made, and
(hopefully) builds confidence in the senior staff who may veer towards a more standard
approach without such input.
Peer review input/checking: Requiring all fire engineering documentation to be initially
checked by a peer, with specific feedback required (e.g. from a project manager) during the process
as opposed to only at the end of the process.
Fire code justification: Requiring a fire engineer to justify why adopting a prescriptive
fire-code-compliant aspect of a design is suitable, and under what conditions it may not be suitable.
Third party fire strategy reviews: Requiring anonymised third-party reviews of fire strategy
documentation submitted to approving authorities. It should be noted that such reviews already
exist within the fire engineering field, but are not common practice for all projects, are typically
limited to projects where a certain aspect of the fire strategy contains specialised analysis (e.g.
fire/evacuation/structural fire modelling), and are typically not anonymous.
Mandatory fire/evacuation modelling for large projects: Requiring all projects which
involve a large number of people to include mandatory fire/evacuation modelling assessments
to demonstrate that all people can evacuate before untenable conditions occur, rather than this
being assumed through compliance with prescriptive fire codes.
Mandatory probabilistic fire risk assessment for uncommon buildings: Where a building
is defined as being ‘uncommon’, i.e. an aspect(s) of the design means standard fire design
guidance is not fundamentally appropriate to apply, requiring demonstration that the
probability and severity of a fire are both tolerable and as low as reasonably practicable
(ALARP) [27].
Pre-mortem meetings: The concept of pre-mortem meetings was developed by Klein [28].
Its broad principle is that when an important decision is being made, a meeting is held with
the relevant stakeholders. Within the meeting, a hypothetical situation is proposed where the
desired decision was selected, but after some time was found to be a total disaster. The people
in the meeting must then discuss the potential reasons why the choice was a disaster,
identifying the potential causes and how they might be mitigated. The group then decides if
the choice should still be taken. Posing such a hypothetical situation makes it more acceptable
to freely discuss negative aspects of a decision, reducing the tendency whereby everyone simply
states positive views to support the group. Pre-mortem meetings could be adopted for a wide
range of important fire engineering decisions
including the following:
o Deciding to bid on a project (e.g. identifying the most likely tasks that may involve
costs or time overruns).
o Deciding to use performance-based analysis to support a noncompliance with
prescriptive fire codes.
o Deciding whether to use a given material in the building design which could
significantly influence a fire.
Initial assessment: Independently deriving potential approaches to a fire engineering problem
before consulting others, to mitigate against anchoring to their views. Similarly, when
consulting the opinion of others about a fire engineering decision, avoiding discussing your
proposed solution or asking leading questions until they have given their thoughts, in order to
mitigate against anchoring their views.
What if assessment: A systematic consideration of the failure of each fire safety system /
provision and an assessment of the consequences, to demonstrate that a design is not unduly
sensitive to a single or small number of system failures before unacceptable levels of severity
occur during a fire. This is to mitigate against unlikely but high consequence fire scenarios
being given little or no consideration.
Devil’s advocate project review: Conduct project reviews whereby a fire strategy or analysis
is presented to colleagues who are not involved in the project to get general feedback. Some
of the colleagues are tasked with deliberately trying to find problems with the fire strategy (i.e.
they play Devil’s Advocate) to mitigate against status quo bias.
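For instance, the question-based modelling checklist described above could be implemented as a short script that refuses to sign off items without a written justification. The questions and the minimum-answer rule below are illustrative assumptions only, not established guidance:

```python
# Hypothetical QA checklist: each item demands a written justification,
# not a Boolean tick, so the user must state *how* a requirement was met.
CHECKLIST = [
    "How was the design fire (e.g. HRR curve) selected for this scenario?",
    "What sensitivity study supports the chosen mesh/grid resolution?",
    "Why are the occupant pre-evacuation time distributions appropriate?",
]

def review(answers):
    """Return the checklist items that remain open (unanswered or too terse)."""
    open_items = []
    for question in CHECKLIST:
        response = answers.get(question, "").strip()
        if len(response) < 20:  # a bare "yes" does not count as a justification
            open_items.append(question)
    return open_items

answers = {
    CHECKLIST[0]: "Medium-growth t-squared fire, capped at sprinkler activation.",
}
print(review(answers))  # prints the two unanswered questions
```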
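Similarly, the ‘what if’ assessment described above could be sketched as a single-failure sweep over the fire safety provisions. The systems, base margin, and penalty values below are invented purely for illustration:

```python
# Assumed ASET-RSET margin (seconds) with all systems operating, and the
# assumed margin penalty incurred when each provision fails on its own.
BASE_MARGIN_S = 120
FAILURE_PENALTY_S = {
    "sprinklers": 70,
    "smoke extract": 90,
    "fire doors": 40,
    "voice alarm": 150,
}

def single_failure_sweep(base_margin, penalties):
    """Return the provisions whose lone failure drives the margin negative."""
    return [
        system
        for system, penalty in penalties.items()
        if base_margin - penalty < 0
    ]

print(single_failure_sweep(BASE_MARGIN_S, FAILURE_PENALTY_S))  # prints ['voice alarm']
```

A design flagged by such a sweep would warrant redundancy or further analysis before the corresponding fire scenario is dismissed as unlikely.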
All of these measures can be clearly defined/documented and effectively disseminated
within an organisation. However, implementing and complying with procedural measures would
incur some cost in terms of time to perform, which would vary depending on the measure.
Several limitations of the present work help outline the path for future research. First and foremost,
the biases identified here have been drawn from basic and applied research from a number of different
disciplines but have not specifically been studied in fire engineering. While there are good reasons to
assume that these might be applicable to fire engineering, these would ideally be tested empirically to
confirm their existence, measure the extent to which they occur, and provide guidance as to which
biases are most prevalent. Such empirical validation would also allow the development of targeted
mitigation measures and a more efficient allocation of resources. Furthermore, the nature of
certain biases means they would be challenging to identify/quantify outside of hypothetical survey
scenarios or controlled experimental conditions, as it is difficult to isolate the influencing
factors in the field. Such data collection methods would be consistent with a number of previous empirical studies
that have used hypothetical scenarios within surveys or controlled experimental conditions to identify
cognitive biases [12]. However, this raises the question of the validity of findings derived from such
studies, namely whether the behaviour can reliably be observed in real world conditions [29].
To illustrate an avenue towards evidence-based empirical studies on cognitive bias, an example is
presented in which first evidence for a specific bias is collected and then a potential remedy is tested.
Default Value Bias suggests that fire engineers may tend to use default values of a model without
questioning the suitability or underlying assumptions. To test this hypothesis, experienced fire engineers
could be tasked to simulate a range of scenarios under varying conditions (e.g., time pressure,
appropriateness of default values). If, for example, fire engineers are more likely to use default values
within a model under time pressure, even in scenarios when this is not appropriate, this could be seen
as evidence for the Default Value Bias. The next step would then be to introduce a mitigation measure
to one or several subgroups of the participants and test whether the measure reduces the bias (a case-control design).
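A minimal sketch of the analysis for such a case-control study might simply compare default-use rates between the two groups. The group labels and counts below are hypothetical:

```python
from collections import Counter

def default_use_rate(trials):
    """trials: list of (group, kept_defaults) tuples -> rate per group."""
    totals = Counter(group for group, _ in trials)
    kept = Counter(group for group, kept_defaults in trials if kept_defaults)
    return {group: kept[group] / totals[group] for group in totals}

# Hypothetical results: 14/20 control participants kept the model defaults
# under time pressure, versus 7/20 who received the mitigation measure.
trials = (
    [("control", True)] * 14 + [("control", False)] * 6 +
    [("mitigated", True)] * 7 + [("mitigated", False)] * 13
)
print(default_use_rate(trials))  # prints {'control': 0.7, 'mitigated': 0.35}
```

A formal significance test on such rates would of course follow before drawing any conclusion about the mitigation measure.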
Finally, a refinement of the proposed list of biases is desirable. For instance, it is conceivable that some
of the biases discussed here are conceptually overlapping and/or significantly more prevalent/dominant
than other biases. In turn, although the number of biases proposed for a variety of fire engineering
decision-making contexts is already quite extensive, it is by no means exhaustive and is intended to be
illustrative. Future research efforts should therefore try to identify redundancies, as well as biases that
may be unique to fire engineering. Such validation may not only be of purely academic interest but
might also provide insights into the potential costs associated with biases within commercial practice.
The paper presents a general description of decision-making and cognitive biases. Potential fire
engineering cognitive biases are presented along with a range of potential mitigating methods. The
biases and mitigation measures discussed in the paper are focused on technical work within the field of
fire engineering, rather than on general biases within the associated workplace environment
(e.g. gender/age biases). It should be highlighted that the insights presented within the paper come
from a review of general decision science literature, observations within the general fire engineering
field, and users of fire engineering calculations/models. As such, many aspects of it are
speculative/theoretical. This limited empirical basis highlights the need for data to be collected
regarding identification and prioritisation of the most common/influential fire engineering cognitive
biases along with associated mitigation measures. Readers are therefore cautioned that whilst such fire
engineering biases may exist within the field, the extent to which they occur is uncertain, and it
should not be assumed that they always occur throughout the fire engineering field. The key objective
of this article is to promote awareness of the existence of potential fire engineering cognitive biases and
opportunities of associated mitigation measures. Those that may benefit from awareness of such biases
and mitigation measures include not only practicing fire engineers, but also building developers, fire
code committees, evacuation/fire/structural fire modelling developers, approving authorities, and fire
engineering researchers/students.
The authors would like to thank Peter Johnson (Arup, Australia), Dr. Claire Cooper (Emergency
Management Victoria, Australia), and Prof. Norman Groner (John Jay College, US) for their invaluable
insights and suggestions regarding the fire engineering biases within the paper.
The research and contents expressed in this paper represent those of the authors and not necessarily
those of the organisations with which they are employed/associated. The study was conducted entirely
voluntarily, without any funding, to promote impartiality and mitigate associated biases.
References
1. Lewis, M. (2016). The undoing project: A friendship that changed the world. Penguin UK.
2. Van Coile, R., Hopkin, D., Lange, D., Jomaas, G., Bisby, L. (2019). The Need for
Hierarchies of Acceptance Criteria for Probabilistic Risk Assessments in Fire Engineering.
Fire Technology, 55(4).
3. Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree.
American Psychologist, 64(6), 515.
4. Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In
Communication and persuasion (pp. 1-24). Springer, New York, NY.
5. Sun, R. (2001). Duality of the mind: A bottom-up approach toward cognition. Psychology Press.
6. Dolan, P., Hallsworth, M., Halpern, D., King, D., Vlaev, I. (2010). MINDSPACE:
Influencing Behaviour Through Public Policy. Institute for Government, Cabinet Office.
7. Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
8. Bar-Eli, M., Azar, O. H., Ritov, I., Keidar-Levin, Y., & Schein, G. (2007). Action bias
among elite soccer goalkeepers: The case of penalty kicks. Journal of Economic
Psychology, 28(5), 606-621.
9. Kinsey, M. J., Gwynne, S. M. V., Kuligowski, E. D., & Kinateder, M. (2019). Cognitive
biases within decision making during fire evacuations. Fire Technology, 55(2), 465-485.
10. Wood, T., Porter, E. (2018). The Elusive Backfire Effect: Mass Attitudes’ Steadfast
Factual Adherence. Political Behavior pp. 1-29. doi:10.1007/s11109-018-9443-y.
11. Heath, C. (1999). On the social psychology of agency relationships: Lay theories of
motivation overemphasize extrinsic incentives. Organizational behavior and human
decision processes, 78(1), 25-62.
12. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases.
Science, 185(4157), 1124-1131.
13. Park, C. W., & Lessig, V. P. (1981). Familiarity and its impact on consumer decision biases
and heuristics. Journal of consumer research, 8(2), 223-230.
14. Oswald, M. E., Grosjean, S. (2004). Confirmation Bias. In Pohl, R. F. (Ed.),
Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and
Memory (pp. 79–96). Hove, UK: Psychology Press.
15. Colman, A. (2003). Oxford Dictionary of Psychology (p. 77). New York: Oxford University Press.
16. Taleb, N. N. (2007). The black swan: The impact of the highly improbable (Vol. 2).
Random house.
17. Elton, E. J., Gruber, M. J., & Blake, C. R. (1996). Survivor bias and mutual fund
performance. The review of financial studies, 9(4), 1097-1120.
18. Huh, Y. E., Vosgerau, J., & Morewedge, C. K. (2014). Social defaults: Observed choices
become choice defaults. Journal of Consumer Research, 41(3), 746-760.
19. Bernstein, D. M., Erdfelder, E., Meltzoff, A. N., Peria, W., & Loftus, G. R. (2011).
Hindsight bias from 3 to 95 years of age. Journal of Experimental Psychology: Learning,
Memory, and Cognition, 37(2), 378.
20. Helweg-Larsen, M., Shepperd, J. A. (2001). Do moderators of the optimistic bias affect
personal or target risk estimates? A review of the literature. Personality and Social Psychology Review.
21. Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 1: origins of bias
and theory of debiasing. BMJ Qual Saf, 22(Suppl 2), ii58-ii64.
22. Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive
developmental inquiry. American psychologist, 34(10), 906.
23. Klein, G. A. (2017). Sources of power: How people make decisions. MIT press.
24. Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health,
wealth, and happiness (Revised and expanded ed.). New Haven, CT: Yale University Press.
25. Gwynne, S. M. V., & Kuligowski, E. (2010). The faults with default. In Proceedings of
the Interflam 2010 Conference, Interscience Communications Ltd: London (pp. 1473–).
26. Bryan, G., Karlan, D., Nelson, S. (2010). Commitment Devices. Annual Review of
Economics, 1, 671–698.
27. British Standards Institution. (2019). PD 7974-7: Application of fire safety engineering
principles to the design of buildings. Probabilistic risk assessment.
28. Klein, G. (2007). Performing a project premortem. Harvard Business Review, 85(9), 18–19.
29. Gigerenzer, G. (1991). How to Make Cognitive Illusions Disappear: Beyond “Heuristics
and Biases”. European Review of Social Psychology, 2, 83–115.
30. Sunstein, C. (2014). Nudging: A Very Short Guide. Journal of Consumer Policy, 37, 583–588.