Reference: Kinsey, M.J, Kinateder, M., Gwynne, S.M.V, “Burning Biases: Mitigating Cognitive Biases in Fire Engineering”, Interscience, Interflam Conference, 2019
BURNING BIASES:
MITIGATING COGNITIVE BIASES IN
FIRE ENGINEERING
Michael J. Kinsey*, Arup, China
Max Kinateder, National Research Council Canada, Canada
Steven M.V Gwynne, Movement Strategies, UK
ABSTRACT
Fire engineering has developed from a niche subject into a mainstream engineering discipline
employed within the building design process. Building fire codes are increasingly complex, commonly
comprising thousands of requirements regarding a wide range of topics that must be considered. In
general, fire engineers are required to possess increasingly complex knowledge about a variety of
subjects, along with expertise in their application in the building design process. This has been
magnified with the proliferation of performance-based methods using various computational tools. This
coupled with increased project performance pressures, raises the potential for errors in judgement.
Errors in judgement may be caused by limitations in a given resource (e.g. time, information available,
knowledge, etc) or neglect/over-focus on certain information (at the expense of other and more relevant
information) through cognitive biases. This paper initially provides a broad overview of general
decision-making including dual process theory, heuristics, and cognitive biases. Specific examples of
cognitive biases are presented which may potentially be linked to errors in a fire engineer’s decision-
making. This contribution considers several fire engineering decision contexts where cognitive biases
may exist which are associated with prescriptive fire code adoption/interpretation,
modelling/calculations, general fire engineering practice, and perceptions based on experience.
Proposed potential measures to mitigate some of these biases and prompt better decision-making are
discussed using methods of promoting awareness/conscious decision-making, incentives, nudges, and
procedures, based on research from related areas. It is suggested that the proposed mitigation measures
may potentially help to reduce the likelihood of errors in judgement caused by cognitive biases and
resources needed to rectify their consequences.
1. INTRODUCTION
The principal role of a fire engineer is to utilise scientific and engineering principles, fire codes, and expert judgement, based on an understanding of fire, buildings, and people. Fire engineers may be
involved in a broad set of tasks requiring a range of knowledge and highly specialised expertise.
Although a fire engineer may possess prerequisite expertise, there are many possible constraints one
might need to consider that might affect their performance. For instance, a fire engineer may be required
to develop the necessary skills/knowledge or be supported by other fire engineers (who already have
the necessary skills/knowledge) on a project in order to complete a given task. Indeed, where a fire
engineer is lacking in-depth technical expertise or experience about a given subject, they may rely on
the fire codes for guidance, potentially without a complete awareness of any underlying basis or
assumptions for the guidance. There may also be limited time in which a fire engineer is required to
conduct a task in order to meet project deadlines or limited available manpower to work on a given task.
Typically, many of these tasks have some level of uncertainty associated with them, and these uncertainties may interact in non-additive ways. There may be multiple ways to complete a given task and potentially multiple suitable outcomes,
all of which a fire engineer must contend with when assessing suitability. Assuming the possession of
the necessary knowledge and technical expertise to conduct such tasks, fire engineers may still make mistakes, suffer lapses in judgement, or adopt inconsistent assumptions, all of which can be caused by cognitive biases. This may also be compounded by the fire engineer being subject to routine personal issues including boredom, fatigue, illness, and situational and interpersonal distractions, all of which can affect
their ability to consistently perform and make suitable decisions [1]. Furthermore, with fires being rare
events, there is a lack of feedback in the building design process regarding failures in methods or
decision-making within the fire engineering field. Most designed buildings will never have a fire or if
they do, it will not be serious enough to truly test the full range of fire safety features. Such feedback is
critical to enhancing the decision-making process within fire engineering. This is contrary to many other
types of building engineering which are routinely put to the test in normal usage demonstrating the level
of their suitability.
The paper initially provides an overview of general decision-making and cognitive biases theory.
Potential cognitive biases which may exist within fire engineering are then presented. A range of
potential cognitive bias mitigation measures are then proposed. This paper is intended to identify some
means of reducing potential sources of mistakes and lapses in decision-making which fire engineers
may exhibit through cognitive biases within the building design process.
2. DECISION-MAKING AND COGNITIVE BIASES
Basic research in decision science proposes that decision-making can be thought of as if employing two separate, though interacting, systems of thinking: an automatic system and a reflective system [2-
5]. The automatic system occurs without a person being directly aware/conscious of it. It is used for the
large majority of decisions, is relatively quick, and is based on a person recognising given situations or types of situation and associating them with a stored response. Commonly it is adopted in familiar situations or where a quick response is required. The reflective system operates in conscious awareness and allows people to address situations in a more deliberative manner. It is more effortful to use, relatively slower, and limited by working memory. Commonly it is adopted in unfamiliar situations, where a task requires reflection, or where time is not considered a limiting factor [2-5].
Irrespective of which system of decision-making is used, the process is bounded in terms of information
available, time available, and an individual’s mental resources to process such information. To
compensate for such limitations and manage uncertainty or complexity with a decision, people may
employ heuristics in the decision-making process. Heuristics are defined as relatively simple rules of thumb [5] which can be applied to complex decisions where not all information is known and which can result in a suitable response. Such rules of thumb may be developed by creating associations between a given
situation and past experience of a learned response which results in the adoption of a given action based
on the association. These provide a simplified and typically sufficient response for a decision. The
development of a rule of thumb involves substituting the decision being made for a similar simpler
decision which can be used to produce an outcome. It may also be possible to prime an individual to
adopt heuristics through training to enable speedy responses to a situation; however, the heuristics may
not be appropriate if used outside of the intended set of scenarios, placing greater importance on the
pattern matching capability of the individual and the quality of the information available. Heuristics are
extremely common and can be used in a variety of situations. However, if employed inappropriately
(e.g. due to incomplete information) systematic errors can inadvertently be produced. This can be made
worse through the existence of cognitive biases where information is inappropriately processed or
overly focused upon at the expense of more relevant information in the decision-making process.
Cognitive biases can occur in a wide variety of scenarios ranging from economic/political forecasting
[5] to fire evacuations [6]. The extent to which a cognitive bias may negatively impact a decision can
also vary widely from having no consequential impact to having life threatening implications.
3. FIRE ENGINEERING COGNITIVE BIASES
The following table lists potential cognitive biases with examples which may exist within fire
engineering during the building design process. The list is derived from biases identified in the general
decision science literature and applied to fire engineering activities. The list is non-exhaustive and not
based on empirical data specific to fire engineering. As such, the extent to which these cognitive biases exist, how common they are, and what factors influence them is unclear. The cognitive biases are categorised according to the elements of the fire engineering process that they might affect: prescriptive fire code adoption/interpretation, modelling/calculations, general fire engineering practice, and perceptions based on experience. The extent to which the cognitive biases may influence an eventual building design may vary significantly and is likely to depend on the nature of the project and the fire engineer involved.
Table 1: Postulated Fire Engineering Cognitive Biases
# | Cognitive Bias Type | Specific Bias Example / Application to Fire Engineering

Prescriptive Fire Code Adoption/Interpretation Biases

1.1 | Familiarity bias [7,8] | Prescriptive Code Familiarity Bias: Believing that a building designed in accordance with familiar prescriptive fire codes is safer than one designed using performance-based methods which are less familiar, without consideration of the specific reasons or limitations of each method or how these relate to a specific project.

1.2 | Default bias [9] | Prescriptive Code Default Bias: Utilizing the values/requirements within a prescriptive fire code as the default because these have always been used on an individual's past projects, without consideration of suitability or underlying assumptions.

1.3 | Authority bias [7] | Prescriptive Code Authority Bias: Utilizing the values/requirements within prescriptive codes due to them being written by an authoritative governing body (or even in an authoritative manner), without consideration of suitability or underlying assumptions.

1.4 | Risk aversion bias [7] | Prescriptive Code Risk Aversion Bias: Choosing to follow prescriptive codes due to a belief that this will mitigate financial/legal/approvals risk (i.e. a belief in diffusion of accountability), without question of suitability or underlying assumptions.

Modelling/Calculation Biases

2.1 | Default bias [9] | Modelling Default Value Bias: Using default values within a model without question of suitability or underlying assumptions.

2.2 | Default bias [9] | Default Model Bias: Using a given model because a company or user has an existing affiliation/experience/obligation to use it, irrespective of the suitability of the model for a specific engineering application.

2.3 | Familiarity bias [7,8] | Model Familiarity Bias: Requiring a modelling analysis to be conducted using only a specific model which an individual is familiar with, without consideration of an alternative model which may be more suitable.

General Fire Engineering Practice Biases

3.1 | Probability neglect [7] | Fire Probability Neglect Bias: Believing a fire is more likely to occur due to only/mainly focusing on the consequences of a fire, giving little/no consideration to the likelihood of a fire actually occurring.

3.2 | Availability bias [7] | Past Fire Incident Bias: Overly focusing on assessing a given aspect of the building design, whilst neglecting other parts of the building design, because it was a major contributing factor in a recent major fire incident.

3.3 | Incentive bias [10] | Fire Engineer Financial/Time Incentive Bias: Conducting an assessment of a building fire safety design with little/less consideration of potential negative fire safety issues due to project financial/time factors.

3.4 | Confirmation bias [11] | Fire Safety Confirmation Bias: Confirming initial/existing thoughts that a building provides a sufficient level of fire safety by focusing on all the reasons why the building design is fire safe, whilst paying little/less attention to the factors that make the building design unsafe and the underlying assumptions required to reach a sufficient level of safety.

3.5 | Halo effect [7] | Fire Engineer/Consultancy/Researcher Halo Effect: Being inclined to trust the work of a given fire engineer/consultancy/researcher due to familiarity with them, leading to less scrutiny of their work.

3.6 | Authority bias [7] | Authority Staff Fire Bias: Being more likely to trust what a senior member of staff states due to them being in a position of authority, and neglecting the basis and content of their assertions.

3.7 | Status quo bias [12] | Meeting Status Quo Bias: Tendency to adhere to the current or accepted views/ideas of others in a meeting despite having a differing opinion.

3.8 | Optimism bias [13] | Project Planning Optimism Bias: Being inclined to think a future fire engineering project will not overrun in cost/time (i.e. that the performance levels are reached throughout the project) despite experience of similar projects suggesting otherwise.

3.9 | Sunk cost bias [5] | Fire Engineering Sunk Cost Bias: Continuing to conduct an item of fire engineering work which is clearly not working/failing due to being heavily invested/committed in terms of time/money.

Perceptions Based on Experience Biases

4.1 | Availability bias [7] | Fire Research/Experience Bias: Overly focusing on the fire safety aspects of a design with which an individual is familiar/has significant experience, whilst giving less consideration to other aspects of the building design.

4.2 | Survivorship bias [14,15] | Survivorship Fire Bias: Believing the current processes for conducting fire engineering in the building design process must result in a safe design because a fire has not occurred in the large majority of similar buildings, giving no/little consideration to the buildings where fires have occurred (i.e. overly focusing on the 'surviving buildings' which have never had fires).

4.3 | Hindsight bias [16] | Fire Incident Hindsight Bias: Believing a past fire incident was foreseeable and preventable based on post-incident fire analysis, neglecting information which was not known, or unlikely to be known, by the relevant stakeholders prior to the incident.

4.4 | Clustering illusion [7] | Fire Incident Correlation Illusion: Identifying a false pattern of failure from a series of past fires because the incidents share a particular attribute (e.g. building type), ignoring the impact and interaction of other factors present in each building, unduly simplifying the assessment of the incidents and exaggerating the impact of the attribute highlighted.
4. COGNITIVE BIAS MITIGATION MEASURES
Based on the fire engineering biases identified in the previous section, a series of mitigation
measures are now proposed to reduce the likelihood of such biases occurring in fire engineering and
their impact. These have been grouped into the following categories: promoting bias awareness and conscious decision-making, providing incentives, nudges, and procedural methods. These are derived
from cognitive bias mitigation measures identified in psychological literature with applications outside
of fire engineering. The measures are discussed in terms of potential effectiveness, applicability (which
biases it can potentially mitigate), method of implementation, and overhead cost (time/money/resources
involved with setting them up). No studies or empirical data have been collected regarding the extent to which such measures may reduce the likelihood or impact of cognitive biases specifically within fire engineering. As such, the extent to which these measures may influence cognitive biases within fire engineering requires further investigation. It should also be highlighted that the proposed methods are not exhaustive
and may not be appropriate to apply in certain situations. Furthermore, it may be possible to classify
certain methods into more than one category.
4.1 PROMOTING BIAS AWARENESS AND CONSCIOUS DECISION-MAKING
Being aware of cognitive biases and promoting conscious awareness of the steps involved in an
explicit thought process when making a decision (i.e., shifting the decision-making process from the
automatic system to the reflective system) has been shown to mitigate cognitive biases and poor
decision-making in general [17, 18]. Klein [19] broadly discusses the features of improving conscious
decision-making as including:
Awareness of requirements of learning process
Recognition of limitations of memory
Ability to appreciate perspective
Capacity for self-critique
Ability to select strategies
Promoting awareness of cognitive biases within fire engineering could provide a number of benefits
within the decision-making process. Fire engineers with effective decision-making skills would be
aware of common mistakes caused by cognitive biases and could also learn from past decisions where
cognitive biases may have occurred. They could recognize cognitive biases, for example in the design
process and appreciate different perspectives which may provide insight for a given decision. They
would be able to critique their own decisions to identify cognitive biases and adopt appropriate
strategies for complex decisions in order to mitigate them. The following are potential measures to
promote awareness and conscious decision-making in order to mitigate cognitive biases within fire
engineering:
Training: Providing training to communicate the existence of cognitive biases in fire
engineering and associated mitigation measures for making key decisions within the fire
engineering process.
Reduce time pressure: Leaving more time to think about a decision reduces reliance on the automatic system of thinking, which can result in overly quick decisions that may later need to be justified or reversed. For example, defer providing feedback regarding a fire engineering query/complex decision until after a meeting rather than giving a response in the meeting.
Verbalizing: Talking out loud or with others about why a given decision is being made may
promote conscious decision-making through the use of the reflective system.
Cognitive bias checklist: Adopting the use of a cognitive bias checklist within a decision-making process (e.g. evacuation modelling, fire strategy report development, etc.) to ensure potential biases are not overlooked.
4.2 INCENTIVES
Incentives refer to methods encouraging active thinking about a given problem through reward or
punishment. Providing incentives has been shown to promote better decision-making and potentially
reduce cognitive biases in certain settings [20]. Generally, the value of the incentive increases with the
associated value of identifying/correcting a given decision. Potential incentives to promote better
decision-making and mitigate cognitive biases within the fire engineering process may include:
Rewards: Providing financial rewards (e.g. bonus) or annual leave (e.g. leave early on a Friday)
when significant errors are identified in a project (e.g. “good catch reward”).
Group dynamics and token incentives: Grouping people into teams and providing points to
people in each team when errors are identified in projects. The team with the most points at
the end of a given time period is rewarded.
Recognition: Celebrating projects which went well and also highlighting projects which did not go well, along with lessons learnt, through internal company newsletters.
A key aspect of any incentive system within fire engineering is to clearly define what constitutes errors
in judgement in a given decision-making process. For fire engineering the incentives could be classified
according to the biases associated with given tasks shown in Table 1. It is also important to define a
standard relative measure of incentives in terms of how much incentive is provided for identification of a given bias: the larger the bias and its subsequent consequence, the larger the incentive should be. The specification of the incentive should be provided before or at the beginning of a project/item of work/working contract so that all stakeholders are informed and, ideally, formally agree to the incentives.
4.3 NUDGES
Nudges represent small, simple, and typically low-cost methods to (a) guide (increase the likelihood
of) a positive decision, rather than dictate the response; and (b) mitigate potential biases [20]. Nudges
would typically be adopted where fire engineers make decisions using the automatic system for relatively small or routine tasks, though they can also be used for decisions made using the reflective system or for larger tasks. Potential nudges which may be adopted within fire engineering to promote better decision-making
and mitigate cognitive biases are listed below:
Changing modelling defaults: Changing the default values in fire/evacuation/structural fire models to be highly conservative, in order to encourage users to change the values to be more realistic and subsequently justify their selection. If users do not change the defaults, then the simulation will be more conservative and so less likely to present a potential safety risk [21]. A code sketch illustrating this idea is given after this list.
Commitment devices: A commitment device is a way to oblige an individual into a given
action that they might not want to do but that is good for them [22]. Potential commitment devices
which may be adopted within fire engineering to promote better decision-making and mitigate
cognitive biases are listed below:
o Create larger obstacles to temptations than the temptation itself:
Pay fire engineering fees upfront before a project commences in order to
mitigate against financial bias.
Enforce minimum times to answer queries or to complete a project to
mitigate against not having enough time to fully assess or complete a detailed
analysis for a query/project.
o Have the fire engineer make a publicized commitment to an action so reputation
could be affected:
Conduct internal ‘third party’ reviews within an organization of a fire
engineering analysis with any mistakes anonymously shared with the wider
team.
Make all fire strategy reports public so they are open to public scrutiny.
o Make a monetary contract to increase the benefit of keeping a promise:
Include a bonus fee for a fire engineering project in a holding account which
is only released a given number of years after the building is built and shown
to present no fire safety issue inherent in the design.
Reframing Information: Reframing refers to changing the method by which information is
communicated between people to promote better decision-making and mitigate against
cognitive biases/misinterpretation. Potential methods of reframing within fire engineering
which could promote better decision-making and mitigate cognitive biases are listed below:
o Presenting all high fire risk aspects of a fire strategy nearer the beginning of a report to
ensure they are noticed by the relevant stakeholders.
o Considering severe but unlikely fire/evacuation scenarios which could cause the fire
strategy to fail (i.e. cause loss of life), in order to present the limitations of a given fire
strategy.
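To make the modelling-defaults nudge more concrete, the minimal Python sketch below shows one possible way an input module for a fire/evacuation model could ship deliberately conservative defaults and require a recorded justification whenever a value is relaxed. The parameter names, values, and class structure are illustrative assumptions only and are not taken from any existing tool.

```python
from dataclasses import dataclass, field

# Hypothetical, deliberately conservative default inputs for an evacuation model.
# Parameter names and values are illustrative only and not drawn from any real tool.
CONSERVATIVE_DEFAULTS = {
    "pre_movement_time_s": 600.0,   # long pre-movement time
    "walking_speed_m_s": 0.6,       # slow unimpeded walking speed
    "exit_availability": 0.5,       # assume half of the exits are unavailable
}


@dataclass
class ModelInputs:
    values: dict = field(default_factory=lambda: dict(CONSERVATIVE_DEFAULTS))
    justifications: dict = field(default_factory=dict)

    def override(self, name: str, value: float, justification: str) -> None:
        """Relax a conservative default only if a written justification is recorded."""
        if name not in self.values:
            raise KeyError(f"Unknown parameter: {name}")
        if not justification.strip():
            raise ValueError(f"A justification is required to override '{name}'.")
        self.values[name] = value
        self.justifications[name] = justification

    def report(self) -> str:
        """Summarise which inputs remain at their conservative defaults."""
        lines = []
        for name, value in self.values.items():
            note = self.justifications.get(name, "conservative default")
            lines.append(f"{name} = {value} ({note})")
        return "\n".join(lines)


# Usage: overriding a default without a justification raises an error,
# nudging the engineer toward a conscious, documented decision.
inputs = ModelInputs()
inputs.override("walking_speed_m_s", 1.2, "Able-bodied office population per project brief.")
print(inputs.report())
```

In this sketch the conservative default remains the path of least resistance, while relaxing it requires a deliberate, documented action, which is the intent of the nudge.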
Because nudges are voluntary and simple, the cost involved in their use is typically low. They can also be applied to a wide variety of decision types. Once a specific nudge has been developed, it is imperative that it be tested first on a sample of individuals to ensure it has the desired effect before being implemented on a large scale, i.e. across projects or organizations.
4.4 PROCEDURES
Procedural measures are those which are adopted as part of standard practice within an organisation to
carry out specific processes within fire engineering in order to mitigate cognitive biases. These can
potentially represent a wide range of activities involved in checking, communicating with peers,
providing feedback, or consistently performing a well-defined task given a decision under specified
conditions. Indeed, a number of the measures previously categorized in the preceding sections could
also be classified as procedural measures. Other potential procedural measures which may be adopted
within fire engineering to promote better decision-making and mitigate cognitive biases are listed below:
Modelling checklists: Adopting the use of quality assurance checklists within
evacuation/fire/structural fire modelling to ensure aspects of the modelling process are not
overlooked.
Encourage the questioning of senior staff: After a senior member of staff has stated a decision, asking junior members of staff to state what might be wrong with the idea and how it could go wrong. This creates a situation that prompts an individual to think about the issue and makes it socially acceptable for criticism to be made.
Automated checkpoints in software tools: Whenever a software user is required to make a decision, that decision should be made explicit and salient (e.g. a pop-up window asking "Do you want to use the default values?"). A sketch of such a checkpoint is given after this list.
Peer review input/checking: Requiring all fire engineering documentation to be initially checked by a peer, and requiring specific feedback (e.g. from the project manager) during the process as opposed to only at the end of the process.
Prescriptive code justification: Requiring a fire engineer to justify why adopting a
prescriptive code compliant aspect of design is suitable and under what conditions it may not
be suitable.
Third party fire strategy reviews: Requiring anonymised third-party reviews of fire strategy
documentation which is submitted to approving authorities. It should be noted that this already
exists within the fire engineering field but is not common practice for all projects and is
typically limited to where a certain aspect of the fire strategy contains specialised analysis e.g.
fire/evacuation/structural fire modelling.
Mandatory fire/evacuation modelling for large projects: Requiring all projects which
involve a large number of people to include mandatory fire/evacuation modelling assessments
to demonstrate all people can evacuate before untenable conditions occur rather than it be
assumed this is the case through compliance with prescriptive fire codes.
Pre-mortem meetings: The concept of pre-mortem meetings was developed by Klein [23].
Its broad principle is that when an important decision is being made, a meeting is held with
the relevant stakeholders. Within the meeting, a hypothetical situation is proposed where the
desired decision was selected, but after some time was found to be a total disaster. The people
in the meeting must then discuss the potential reasons why the choice was a disaster. They then consider how the disaster could occur by identifying the potential causes and how they might be mitigated. The group then decides if the choice should still be taken. Posing
such a hypothetical situation makes it more acceptable to freely discuss negative aspects of a
decision by reducing ‘group think’ whereby everyone just states positive views to support the
group. Pre-mortem meetings could be adopted in a wide range of important fire engineering
decisions including the following:
o Deciding to bid on a project (e.g. identifying the most likely tasks that may involve
costs or time overruns).
o Deciding to use performance-based analysis to justify a noncompliance with
prescriptive fire codes.
o Deciding whether to use a given material in the building design which could
significantly influence a fire.
Initial assessment: Thinking of a solution to a fire engineering problem before consulting
others for assistance in order to mitigate against anchoring to their views. Similarly, when
consulting the opinion of others about a fire engineering decision, avoiding discussing your
proposed solution or asking leading questions until they have given their thoughts, in order to mitigate against anchoring their views.
Devil’s advocate project review: Conduct project reviews whereby a fire strategy or analysis
is presented to colleagues who are not involved in the project to get general feedback. Some
of the colleagues are tasked with deliberately trying to find problems with the fire strategy to
mitigate against status quo bias.
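As a sketch of the automated-checkpoint measure above, the short Python fragment below shows one possible way a modelling tool could list every default value in use and ask for explicit confirmation before a run proceeds. The function names, parameter names, and console prompt are hypothetical assumptions; a real tool would more likely present the same checkpoint as a dialog window, but the principle of forcing an explicit, conscious decision is the same.

```python
def confirm_defaults(defaults: dict) -> bool:
    """Make the use of default values an explicit, conscious decision.

    Lists every default value that would be used and asks the user to confirm.
    Returns True only if the user explicitly types 'yes'.
    """
    print("The following default values will be used in this run:")
    for name, value in defaults.items():
        print(f"  {name} = {value}")
    answer = input("Do you want to use the default values? (yes/no): ")
    return answer.strip().lower() == "yes"


def run_simulation(inputs: dict) -> None:
    # Placeholder for the actual solver call in a real tool.
    print("Running simulation with:", inputs)


if __name__ == "__main__":
    # Hypothetical default fire inputs, for illustration only.
    defaults = {"heat_release_rate_kW": 1000.0, "soot_yield_g_per_g": 0.1}
    if confirm_defaults(defaults):
        run_simulation(defaults)
    else:
        print("Run cancelled: review and justify each input value before re-running.")
```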
All of these procedural measures can be clearly defined/documented and effectively disseminated within an organisation. However, implementation of and compliance with procedural measures would incur some cost in terms of the time needed to perform them, which would vary depending on the measure.
5. CONCLUSIONS
The paper presents a general description of decision-making and cognitive biases. Potential fire
engineering cognitive biases are presented along with a range of potential mitigating methods. It should
be highlighted that the insights presented within the paper come from a review of general decision science literature and from observations of the general fire engineering field and of users of fire engineering calculations/models. As such, many aspects are speculative/theoretical. This limited empirical basis
highlights the need for data to be collected regarding identification and prioritisation of the most
common/influential fire engineering cognitive biases along with associated mitigation measures.
Readers are therefore cautioned that whilst such fire engineering biases may exist within the field, the extent to which they occur is uncertain, and it should not be assumed that they always occur throughout the fire engineering field. The key objective of this article is to promote awareness of the existence of potential fire engineering cognitive biases and of opportunities for associated mitigation measures. Potential users who may benefit from such awareness include not only practicing fire
engineers, but also building developers, prescriptive fire code committees, evacuation/fire/structural
fire modelling developers, approving authorities, and fire engineering researchers/students.
DISCLAIMER
The research and contents expressed in this paper represent those of the authors and not necessarily those of the organisations with which they are employed/associated. The study was conducted entirely voluntarily, without any funding, to promote impartiality.
REFERENCES
1. Lewis, M. (2016). The undoing project: A friendship that changed the world. Penguin UK.
2. Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: a failure to disagree.
American psychologist, 64(6), 515.
3. Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In
Communication and persuasion (pp. 1-24). Springer, New York, NY.
4. Sun, R. (2001). Duality of the mind: A bottom-up approach toward cognition. Psychology Press.
5. Kahneman, D. (2011). Thinking, fast and slow (Vol. 1). New York: Farrar, Straus and Giroux.
6. Kinsey, M. J., Gwynne, S. M. V., Kuligowski, E. D., & Kinateder, M. (2019). Cognitive biases
within decision making during fire evacuations. Fire Technology, 55(2), 465-485.
7. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science,
185(4157), 1124-1131.
8. Park, C. W., & Lessig, V. P. (1981). Familiarity and its impact on consumer decision biases and
heuristics. Journal of consumer research, 8(2), 223-230.
9. Huh, Y. E., Vosgerau, J., & Morewedge, C. K. (2014). Social defaults: Observed choices become
choice defaults. Journal of Consumer Research, 41(3), 746-760.
10. Heath, C. (1999). On the social psychology of agency relationships: Lay theories of motivation
overemphasize extrinsic incentives. Organizational behavior and human decision processes, 78(1),
25-62.
11. Oswald, M.E., Grosjean, S. (2004). Confirmation Bias. In Pohl, Rüdiger F. (Eds.), Cognitive
Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory (pp. 79-96).
Hove, UK: Psychology Press.
12. Colman, A. (2003). Oxford Dictionary of Psychology. p. 77. New York: Oxford University Press.
Helweg-Larsen, M., & Shepperd, J. A. (2001). Do moderators of the optimistic bias affect personal or target risk estimates? A review of the literature. Personality and Social Psychology Review, 5(1), 74-95.
14. Taleb, N. N. (2007). The black swan: The impact of the highly improbable (Vol. 2). Random house.
15. Elton, E. J., Gruber, M. J., & Blake, C. R. (1996). Survivor bias and mutual fund performance. The
review of financial studies, 9(4), 1097-1120.
16. Bernstein, D. M., Erdfelder, E., Meltzoff, A. N., Peria, W., & Loftus, G. R. (2011). Hindsight bias
from 3 to 95 years of age. Journal of Experimental Psychology: Learning, Memory, and Cognition,
37(2), 378.
17. Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 1: origins of bias and theory
of debiasing. BMJ Qual Saf, 22(Suppl 2), ii58-ii64.
18. Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive
developmental inquiry. American psychologist, 34(10), 906.
19. Klein, G. A. (2017). Sources of power: How people make decisions. MIT press.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness (Revised and expanded ed.). New Haven, Conn.: Yale University Press.
21. Gwynne, S. M. V., & Kuligowski, E. (2010, July). The faults with default. In Proceedings of the
Conference Interflam2010, Interscience Communications Ltd: London (pp. 1473-1478).
22. Reeves, D., & Goel, S. (2010). How To Do What You Want: Akrasia and Self-Binding. The Messy
Matters weblog, URL: http://messymatters.com/akrasia/, Accessed on: 04/04/2019
23. Klein, G. (2007). Performing a project premortem. Harvard business review, 85(9), 18-19.