Crew Resource Management – A Concept for Teams in Critical Situations
Benedict Gross1
1 Munich, Germany. E-mail:
Background. It seems unbalanced: much is known about training and optimal teamwork in civil and military aviation. This sector is famous for high-reliability operations, clear and efficient communication, roles and processes that support decision makers and teams, and systems designed with human factors in mind. In contrast, comparable civil sectors, notably medicine and crisis command units, are only slowly discovering the need for and benefits of specialised team management approaches for units operating under high stress, where possible errors have severe consequences.
Aim. To describe the Crew Resource Management (CRM) approach as developed in aviation in a generalised way that allows its adaptation and implementation in other high-stakes sectors.
Methods. The effects of human error in complex socio-technical systems are explained as the foundation for the application and limitations of CRM. Core elements of CRM are then summarised based on a meta-analysis of evaluative studies, relevant CRM literature, and corresponding insights from psychology and behavioural science. Limitations are discussed, such as intercultural barriers and surrounding organisational requirements.
Conclusion. CRM concepts can be described at a high but practical level that makes them available to many other fields of application. Medical teams and crisis command units at all levels in particular can profit when they are properly trained not only in their technical duties but also in team management skills. Immediate benefits can be higher reliability of decisions, a reduced error rate, and the long-term preservation of precious human resources.
Keywords: Crew Resource Management, Safety Culture, Human Error, Disaster Management, Decision Making, Leadership,
Professional Communication.
Situations may arise where a team is forced to function under the worst possible circumstances, where mistakes could lead to disastrous consequences. Cockpit crews in aviation, medical teams in operating theatres, emergency response teams, and decision makers during national crisis scenarios are just a few examples. Whereas professional aviators undergo regular training in flight simulators, such concepts are relatively new to healthcare and crisis command staff. The chance that a crisis command team has ever practised together before a natural disaster occurs is in fact very low.
This article explains the principles of Crew Resource Management (CRM), a concept originally developed for cockpit crews and now adopted by other high-reliability or high-criticality branches. The aim is to uncover the underlying principles of CRM and make them available to other environments and branches. The question is: how do we train crisis staff to enable them to unlock their full potential even under the most strenuous conditions?
1.1 History of CRM
The origins of CRM can be traced back to the summer of 1979, when a NASA workshop was held in San Francisco to investigate the root causes of fatal accidents in aviation. Statistics for airplane accidents from 1959 through 1989 indicated that flight crew actions were the primary cause of more than 70% of worldwide accidents involving aircraft damage beyond economic repair; for instance, when a whole crew was distracted by the failure of a landing gear indicator light and failed to notice that the automatic pilot was disengaged and the aircraft was descending into a swamp (Helmreich & Foushee, 2010).
The introductory speech to the NASA workshop, Social Psychology on the Flight Deck (Helmreich, 1980), was given by Robert Helmreich, a former marine officer and psychologist. This is a remarkable fact, as psychology was perceived as closer to religion than to science at the time. Helmreich nonetheless convinced the audience of the importance of addressing the flight crew's social interactions as a cause of incidents. In the aftermath, a new interdisciplinary approach to safety research emerged. It combined insights from many prominent scientists, such as Charles Perrow and his Normal Accident Theory (Perrow, 1999) about the inevitable risks of cascading failures in complex and tightly coupled systems (e.g. nuclear plants), or James Reason's work on Organisational Accidents (Reason, 1997) and Human Error (Reason, 1990). Psychology and behavioural science were influential as well, for instance Amos Tversky and Daniel Kahneman's experiments on heuristics and biases in decision making (Tversky & Kahneman, 1974) and Gary Klein's exploration of naturalistic decision making and gut feelings (Klein, 1998).
Such research about the nature of human behaviour and its interaction with complex systems, paired with much practical experience and trial and error, came together in a new notion of safety culture. Consequently, special training formats addressing Crew Resource Management (CRM) were developed and became part of the mandatory training schedule of professional airline pilots. Since then, aviation has established a safety culture with an outstanding effect: in 2012 there were only 15 fatal accidents on 22.6 million flights worldwide, or 414 total fatalities among 2.97 billion passengers travelled (IATA, 2013, p. 14). Numbers for 2013 appear to be significantly lower still (Ranter, 2014). This sector is thereby a classic example of high-reliability organisations, a term coined for systems that operate in hazardous conditions but incur fewer adverse events than might be expected (e.g. Reason, 2000). How remarkably low the number of fatal accidents in aviation is can be demonstrated by a comparison with the German health care system: despite being a highly professionalised sector, it is still estimated that 0.1% of all patients die as a consequence of avoidable medical errors (Fischer et al., 2007). This amounts to an absolute number of 17,000-19,000 fatalities per year due to human errors in this country alone, compared to 414 fatalities overall in commercial aviation worldwide.
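As a sanity check on these figures, the implied per-person risks can be compared with a few lines of arithmetic. The German inpatient volume below is an assumption inferred from the cited numbers (0.1% of patients corresponding to 17,000-19,000 deaths implies roughly 17-19 million cases per year); it is not taken from the original sources.

```python
# Back-of-the-envelope comparison of the fatality figures cited above.
# The caseload of ~18 million is an assumed mid-point, inferred from the text.

aviation_fatalities = 414      # worldwide fatalities, 2012 (IATA, 2013)
passengers = 2.97e9            # passengers travelled, 2012

medical_error_rate = 0.001     # 0.1% of patients (Fischer et al., 2007)
inpatient_cases = 18e6         # assumed: ~17-19 million cases/year in Germany

aviation_risk = aviation_fatalities / passengers       # risk per passenger
medical_deaths = medical_error_rate * inpatient_cases  # avoidable deaths/year

print(f"aviation risk per passenger: {aviation_risk:.1e}")      # ~1.4e-07
print(f"avoidable medical deaths per year: {medical_deaths:,.0f}")
print(f"risk ratio (medicine/aviation): {medical_error_rate / aviation_risk:,.0f}")
```

The comparison is crude (per-passenger versus per-patient risk), but it illustrates the orders of magnitude the text contrasts.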
1.2 The Effect of Human Factors in Complex Socio-Technical Systems
Maybe the best way to frame CRM concepts is James Reason's theory of organisational accidents (Reason, 1997). Most high-reliability environments are complex socio-technical workplaces, where people and technical systems interact with a risk of failure and system breakdowns. Safety systems in companies and institutions are therefore designed with several barriers or levels of defence. These can be physical measures (e.g. colour-coded connectors and backup power systems), procedural ones (e.g. double checks and clearance processes), or organisational considerations (e.g. the separation-of-duties paradigm and the rotation principle in working shifts). In reality, unfortunately, all of these safety barriers have weak points, and the best picture to describe them is commonly not a strong and impervious wall but rather a slice of Emmental cheese with several holes in it. These holes emerge either from latent mistakes in the organisation or from active mistakes of single individuals, not intentional of course. Even if several barriers and redundant lines of defence exist to mitigate this risk, there is a chance that the holes in all layers may one day accidentally align. When an adverse event happens in such a moment, its consequences can pass all barriers unimpeded and lead to a catastrophic impact. Reason calls this his Swiss Cheese Model (see Figure 1).
Fig. 1: An accident trajectory passing through corresponding holes in the layers of defences, barriers and safeguards: the “Swiss Cheese Model” (Reason, 1997). The holes can be created by active and latent failures. In a moment when the holes in all barriers align, an adverse event can pass all levels of defence unimpeded and unfold a catastrophic impact.
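The model's core intuition, that several individually imperfect barriers give strong combined protection yet never reduce the residual risk to zero, can be illustrated with a minimal probabilistic sketch. The per-layer failure probabilities below are invented for illustration; real barriers are neither independent nor this neatly quantified.

```python
# Minimal sketch of the Swiss Cheese Model: harm occurs only if the "holes"
# in every defence layer line up, i.e. every barrier fails at once.
# Assumes independent layers with illustrative failure probabilities.

def residual_risk(layer_failure_probs):
    """Probability that an adverse event passes ALL barriers."""
    risk = 1.0
    for p in layer_failure_probs:
        risk *= p
    return risk

# Four barriers, each with a 5% chance of failing when challenged:
print(residual_risk([0.05, 0.05, 0.05, 0.05]))  # ~6.25e-06: rare, never zero
```

Latent organisational weaknesses correspond to raising one of these per-layer probabilities, which multiplies directly through to the residual risk.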
The origins of such weak spots and holes in defence barriers can be traced back through several organisational levels, as Figure 2 demonstrates. Individual unsafe acts happen on the foundation of local and organisational factors (red arrows), and their reasons can be traced backwards again (white arrows). A human error is commonly not a singular mistake but often enough a product of the system the actor is working in. Whereas CRM trainings address the top of this pyramid (collaboration and decision making of teams in critical situations), they must always be embedded in a context of organisational safety culture with all underlying levels to be beneficial towards higher reliability and a reduction of organisational accidents.
Fig. 2: Stages in the development and investigation of an organisational accident according to Reason (1997). The rectangular box represents how an event leads to an adverse outcome after several defence barriers fail (see Swiss Cheese Model). The triangular shape below represents the system producing it on three levels: the person (unsafe acts), the workplace (error-provoking conditions), and the organisation. Latent conditions can arise from all of these levels and become manifest as weak spots in a system's barriers, as opposed to active failures of individual humans.
Reason's theory about organisational accidents and human error provides a context for CRM and also marks some limitations. CRM is about error management, but it needs to be implemented in a culture of safety and professionalism to unfold its benefits. The organisational model is generic and can be applied to most types of institutions.
It might sound paradoxical to acknowledge human fallibility as the main source of fatal accidents and then trust the same people as the last line of defence when it comes to mitigating critical incidents, but this is what CRM is about. The approach is threefold: 1. avoiding errors, 2. trapping incipient errors before they are committed, and 3. mitigating the consequences of errors that occur and are not trapped (Helmreich, Merritt, & Wilhelm, 1999). But the definition of what is “inside” CRM and how to train it varies widely between branches; a common standard has never been agreed upon. With regard to the main influencing theories and with the evolution of CRM in mind, some core elements can nevertheless be identified. The table below gives an overview. It has been compiled based on a meta-analysis of 28 evaluative studies (Salas, Wilson, Burke, & Wightman, 2006), relevant CRM literature, and corresponding insights from psychology and behavioural science. The core elements are grouped into team competencies, communication competencies and theoretical background knowledge.
Table 1: Core elements of Crew Resource Management
Team Competencies
Leadership: Ability of a team leader to direct and coordinate the activities of team members; encourage team members to work together; assess performance; assign tasks; develop team knowledge, skills, and abilities; motivate; plan and organise; and establish a positive team atmosphere. (Salas et al., 2006)
“Effective leaders delegate so that they can regulate” (Sexton, 2004)
Workload Management
Mutual Performance Monitoring: Ability of team members to accurately monitor other team members' performance, including giving, seeking, and receiving task-clarifying feedback. (Salas et al., 2006)
Backup Behaviour and Distribution of Workload: Ability of team members to anticipate the needs of others through accurate knowledge about each other's responsibilities, including the ability to shift workload between members to create balance during periods of high workload or pressure. (Salas et al., 2006)
Adaptability: Ability of team members to alter a course of action or adjust strategies when new information becomes available. (Salas et al., 2006)
Communication Competencies
Some golden rules of group interaction and professional communication in high-risk environments (Sexton, 2004):
Speak simply: use small words, articulate simple thoughts, and ask simple questions.
Use standardised phraseology.
Talk about problems (high performers devote more time to “problem-solving” communications).
Maintain an environment of open communication and stay calm during high workload.
Closed Loop Communication: A communication technique to avoid misunderstandings: the sender gives a message, the receiver repeats it back, and the sender confirms whether it was understood correctly. This closes a loop that assures a message or command has been transmitted and understood.
Structured Briefings: Professional briefings follow a clear and commonly agreed structure, e.g. SBAR: Situation, Background, Assessment, Recommendation. This ensures the integrity of transmitted information and economy of time, and contributes towards a shared situational awareness.
Self-reflection and problem orientation: Talk about task-related problems. High performers devote more time to “problem-solving” communications. (Sexton, 2004) A constant critical re-evaluation of one's own situational awareness and the measures taken can also help to de-bias teams and identify wrong assumptions.
Mission Evaluation: Critical mission evaluation to assess room for improvement as well as success criteria, and to preserve best practices. This can cover individual self-reflection, teamwork, and organisational factors. The aim is continuous improvement and organisational learning, but it can also have psycho-hygienic effects after demanding situations (compare CISM debriefings: Everly & Mitchell, 1997).
(continued) Table 1: Core elements of Crew Resource Management
Theoretical Background Knowledge
(Shared) Situational Awareness
Situational Awareness describes the function of human perception and its interpretation that leads us to a mental model of a situation (see e.g. Stanton, Chambers, & Piggott, 2001). This becomes an important concept in situations of high uncertainty, when information is contradictory and changing fast.
Shared Situational Awareness describes the ability of a team to develop a common mental model of a situation and is important for coordinating efforts and decisions.
Decision Making
Decision making is a crucial process in crisis situations. Rational decision-making theories provide acronyms that guide through a structured process; research indicates, in contrast, that human decision behaviour follows inherent and irrational mechanisms.
Decision-making cycles provide structures for rational decisions, e.g.:
OODA: Observe, Orient, Decide, Act
PDCA: Plan, Do, Check, Act
FOR-DEC: Facts, Options, Risks & Benefits, Decision, Execution, Check
Research about decision making indicates, however, that humans do not follow rational processes. The heuristics-and-biases school (e.g. Tversky & Kahneman, 1974) found patterns in human thinking that bias decisions. The gut-instinct school (e.g. Klein, 1998), in contrast, identified instincts and non-rational feelings as a source of good decisions.
Even though human decision-making processes tend to be biased, or at least complex, rational cycles like OODA can help teams in stress situations to double-check decisions or provide guidance for structured assessments under high pressure and uncertainty.
Human Error
Human (unintentional) error follows a typology:
Skill-based slips and lapses are unintended failures of executing actions. They occur due to attentional or perceptual failures (slips) or due to failures of memory (lapses).
Mistakes originate from higher levels of mental processing, such as assessing information, planning, formulating intentions, and judging likely consequences. Rule-based mistakes happen when a normally good rule is wrongly applied; knowledge-based mistakes occur when we run out of pre-packaged solutions or rules and have to think out solutions on the spot, which is highly error-prone. (Reason, 1997, 2008)
Personal Resources and Stress
Both human physiology and psychology have several aspects that necessitate some economy of personal resources to keep them working during a longer crisis situation.
Attention, for instance, is a cognitive resource that becomes limited under stress. Fixation errors or inattentional blindness occur when attention sticks to a single topic, ignoring other, possibly more threatening issues.
Stress leads to physical consequences limiting personal capacity. Whereas in crisis situations stress is an inevitable circumstance and mild forms can even be beneficial (Combs & Taylor, 1952; Vasile, 2010), people in highly skilled, high-demand teams need to be equipped with strategies for coping with stress and with recovery strategies to restore their capacities after the crisis has ended.
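To make concrete how a decision cycle such as FOR-DEC from the table can be drilled into a repeatable routine, the sequence can be sketched as an explicit loop. All function names and scenario data below are hypothetical, chosen only to mirror the Facts, Options, Risks & Benefits, Decision, Execution, Check sequence; they do not come from the CRM literature.

```python
# Hypothetical sketch of the FOR-DEC cycle as an explicit routine:
# Facts -> Options -> Risks & Benefits -> Decision -> Execution -> Check.

def for_dec(gather_facts, generate_options, rate_option, execute, check):
    facts = gather_facts()                               # F: what do we know?
    options = generate_options(facts)                    # O: what could we do?
    rated = {o: rate_option(o, facts) for o in options}  # R: risks & benefits
    decision = max(rated, key=rated.get)                 # D: choose best-rated
    outcome = execute(decision)                          # E: carry it out
    return check(decision, outcome)                      # C: verify, else re-enter

# Toy run for a simulated incident:
result = for_dec(
    gather_facts=lambda: {"severity": "high"},
    generate_options=lambda facts: ["evacuate", "contain"],
    rate_option=lambda option, facts: 2 if option == "contain" else 1,
    execute=lambda decision: f"executed {decision}",
    check=lambda decision, outcome: (decision, outcome),
)
print(result)  # ('contain', 'executed contain')
```

The value of such a cycle in training is not the scoring logic but the enforced order: facts are gathered before options are generated, and an explicit check step sends the team back into the loop when the outcome is unsatisfactory.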
Figure 3 is a visualisation of the core elements of CRM as described in the table above. The classification into 10 elements can only be an approximation and must be arbitrary to some degree, given the multifarious nature of this subject. While all of the CRM methods can be deduced from the theoretical elements, the team and communication elements are interlinked among themselves, overlapping and interdependent. So what is CRM? “CRM is one of an array of tools that organisations can use to manage error” (Helmreich et al., 1999, p. 9), or, as Salas and colleagues summarised it, a “family of instructional strategies designed to improve teamwork” (1999, p. 163).
Fig. 3: Core Elements of Crew Resource Management. Theoretical background knowledge is both the foundation of practical CRM methods and content of CRM trainings. The team and communication competencies are interlinked and overlapping.
Up to this point, CRM has been outlined as an industry-neutral blueprint. To apply it in practice, e.g. in healthcare or disaster management, it is still a bit too abstract in this form. Methods and training content should therefore be derived from these core elements, tailored to the target group, and drilled down to a trainable format. The challenge for implementation in an organisation is to translate the elements into practical methods and training concepts, for instance finding an appropriate decision cycle, briefing agenda, mission evaluation format, and leadership style. Medical staff might need different briefing agendas and decision loops than military staff or aviation crews. The level of depth in which the theory is presented should be adjusted to the audience.
CRM trainings are usually facilitated in high-fidelity simulator environments such as flight or patient simulators. This enhances learning effects, gives participants the perception of their real and complex working environment, and allows the integration of CRM with technical skills training. Some of the training content, such as psychological background knowledge, can be taught in class sessions, but most of the education will happen in practical simulator exercises, where the self-reflection and group learning of participants can be fostered.
A high-fidelity simulator is of course favourable and allows integrating technical skills training with CRM in the same simulation, thereby generating welcome synergies. Nonetheless, an effective CRM training situation can also be achieved with very simple means. It should always be kept in mind that CRM is about human interaction in teams, about decision making, leadership, and human error, not so much about entertainment. The simulator is mainly a tool to support the participants' training intensity and success, and simulations must be designed wisely to serve this purpose and utilise the simulation environment efficiently.
Table 2 gives an idea of how simulator training content and learning objectives can be designed. Technical scenarios are integrated with CRM skills, e.g. a cardiac arrest emergency in which the team's workload management abilities are challenged through simulated complications. After each scenario session, participants evaluate their performance and define learnings with the help of video recordings and observers' feedback.
Also indicated by Table 2 is the fact that transferring such integrated simulation trainings to other professional environments will require the most adaptation effort in the technical skills and scenarios. The CRM part will stay generic to a great extent but still needs adaptation to the new field.
Table 2: Sample design of simulator training for health care professionals

Classroom Lessons
Technical Skills: Clinical pictures of e.g. obstetric emergencies, or others; standard treatment algorithms; briefing on available medical equipment; specific emergency procedures for the unit.
CRM Skills: CRM introduction and overview; theoretical background on Human Error, Decision Making, (Shared) Situational Awareness, and Personal Resources and Stress.

Simulator Training
Technical Skills: Scenarios are designed both to simulate medical emergency situations that foster technical education and to stimulate team dynamics that promote or require CRM. Examples of different scenarios are: cardiac arrest, obstetric emergencies.
CRM Skills: Team and communication competencies are trained in simulator sessions; the participants' ability to analyse and self-reflect on team performance is enabled. Team competencies: Workload Management. Communication competencies.

Learning Objectives
Technical Skills: Participants have the up-to-date knowledge and abilities to detect and treat emergencies as taught in the training. They are familiarised with the available equipment and specific workplace environment as well as algorithms and standard operating procedures.
CRM Skills: Participants are aware of the implications of human factors. They have methods at hand to cope with stress situations and unlock the full potential of a team in such conditions. Techniques for self-reflection and de-briefings enable them to assess, de-bias and continuously improve their teamwork.
As demonstrated earlier in this article, CRM is embedded in the safety culture of an organisation and should always be implemented with respect to its characteristics. The most comprehensive research regarding differences between national and organisational cultures comes from Geert Hofstede and his comparative studies of cultural dimensions (Hofstede, Hofstede, & Minkov, 2010). A national healthcare system or disaster management organisation will inevitably be closely related to the national culture. Some of the cultural dimensions are strongly relevant for the design of CRM systems: introducing Standard Operating Procedures (SOPs) in cultures low on the Uncertainty Avoidance scale will be a challenge, and the maxim of flat hierarchies, often mentioned in the context of CRM, might not be appropriate for organisations high on the Masculinity scale. The acceptance of group decision-making processes will depend on the Individualism value, and leadership models and communication principles have to be developed with the Power Distance in mind, for instance.
Figure 4 explains four of Hofstede's cultural dimensions with obvious relevance for CRM. It also shows the scores for selected countries: the United States, Germany, and Great Britain as countries where nearly all of the CRM research currently originates, Russia and China where only little interest in CRM is known, and Iran (where this paper was initially presented) in comparison. The values underline the divergence of national cultural profiles and the importance of individualising and tailoring CRM concepts for every organisation. The limitation is a logical consequence: CRM is not a one-size-fits-all approach or an “off the shelf” product. It must always be tailored to the organisational and national cultures first. Important differences will exist, e.g. between private profit-orientated companies and public ones, or between small and large organisations. University hospitals are different from public basic health care institutions, and crisis management also depends on whether it takes place at federal or state level, at fire brigades or police forces, in an administrative, military, or civilian context, public or privately dominated.
Fig. 4: Cultural tendencies influencing CRM (Hofstede et al., 2010)
Explanations of cultural dimensions (Johnston, 1993):
Uncertainty Avoidance (uai). High UAI tends to generate rigidity and strong adherence to the formality of rules and
regulations. High UAI is associated with social acceptance of aggressive behaviour, strong task orientation, ritual behaviour
and written rules. If formal goals are not achieved, there is potential for dispute over the “rules” or rule adherence, rather than
about the operational objectives to which the rules are directed. Alternatively, low UAI could conceivably be associated with
a permissive and indulgent approach to rules and standard operating procedures.
Masculinity (mas). This relates to beliefs regarding the gender division of social roles. High MAS societies tend to have a
belief in the “independent decision-maker,” and leaders value their decision-making autonomy. Decisiveness, interpersonal
directness and machismo are common in high MAS societies. Again, there are implications here for CRM.
Individualism (idv). Individualism and collectivism are intimately associated with the predominant value systems of society
at large. In high IDV societies, individual decision making is normal and preferred to group decisions. Individual initiative
and leadership are highly valued. In low IDV societies, group decisions are felt to be better than decisions made by
individuals. Personal initiative is not encouraged. Social identity and position are determined by membership in various in-
groups. Here the implications for CRM should be obvious.
Power Distance (pdi). The degree to which the less powerful members of a society accept and expect that power is distributed unequally. In high PDI countries, junior crewmembers are more likely to fear the consequences of disagreeing with leaders, possibly with good reason. Leaders are themselves likely to feel comfortable with paternalistic behaviour and leadership by directive. Leaders, rather than followers, will tend to initiate communication. This is a useful reminder that leadership does not exist in a social vacuum: it is directly related to the incumbent social embodiment of “followership” or “subordinateship.” Unresponsiveness to legitimate authority could arise with low Power Distance, especially if associated with high Individualism.
Another limitation and trap is to reduce CRM to a mere training product that can be consumed in isolation. As argued earlier, CRM can be part of a safety culture but does not substitute other measures such as non-punitive error and incident reporting systems, leadership development, technical and organisational safety design, and many more. A key success factor is to embed it in quality and risk management efforts. Once implemented, CRM should become part of a regular technical skills training schedule, as the aviation industry has practised for years. CRM is not a one-time product: skills have to be refreshed and simulations repeated on a regular basis. Applied properly, simulation trainings can serve an additional purpose by familiarising new employees with a new and complex work environment and reducing the opportunity costs of on-the-job training.
A third and important limitation must be kept in mind when looking with envy at aviation's safety culture and the seeming perfection of cockpit crew collaboration processes: medicine and disaster management are more complex than flying an airplane. Helmreich, for instance, advocates that CRM concepts have to be adapted very carefully for the medical sector because “this is a milieu more complex than the cockpit, with differing specialities interacting to treat a patient whose condition and response may have unknown characteristics” and “aircraft are more predictable than patients” (Helmreich, 2000). Large airlines can afford to train staff regularly, the crewmembers are all employed by the same company, roles and positions are clearly defined, and the vast majority of operations are predefined and standardised. These are definitely not the preconditions we find in medicine and disaster management, where staff constellations change constantly, qualifications vary widely, and situations are usually dynamic, opaque and high on uncertainty. Aviation can be a role model and source of inspiration, but CRM concepts should not be imitated hastily and without proper tailoring.
This article went back to the history of CRM and developed a core concept of CRM based on the original research and theoretical contributions. CRM evolved through several evolutionary stages in different industries and was, fortunately, always driven by practical needs. Ten core elements of CRM have been deduced and limitations have been discussed, especially the national and organisational culture as important influences on the application of CRM concepts and the need to embed CRM in conjunction with further organisational safety measures.
Key messages of this article are:
CRM is part of organisational safety culture and must therefore be approached as a high-priority management topic.
CRM is about human factors as a source of error, but also about humans as the best resource to avoid and mitigate errors and incidents.
It is therefore not restricted to a single branch or profession but can unfold its benefits in all safety-critical or high-stakes environments.
Individual tailoring of CRM concepts for a specific organisation is necessary.
The content of simulation-based trainings can be divided into technical and CRM skills and competencies. It is crucial to clearly define both and then integrate them into training scenarios.
The national and organisational culture has a strong influence on the design of CRM concepts.
CRM cannot be applied while ignoring surrounding organisational safety measures. It would be a severe mistake to implement simulator training without analysing and addressing surrounding processes and constraints.
CRM training has to be refreshed on a regular basis and integrated in technical skills training.
Benedict Gross is a consulting expert for project and crisis management, oscillating between practice and science. Clients include banks, insurance companies, and healthcare organisations, but also football clubs and others. His current field of research is patient safety and safety culture in health care systems.
After graduating in Business Law and being certified as Project Manager (IPMA) and Project Management Professional (PMI), he completed master's studies in Disaster Management (University of Bonn) and Forensic Psychology (University of Liverpool) and holds the Certified Emergency Manager (IAEM) credential.
Corresponding address: Augustenstr. 53, Munich (Germany);