Crew Resource Management – A Concept for Teams in Critical
1 Munich, Germany. E-mail: email@example.com
Background. It seems unbalanced: much is known about training and optimal teamwork in the domain of civil and military
aviation. This sector is famous for high-reliability operations, clear and efficient communication, roles and processes that
support decision makers and teams, and systems that are designed with respect to human factors. By contrast, comparable civil
sectors, namely medicine and crisis command units, are only slowly discovering the need for and benefits of specialised team
management approaches for units that operate in high-stress situations where possible errors have severe consequences.
Aim. To describe the Crew Resource Management (CRM) approach as developed in aviation in a generalised way that
allows its adaptation and implementation in other high-stake sectors.
Methods. The effects of human errors in complex socio-technical systems are explained as the foundation for the application
and limitations of CRM. Core elements of CRM are then summarised based on a meta-analysis of evaluative studies, relevant
CRM literature, and corresponding insights from psychology and behavioural science. Limitations are discussed, such as
intercultural barriers and surrounding organisational requirements.
Conclusion. CRM concepts can be described at a high but practical level that makes them available to many other fields of
application. Medical teams and crisis command units at all levels in particular can profit when they are properly trained not
only in their technical duties but also in team management skills. Immediate benefits are more reliable decisions, a
reduced error rate, and the preservation of precious human resources in the long term.
Keywords: Crew Resource Management, Safety Culture, Human Error, Disaster Management, Decision Making, Leadership,
1. INTRODUCTION
Situations may arise where a team is forced to function under the worst possible circumstances, where mistakes could lead to
disastrous consequences. Cockpit crews in aviation, medical teams in operating theatres, emergency response teams, and
decision makers during national crisis scenarios are just a few examples. Whereas professional aviators undergo regular
training in flight simulators, such concepts are relatively new to healthcare and crisis command staff. The chance that a crisis
command team has ever practised together before a natural disaster occurs is in fact very low.
This article explains the principles of Crew Resource Management (CRM), a concept originally developed for cockpit crews
and now adopted by other high-reliability or high-criticality branches. The aim is to uncover the underlying principles of
CRM and make them available to other environments and branches. The question is: how do we train crisis staff to enable
them to unlock their full potential even under the most strenuous conditions?
1.1 History of CRM
The origins of CRM can be traced back to summer 1979 when a NASA workshop was held in San Francisco in order to
investigate the root causes of fatal accidents in aviation. Statistics for airplane accidents from 1959 through 1989 indicated that
flight crew actions were the primary cause of more than 70% of worldwide accidents involving aircraft damage beyond
economic repair; for instance, when a whole crew was distracted by the failure of a landing gear indicator light and failed to
notice that the autopilot was disengaged and the aircraft was descending into a swamp (Helmreich & Foushee, 2010).
The introductory speech to the NASA workshop, Social Psychology on the Flight Deck (Helmreich, 1980), was given by
Robert Helmreich, a former marine officer and psychologist. This is remarkable, as psychology was perceived as closer to
religion than to science at the time. Helmreich nonetheless convinced the audience of the importance of addressing the
flight crew's social interactions as a cause of incidents. In the aftermath, a new interdisciplinary approach to safety research
emerged. It combined insights from many prominent scientists, like Charles Perrow and his Normal Accident Theory (Perrow,
1999) about the inevitable risks of cascading failures in complex and tightly coupled systems, e.g. nuclear plants, or James
Reason's work on Organisational Accidents (Reason, 1997) and Human Error (Reason, 1990). Psychology and
behavioural science were influential too, for instance Amos Tversky and Daniel Kahneman's experiments on Heuristics and
Biases in decision making (Tversky & Kahneman, 1974) and Gary Klein's exploration of Naturalistic Decision Making and
gut feelings (Klein, 1998).
Such research on the nature of human behaviour and its interaction with complex systems, paired with much practical
experience and trial and error, came together in a new notion of safety culture. Consequently, special training formats
addressing Crew Resource Management (CRM) were developed and became part of the mandatory training schedule of
professional airline pilots. Since then, aviation has established a safety culture with an outstanding effect: in the year 2012 only
15 fatal accidents happened on 22.6 million flights worldwide, or 414 total fatalities among 2.97 billion passengers travelled
(IATA, 2013, p. 14). Numbers for 2013 seem to be even significantly lower (Ranter, 2014). This sector is thereby a classic
example of high-reliability organisations, a term coined for systems that operate in hazardous conditions but
incur fewer adverse events than might be expected (e.g. Reason, 2000). How remarkably low the number of fatal
accidents in aviation is becomes apparent in a comparison with the German healthcare system: despite being a highly
professionalised sector, it is still estimated that 0.1% of all patients die as a consequence of avoidable
medical errors (Fischer et al., 2007). This amounts to an absolute number of 17,000 to 19,000 fatalities per year due to human
errors in this country alone, compared to 414 fatalities overall in commercial aviation worldwide.
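The scale of this gap can be checked with back-of-the-envelope arithmetic. The short sketch below recomputes both rates from the figures cited above; note that the number of treated patients is only implied by those figures (17,000 fatalities at a 0.1% rate), not stated in the text.

```python
# Back-of-the-envelope comparison of the two fatality rates cited above.
# Passenger and fatality figures are from the text (IATA, 2013; Fischer et
# al., 2007); the implied number of hospital cases is derived, not quoted.

aviation_fatalities = 414        # worldwide commercial aviation, 2012
passengers = 2.97e9              # passengers travelled, 2012
aviation_rate = aviation_fatalities / passengers

medical_rate = 0.001             # 0.1% of patients (Fischer et al., 2007)
implied_patients = 17_000 / medical_rate   # from the lower bound of 17-19,000

print(f"aviation: {aviation_rate:.2e} fatalities per passenger")
print(f"medicine: {medical_rate:.2e} fatalities per patient")
print(f"ratio:    {medical_rate / aviation_rate:,.0f}x")
```

With these inputs, the avoidable-fatality rate per patient is nearly four orders of magnitude higher than the fatality rate per passenger.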
1.2 The Effect of Human Factors in Complex Socio-Technical Systems
Maybe the best way to frame CRM concepts is James Reason's theory of organisational accidents (Reason,
1997). Most high-reliability environments are complex socio-technical workplaces, where people and technical systems
interact with a risk of failure and system breakdowns. Safety systems in companies and institutions are therefore designed to
have several barriers or levels of defence. These can be physical measures (e.g. colour-coded connectors and backup power
systems), procedural ones (e.g. double checks and clearance processes), or organisational considerations (e.g. the separation-of-duties
paradigm and the rotation principle in working shifts). In reality, unfortunately, all of these safety barriers have some weak points,
and the best picture to describe them is commonly not a strong and impervious wall but rather a slice of Emmentaler cheese
with several holes in it. These holes emerge either from latent mistakes in the organisation or from active, though of course
unintentional, mistakes of single individuals. Even if several barriers and redundant lines of defence exist to mitigate this risk,
there is a chance that the holes in all layers may one day accidentally align. When an adverse event happens in such a
moment, its consequences can pass all barriers undamped and lead to a catastrophic impact. Reason calls this his Swiss
Cheese Model (see Figure 1).
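The model's core claim, that several imperfect barriers drastically reduce but never eliminate the chance of an accident trajectory passing through, can be illustrated with a short simulation. The layer count and hole probabilities below are illustrative assumptions, not figures from Reason; real layers are also rarely independent, since latent conditions correlate holes across barriers.

```python
import random

def breach_probability(hole_probs, trials=100_000, seed=42):
    """Estimate the chance that an adverse event passes every defence layer.

    hole_probs: per-layer probability that the event finds a hole.
    Layers are assumed independent here -- a simplification of the model.
    """
    rng = random.Random(seed)
    breaches = sum(
        all(rng.random() < p for p in hole_probs)   # holes align in every layer
        for _ in range(trials)
    )
    return breaches / trials

# Illustrative numbers: four defence layers, each failing 10% of the time.
layers = [0.1, 0.1, 0.1, 0.1]
print(breach_probability(layers))
```

With these numbers the estimate converges on 0.1**4 = 1e-4: each added layer cuts the risk by an order of magnitude, yet the probability never reaches zero.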
Fig. 1: An accident trajectory passing through corresponding holes in the layers of defences, barriers and safeguards: The
“Swiss Cheese Model” (Reason, 1997). The holes can be created by active and latent failures. In a moment when the holes in all
barriers align, an adverse event can pass all levels of defence undamped and unfold a catastrophic impact.
The origins of such weak spots and holes in defence barriers can be traced back through several organisational levels, as
figure 2 demonstrates. Individual unsafe acts happen on the foundation of local and organisational factors (red arrows), and
their reasons can be traced backwards in turn (white arrows). A human error is commonly not a singular mistake but often enough
a product of the system the actor is working in. Whereas CRM training addresses the top of this pyramid (collaboration and
decision making of teams in critical situations), it must always be embedded in a context of organisational safety culture
with all underlying levels in order to contribute to higher reliability and a reduction of organisational accidents.
Fig. 2: Stages in the development and investigation of an organisational accident according to Reason (1997). The
rectangular box represents how an event leads to an adverse outcome after several defence barriers fail (see Swiss Cheese
Model). The triangular shape below represents the system producing it on three levels: the person (unsafe acts), the workplace
(error-provoking conditions), and the organisation. Latent conditions can arise from all of these levels and become manifest
as weak spots in a system's barriers, as opposed to active failures of individual humans.
Reason's theory of organisational accidents and human error provides a context for CRM and also marks some of its limitations.
CRM is about error management, but it needs to be implemented within a culture of safety and professionalism to unfold its
benefits. The organisational model is generic and can be applied to most types of institutions.
2. CORE ELEMENTS OF CRM
It might sound paradoxical to acknowledge human fallibility as the main source of fatal accidents and yet to trust the same
people as the last line of defence when it comes to mitigating critical incidents. But this is exactly what CRM is about. The
approach is threefold: (1) avoidance of error, (2) trapping incipient errors before they are committed, and (3) mitigating the
consequences of errors which occur and are not trapped (Helmreich, Merritt, & Wilhelm, 1999). However, the definition of what
is “inside” CRM and how to train it varies widely between branches; a common standard has never been agreed. With
regard to the main influencing theories, and with the evolution of CRM in mind, some core elements can nevertheless be
identified. The table below gives an overview. It has been compiled based on a meta-analysis of 28 evaluative studies
(Salas, Wilson, Burke, & Wightman, 2006), relevant CRM literature, and corresponding insights from psychology and
behavioural science. The core elements are grouped into team competencies, communication competencies, and theoretical
background knowledge.
Table 1: Core Elements of Crew Resource Management
Team Competencies
● Team Leadership: Ability of a team leader to direct and coordinate the activities of team members; encourage
team members to work together; assess performance; assign tasks; develop team knowledge,
skills, and abilities; motivate; plan and organise; and establish a positive team atmosphere.
(Salas et al., 2006)
“Effective leaders delegate so that they can regulate” (Sexton, 2004)
● Mutual Performance Monitoring: Ability of team members to accurately monitor other
team members' performance, including giving, seeking, and receiving task-clarifying
feedback. (Salas et al., 2006)
● Backup Behaviour and Distribution of Workload: Ability of team members to anticipate
the needs of others through accurate knowledge about each other’s responsibilities,
including the ability to shift workload between members to create balance during periods of
high workload or pressure. (Salas et al., 2006)
● Adaptability: Ability of team members to alter a course of action or adjust strategies when new
information becomes available. (Salas et al., 2006)
Communication Competencies
Some golden rules of group interaction and professional communication in high-risk
environments (Sexton, 2004):
● Speak simply: use small words, articulate simple thoughts, and ask simple questions
● Use standardised phraseology
● Talk about problems (high performers devote more time to “problem-solving” communications)
● Maintain an environment of open communication and stay calm during high workload
● Closed Loop Communication: A communication technique to avoid misunderstandings: a
sender gives a message, the receiver repeats it back, and the sender confirms whether it is right. This closes
a loop to ensure that a message or command has been transmitted and understood correctly.
● Structured Briefings: Professional briefings follow a clear and commonly agreed structure,
e.g. SBAR: Situation – Background – Assessment – Recommendation.
This ensures the integrity of transmitted information and economy of time, and contributes towards
a shared situational awareness.
● Self-reflection and problem orientation: Talk about task-related problems. High
performers devote more time to “problem solving” communications. (Sexton, 2004)
A constant critical re-evaluation of one's own situational awareness and of the measures taken can also
help to de-bias teams and identify wrong assumptions.
● Mission Evaluation: Critical mission evaluation to assess room for improvement as well as
success criteria, and to save best practices. This can cover individual self-reflection,
teamwork, and organisational factors. The aim is continuous improvement and organisational
learning, but it can also have psycho-hygienic effects after demanding situations (compare
CISM debriefings: Everly & Mitchell, 1997).
(Table 1 continued: Core Elements of Crew Resource Management)
Theoretical Background Knowledge
Situational Awareness describes the function of human perception and its interpretation
that leads us to a mental model of a situation (see e.g. Stanton, Chambers, & Piggott, 2001).
This becomes an important concept in situations of high uncertainty, when information is
contradictory and fast-changing.
Shared Situational Awareness describes the ability of a team to develop a common mental
model of a situation and is important for coordinating efforts and decisions.
Decision Making is a crucial process in crisis situations. Rational decision-making theories
provide acronyms that guide through a structured process. Research indicates, in contrast,
that human decision behaviour follows inherent and irrational mechanisms.
● Decision making cycles provide structures for rational decisions, e.g.:
OODA: Observe – Orient – Decide – Act
PDCA: Plan – Do – Check – Act
FOR-DEC: Facts – Options – Risks & Benefits – Decision – Execution – Check
● Research about decision making indicates, in contrast, that humans do not follow rational
processes in decision making. The Heuristics & Biases school (e.g. Tversky & Kahneman,
1974) found patterns in human thinking that bias decisions. The gut-instinct school (e.g.
Klein, 1998), by comparison, identified instincts and non-rational feelings as a source of good
decisions. Even if human decision-making processes tend to be biased, or at least
complex, in stress situations rational cycles like OODA can help teams to double-check
decisions or provide guidance for structured assessments under high pressure and uncertainty.
Human (unintentional) error follows a typology:
● Skill based slips and lapses are unintended failures of executing actions. They occur due
to attentional or perceptual failures (slips) or due to failures of memory (lapses).
● Mistakes originate from higher levels of mental processing like assessing information,
planning, formulating intentions, and judging likely consequences. Rule-based mistakes
happen when a normally good rule is wrongly applied; knowledge-based mistakes occur
when we run out of pre-packaged solutions or rules and have to think out problem solutions
on the spot, which is highly error-prone. (Reason, 1997, 2008)
Both human physiology and psychology have several aspects that necessitate some
economy of personal resources to keep them functioning during a longer crisis situation:
● Attention, for instance, is a cognitive resource that becomes limited under stress. Fixation
errors or inattentional blindness occur when attention sticks to a single topic, ignoring
other, possibly more threatening issues.
● Stress leads to physical consequences limiting personal capacity. Whereas in crisis
situations stress is an inevitable circumstance, and mild forms can even be beneficial (Combs
& Taylor, 1952; Vasile, 2010), people in highly skilled and highly demanding teams need to be
equipped with strategies for coping with stress and with recovery strategies to restore their
capacities after the crisis has ended.
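Decision cycles such as FOR-DEC lend themselves to a structural sketch. The following is a minimal, hypothetical rendering of the cycle as a repeatable loop; only the phase sequence comes from FOR-DEC, while the function names, the scoring rule, and the toy scenario are invented for illustration.

```python
# A minimal sketch of the FOR-DEC cycle (Facts - Options - Risks & Benefits -
# Decision - Execution - Check). Only the phase sequence is from FOR-DEC;
# everything else here is an invented illustration.

def for_dec(gather_facts, generate_options, assess, execute, check, max_rounds=3):
    """Run the FOR-DEC loop until the Check phase is satisfied."""
    trace = []
    for _ in range(max_rounds):
        facts = gather_facts()                                 # Facts
        options = generate_options(facts)                      # Options
        scored = {opt: assess(opt, facts) for opt in options}  # Risks & Benefits
        decision = max(scored, key=scored.get)                 # Decision
        result = execute(decision)                             # Execution
        trace.append((facts, decision))
        if check(result):                                      # Check: done or loop again
            return decision, trace
    return None, trace

# Toy usage: a team weighing two options under a single set of facts.
decision, trace = for_dec(
    gather_facts=lambda: {"fuel": "low"},
    generate_options=lambda facts: ["divert", "continue"],
    assess=lambda opt, facts: 1 if opt == "divert" else 0,
    execute=lambda opt: opt,
    check=lambda result: result == "divert",
)
print(decision)  # -> divert
```

In practice the value lies less in any automation than in making each phase explicit, so a team cannot jump from Facts straight to Execution.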
Figure 3 is a visualisation of the core elements of CRM as described in the table above. The classification into 10 elements can
only be an approximation and must be arbitrary to some degree, owing to the multifarious nature of this subject. While
all of the CRM methods can be deduced from the theoretical elements, the team and communication elements are
interlinked among themselves, overlapping and interdependent. So what is CRM? “CRM is one of an array of tools that
organisations can use to manage error” (Helmreich et al., 1999, p. 9) or, as Salas and colleagues summarised it, a “family of
instructional strategies designed to improve teamwork” (1999, p. 163).
Fig. 3: Core Elements of Crew Resource Management. Theoretical background knowledge is both the foundation for practical
CRM methods and part of the content of CRM training. The team and communication competencies are interlinked and overlapping.
3. IMPLEMENTING CRM
So far, CRM has been outlined as an industry-neutral blueprint. To apply it in practice, e.g. in healthcare or disaster
management, it is still too abstract in this form. Methods and training content should therefore be
derived from these core elements, tailored to the target group, and drilled down to a trainable format. The
challenge for implementation in an organisation is to translate the elements into practical methods and training concepts, for
instance finding an appropriate decision cycle, briefing agenda, mission evaluation format, and leadership style. Medical
staff might need different briefing agendas and decision loops than military staff and aviation crews. The level of depth at
which the theory is presented should be adjusted to the audience.
CRM training is usually facilitated in high-fidelity simulator environments such as flight or patient simulators. This enhances
learning effects, gives participants the perception of their real and complex working environment, and allows the integration of
CRM with technical skills training. Some of the training content, like psychological background knowledge, can be taught in
class sessions, but most of the education will happen in practical simulator exercises, where the self-reflection and
group learning of participants can be fostered.
A high-fidelity simulator is of course favourable, as it allows integrating technical skill training with CRM in the same
simulation, thereby generating welcome synergies. Nonetheless, an effective CRM training situation can also be
achieved with very simple means. It should always be kept in mind that CRM is about human interaction in teams, about
decision making, leadership, and human error, not so much about entertainment. The simulator is mainly a tool to support
the participants' training intensity and success, and simulations must be designed wisely to serve this purpose and to utilise the
simulation environment efficiently.
Table 2 gives an idea of how simulator training content and learning objectives can be designed. Technical scenarios are
integrated with CRM skills, e.g. a cardiac arrest emergency in which the team's workload management abilities are
challenged through simulated complications. After each scenario session, participants evaluate their performance and define
learnings with the help of video recordings and observers' feedback.
Table 2 also indicates that transferring such integrated simulation trainings to other professional environments
will require the most adaptation effort in the technical skills and scenarios. The CRM part will stay generic to a great extent but
still needs adaptation to the new field.
Table 2: Sample design of simulator training for health care professionals

Technical training content:
● Clinical pictures of e.g. obstetric emergencies, or others
● Standard treatment algorithms
● Briefing on available medical equipment
● Specific emergency procedures for the unit /

CRM training content:
● CRM introduction and overview
● Theoretical background on: Human Error; Decision Making; (Shared) Situational Awareness; Personal Resources and Stress

Scenario design: Scenarios are designed both to simulate medical emergency situations that foster technical education and to
stimulate team dynamics that promote or require CRM skills. Examples for different scenarios are:
● cardiac arrest
● obstetric emergencies
Team and communication competencies (e.g. Workload Management) are trained in the simulator sessions, and the
participants' ability to analyse and self-reflect on team performance is developed.

Technical learning objectives: Participants have the up-to-date knowledge and abilities to detect and treat emergencies as
taught in the training. They are familiarised with the available equipment and the specific workplace environment as well as
with algorithms and standard operational procedures.

CRM learning objectives: Participants are aware of the implications of human factors. They have methods at hand to cope
with stress situations and to unlock the full potential of a team in such conditions. Techniques for self-reflection and
de-briefings enable them to assess, de-bias, and continuously improve their teamwork.
As demonstrated earlier in this article, CRM is embedded in the safety culture of an organisation and should always be
implemented with respect to its characteristics. The most comprehensive research regarding differences in national and
organisational cultures comes from Geert Hofstede and his comparative studies of cultural dimensions (Hofstede, Hofstede,
& Minkov, 2010). A national healthcare system or disaster management organisation will inevitably be closely related to
the national culture. Some of the cultural dimensions are strongly relevant for the design of CRM systems: introducing
Standard Operating Procedures (SOPs) in cultures low on the Uncertainty Avoidance scale will be a challenge, and the
maxim of flat hierarchies, often mentioned in the context of CRM, might not be appropriate for organisations high
on the Masculinity scale. The acceptance of group decision-making processes will depend on the Individualism value,
and leadership models and communication principles have to be developed with the Power Distance in mind, for instance.
Figure 4 explains four of Hofstede's cultural dimensions with obvious relevance for CRM. It also shows the scores for
selected countries: the United States, Germany, and Great Britain as countries where nearly all of the CRM research
currently originates; Russia and China, where only little interest in CRM is known; and Iran (where this paper was
initially presented) in comparison. The values underline the divergence of national cultural profiles and the importance of
individualising and tailoring CRM concepts for every organisation. The limitation is a logical consequence: CRM is not
a one-size-fits-all approach or an “off the shelf” product. It must always be tailored to the organisational and national culture
first. Important differences will exist between, for example, private profit-oriented companies and public institutions, or small
and large organisations. University hospitals differ from public basic healthcare institutions, and crisis management also
depends on whether it operates at federal or state level, in fire brigades or police forces, in an administrative, military, or
civilian context, publicly or privately dominated.
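How such cultural scores might feed into the tailoring of a CRM programme can be sketched as a simple rule set. Every threshold and rule below is a hypothetical illustration, not a prescription from Hofstede or the CRM literature, and the example scores are fictional.

```python
# A hypothetical tailoring heuristic: none of the thresholds or rules below
# come from Hofstede or the CRM literature; they only illustrate how cultural
# dimension scores could influence the design of a CRM programme.

def tailor_crm(scores, threshold=50):
    """Derive illustrative CRM design hints from dimension scores (0-100)."""
    hints = []
    if scores["uai"] < threshold:
        hints.append("expect a permissive attitude to SOPs; motivate rule adherence")
    if scores["mas"] > threshold:
        hints.append("flat-hierarchy maxims may clash with decision-maker autonomy")
    if scores["idv"] < threshold:
        hints.append("group decision making will likely be well accepted")
    if scores["pdi"] > threshold:
        hints.append("train junior members explicitly in speaking up to leaders")
    return hints

# Fictional example scores, not actual Hofstede country data:
print(tailor_crm({"uai": 30, "mas": 70, "idv": 40, "pdi": 80}))
```

A real implementation would start from published dimension scores and validated design rules rather than these invented thresholds.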
Fig. 4: Cultural tendencies influencing CRM (Hofstede et al., 2010)
Explanations of cultural dimensions (Johnston, 1993):
Uncertainty Avoidance (uai). High UAI tends to generate rigidity and strong adherence to the formality of rules and
regulations. High UAI is associated with social acceptance of aggressive behaviour, strong task orientation, ritual behaviour
and written rules. If formal goals are not achieved, there is potential for dispute over the “rules” or rule adherence, rather than
about the operational objectives to which the rules are directed. Alternatively, low UAI could conceivably be associated with
a permissive and indulgent approach to rules and standard operating procedures.
Masculinity (mas). This relates to beliefs regarding the gender division of social roles. High MAS societies tend to have a
belief in the “independent decision-maker,” and leaders value their decision-making autonomy. Decisiveness, interpersonal
directness and machismo are common in high MAS societies. Again, there are implications here for CRM.
Individualism (idv). Individualism and collectivism are intimately associated with the predominant value systems of society
at large. In high IDV societies, individual decision making is normal and preferred to group decisions. Individual initiative
and leadership are highly valued. In low IDV societies, group decisions are felt to be better than decisions made by
individuals. Personal initiative is not encouraged. Social identity and position are determined by membership in various in-
groups. Here the implications for CRM should be obvious.
Power Distance (pdi). The degree to which the less powerful members of a society accept and expect that power is
distributed unequally. In high PDI countries, junior crewmembers are more likely to fear the consequences of disagreeing
with leaders, possibly with good reason. Leaders are themselves likely to feel comfortable with paternalistic behaviour and
leadership by directive. Leaders, rather than followers, will tend to initiate communication. This is a useful reminder that
leadership does not exist in a social vacuum – it is directly related to the incumbent social embodiment of “followership” or
“subordinateship.” Unresponsiveness to legitimate authority could arise with Low Power Distance – especially if associated
with high individualism.
Another limitation, and a trap, is to reduce CRM to a mere training product that can be consumed in isolation. As argued earlier,
CRM can be part of a safety culture, but it does not substitute for other measures such as non-punitive error and incident reporting
systems, leadership development, technical and organisational safety design, and many more. A key success factor is to embed
it in quality and risk management efforts. Once implemented, CRM should become part of the regular technical skills training
schedule, as the aviation industry has practised for years. CRM is not a one-time product: skills have to be refreshed and
simulations repeated on a regular basis. Applied properly, simulation trainings can serve an additional purpose by familiarising
new employees with a new and complex work environment and reducing the opportunity costs of training
on the job.
A third and important limitation must be kept in mind when looking with envy at aviation's safety culture and the seeming
perfection of cockpit crew collaboration processes: medicine and disaster management are more complex than flying an
airplane. Helmreich, for instance, advocates that CRM concepts have to be adapted very carefully for the medical sector
because “this is a milieu more complex than the cockpit, with differing specialities interacting to treat a patient whose
condition and response may have unknown characteristics” and “aircrafts are more predictable than patients” (Helmreich,
2000). Large airlines can afford to train staff regularly, the crewmembers are all employed by the same company, roles and
positions are clearly defined, and the vast majority of operations are predefined and standardised. These are definitely not the
preconditions we find in medicine and disaster management, where staff constellations change constantly, qualifications
vary widely, and situations are usually dynamic, intransparent, and high on uncertainty. Aviation can be a role model and a
source of inspiration, but CRM concepts should not be imitated hastily and without proper tailoring.
4. CONCLUSION
This article went back into the history of CRM and developed a core concept of CRM based on the original research and
theoretical contributions. CRM evolved through several evolutionary stages in different industries and was, luckily, always
driven by practical needs. Ten core elements of CRM have been deduced and limitations have been discussed, especially the
national and organisational culture as important influences on the application of CRM concepts and the need to embed
CRM in conjunction with further organisational safety measures.
Key messages of this article are:
• CRM is part of organisational safety culture and must therefore be approached as a high-priority management topic.
• CRM is about human factors as a source of error, but also about humans as the best resource to avoid and mitigate
errors and incidents.
• It is therefore not restricted to a single branch or profession but can unfold its benefits in all safety-critical or high-stake sectors.
• Individual tailoring of CRM concepts for a specific organisation is necessary.
• The content of simulation-based trainings can be divided into technical skills and CRM skills and
competencies. It is crucial to clearly define both and then integrate them into training scenarios.
• The national and organisational culture strongly influences the design of CRM concepts.
• CRM cannot be applied while ignoring surrounding organisational safety measures. It would be a severe mistake to
implement simulator training without analysing and addressing surrounding processes and constraints.
• CRM training has to be refreshed on a regular basis and integrated into technical skills training.
Benedict Gross is a consulting expert for project and crisis management, oscillating between practice and science. His clients
include banks, insurance companies, and healthcare organisations, but also football clubs and others. His current field of research
is patient safety and safety culture in healthcare systems.
After graduating in Business Law and being certified as Project Manager (IPMA) and Project Management Professional (PMI),
he completed master's degrees in Disaster Management (University of Bonn) and Forensic Psychology (University of
Liverpool) and holds the Certified Emergency Manager (IAEM) credential.
Corresponding address: Augustenstr. 53, Munich (Germany); firstname.lastname@example.org
Combs, A. W., & Taylor, C. (1952). The effect of the perception of mild degrees of threat on performance. The Journal of
Abnormal and Social Psychology, 47(2, Suppl), 420-424. doi: 10.1037/h0057196
Everly, G. S., & Mitchell, J. T. (1997). Critical incident stress management-CISM-: A new era and standard of care in crisis
intervention: Chevron Pub.
Fischer, G., Glaseske, G., Kuhlmey, A., Schrappe, M., Rosenbrock, R., Scriba, P., & Wille, E. (2007). Gutachten 2007 des
Sachverständigenrates zur Begutachtung der Entwicklung im Gesundheitswesen: Kooperation und Verantwortung.
Voraussetzungen einer zielorientierten Gesundheitsversorgung. (16/6339). Berlin: Deutscher Bundestag.
Helmreich, R. L. (1980, 26-28 Jun. 1979). Social Psychology on the Flight Deck. Paper presented at the NASA Workshop on
Resource Management on the Flight Deck, San Francisco.
Helmreich, R. L. (2000). On error management: lessons from aviation. BMJ, 320(7237), 781-785.
Helmreich, R. L., & Foushee, H. C. (2010). Chapter 1: Why CRM? Empirical and Theoretical Bases of Human Factors Training.
Crew Resource Management, 3-57. doi: 10.1016/b978-0-12-374946-8.10001-9
Helmreich, R. L., Merritt, A. C., & Wilhelm, J. A. (1999). The Evolution of Crew Resource Management Training in
Commercial Aviation. The International Journal of Aviation Psychology, 9(1), 19-32. doi: 10.1207/s15327108ijap0901_2
Hofstede, G., Hofstede, G. J., & Minkov, M. (2010). Cultures and Organizations: Software of the Mind. McGraw-Hill USA.
IATA. (2013). Annual Review 2013 of the International Air Transport Association: International Air Transport Association
Johnston, N. (1993). CRM: Cross-cultural perspectives. In E. L. Wiener, B. G. Kanki & R. L. Helmreich (Eds.), Cockpit resource
management (pp. 367-398). San Diego: Academic Press.
Klein, G. A. (1998). Sources of power : how people make decisions. Cambridge, Mass. ; London: MIT Press.
Perrow, C. (1999). Normal accidents : living with high-risk technologies. Princeton, N.J.: Princeton University Press.
Ranter, H. (2014). Press release: Airliner accident fatalities at record low. Retrieved 20.1., 2014, from http://news.aviation-
Reason, J. T. (1990). Human error. Cambridge: Cambridge University Press.
Reason, J. T. (1997). Managing the risks of organizational accidents: Ashgate Publishing.
Reason, J. T. (2000). Human error: models and management. BMJ, 320(7237), 768-770.
Reason, J. T. (2008). The human contribution : unsafe acts, accidents and heroic recoveries. Farnham, England ; Burlington, VT:
Salas, E., Prince, C., Bowers, C. A., Stout, R., . . . Cannon-Bowers, J. A. (1999). A Methodology for
Enhancing Crew Resource Management Training. Human Factors: The Journal of the Human Factors and Ergonomics
Society, 41(1), 161.
Salas, E., Wilson, K. A., Burke, C. S., & Wightman, D. C. (2006). Does Crew Resource Management Training Work? An
Update, an Extension, and Some Critical Needs. Human Factors, 48(2), 392-412.
Salas, E., Wilson, K. A., Burke, C. S., Wightman, D. C., & Howse, W. R. (2006). A Checklist for Crew Resource Management
Training. Ergonomics in Design: The Quarterly of Human Factors Applications, 14(2), 6-15.
Sexton, J. B. (Ed.). (2004). The Better the team the safer the world: Golden Rules of Group Interaction in High Risk
Environments. Ladenburg and Rüschlikon: Gottlieb Daimler and Karl Benz Foundation, Swiss Re Centre for Global
Stanton, N. A., Chambers, P. R. G., & Piggott, J. (2001). Situational awareness and safety. Safety Science, 39(3), 189-204.
Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124-1131.
Vasile, C. (2010). Mental Workload: Cognitive Aspects and Personality. Petroleum - Gas University of Ploiesti Bulletin,
Educational Sciences Series, 62(2), 132-137.