Endsley, M. R., & Garland, D. J. (Eds.) (2000). Situation Awareness Analysis and Measurement. Mahwah, NJ: Lawrence Erlbaum Associates.
THEORETICAL UNDERPINNINGS OF SITUATION AWARENESS:
A CRITICAL REVIEW
Mica R. Endsley
The enhancement of operator situation awareness (SA) has become a major design
goal for those developing operator interfaces, automation concepts and training programs in
a wide variety of fields, including aircraft, air traffic control, power plants, and advanced
manufacturing systems. This dramatic growth in interest in SA, beginning in the mid-1980s and accelerating through the 1990s, was spurred on by many factors, chief among
them the challenges of a new class of technology.
One can easily see that situation awareness has always been needed in order for
people to perform tasks effectively. Prehistoric man undoubtedly needed to be aware of
many cues in his environment in order to successfully hunt and keep from being hunted.
For many years, having good situation awareness was largely a matter of training and
experience — learning the important cues to watch for and what they meant.
With the advent of the machine age, our emphasis shifted to creating a new class of
tools to help people perform tasks, largely those physical in nature. The computer age and
now the information age have followed rapidly on the heels of basic mechanization. The
tools provided are no longer simple; they are amazingly complex, focused on not just
physical tasks, but elaborate perceptual and cognitive tasks as well. The pilot of today’s
aircraft, the air traffic controller, the power plant operator, the anesthesiologist: all must
perceive and comprehend a dazzling array of data which is often changing very rapidly. I
have taken to calling this challenge the information gap (Figure 1).
Today’s systems are capable of producing a huge amount of data, both on the status
of their own components, and on the status of the external environment. Due to
achievements in various types of datalink and internet technologies, systems can also
provide data on almost anything anywhere in the world. The problem with today’s systems
is not a lack of information, but finding what is needed when it is needed.
Unfortunately, in the face of this torrent of data, many operators may be even less
informed than ever before. This is because there is a huge gap between the tons of data
being produced and disseminated and people’s ability to find the bits that are needed and
process them together with the other bits to arrive at the actual information that is required
for their decisions. This information must be integrated and interpreted correctly as well; a
frequently tricky task. This problem is real and ongoing, whether the job is in the cockpit or
behind a desk. It is becoming widely recognized that more data does not equal more
information. Issues of automation and “intelligent systems” have frequently only
exacerbated the problem, rather than aided it (Endsley & Kiris, 1995; Sarter & Woods, 1995).
The criteria for what we are seeking from system designs have correspondingly
changed. In addition to designing systems that provide the operator with the needed
information and capabilities, we must also ensure that it is provided in a way that is usable
cognitively as well as physically. We want to know how well the system design supports
the operator’s ability to get the needed information under dynamic operational constraints (i.e., how well does it bridge the information gap?). This design objective and measure of
merit has been termed situation awareness.
Endsley Page 2
Figure 1. The Information Gap (data produced vs. information needed: more data ≠ more information)
What is SA?
Most simply put, SA is knowing what is going on around you. Inherent in this
definition is a notion of what is important. SA is most frequently defined in operational
terms. While someone not engaged in a task or objective might have awareness (e.g.
someone sitting under a tree idly enjoying nature), this class of individuals has been largely
outside the scope of human factors design efforts. Rather, we have been concerned mostly
with people who need SA for specific reasons. For a given operator, therefore, SA is
defined in terms of the goals and decision tasks for that job. The pilot does not need to
know everything (e.g. the co-pilot’s shoe size and spouse’s name), but does need to know a
great deal of information related to the goal of safely flying the aircraft. A surgeon has just
as great a need for situation awareness; however, the things she needs to know about will be
quite different, dependent on a different set of goals and decision tasks.
Although the “elements” of SA vary widely between domains, the nature of SA and
the mechanisms used for achieving SA can be described generically. (A determination of the
elements of SA for different domains is discussed in more detail in Chapter 8.) It is the
goal of this chapter to provide a foundation for understanding the construct that is SA. This
foundation is important both for creating systems that support SA and for creating tools that
effectively measure SA.
Many definitions of SA have been developed, some very closely tied to the aircraft
domain and some more general. (See Dominguez (1994) or Fracker (1988) for a review.)
That is, many are tied to the specifics of one domain, aircraft piloting, from whence the term originated. SA is now being studied in a variety of domains, however: education, driving,
train dispatching, maintenance and weather forecasting are but a few of the newer areas in
which SA has been receiving attention.
A general definition of SA that has been found to be applicable across a wide variety
of domains describes SA as “the perception of the elements in the environment within a
volume of time and space, the comprehension of their meaning and the projection of their
status in the near future” (Endsley, 1988). Shown in Figure 2, this definition helps to
establish what “knowing what is going on” entails.
Figure 2. Model of SA in Dynamic Decision Making (from Endsley, 1995b). Factors shown include goals and objectives, system capability, interface design, and stress and workload.
Level 1 SA - Perception
First, perception of cues (Level 1 SA) is fundamental. Without basic perception of
important information, the odds of forming an incorrect picture of the situation increase
dramatically. Jones and Endsley (1996) found that 76% of SA errors in pilots could be
traced to problems in perception of needed information (due to either failures or
shortcomings in the system or problems with cognitive processes).
Level 2 SA - Comprehension
Situation awareness as a construct goes beyond mere perception however. It also
encompasses how people combine, interpret, store, and retain information. Thus, it includes
more than perceiving or attending to information, but also the integration of multiple pieces
of information and a determination of their relevance to the person’s goals (Level 2 SA).
This is analogous to having a high level of reading comprehension as compared to just
reading words. Twenty percent of SA errors were found to involve problems with Level 2
SA (Jones & Endsley, 1996).
Flach (1995) points out that “the construct of situation awareness demands that the
problem of meaning be tackled head-on. Meaning must be considered both in the sense of
subjective interpretation (awareness) and in the sense of objective significance or importance
(situation)” (p. 3). A person with Level 2 SA has been able to derive operationally relevant
meaning and significance from the Level 1 data perceived. As Flach points out, this aspect
of SA sets it apart from earlier psychological research and places it squarely in the realm of
Level 3 SA - Projection
At the highest level of SA, the ability to forecast future situation events and
dynamics (Level 3 SA) marks operators who have the highest level of understanding of the
situation. This ability to project from current events and dynamics to anticipate future
events (and their implications) allows for timely decision making. In almost every field I
have studied (aircraft, air traffic control, power plant operations, maintenance, medicine), I
have found that experienced operators rely heavily on future projections. It is the mark of a true expert.
Temporal Aspects of SA
Time, both the perception of time and the temporal dynamics associated with events,
plays an important role in the formulation of SA. First, time itself has appeared as an
important component of SA in many domains (Endsley, 1993b; Endsley, 1994; Endsley,
Farley, Jones, Midkiff, & Hansman, 1998a; Endsley & Robertson, 1996; Endsley &
Rodgers, 1994). A critical part of SA is often understanding how much time is available
until some event occurs or some action must be taken. The “within a volume of time and space” contained in the definition of SA pertains to the fact that operators constrain the parts
of the world (or situation) that are of interest to them based on not only space (how far away
some element is), but also how soon that element will have an impact on the operator’s
goals and tasks. Time is a strong part of Level 2 SA (comprehension) and Level 3 SA
(projection of future events).
The dynamic aspect of real-world situations is a third important temporal aspect of
SA. The rate at which information is changing is a part of SA regarding the current
situation, which also allows for projection of future situations (Endsley, 1988; 1995c). The
dynamic nature of situations dictates that as the situation is always changing, so the
person’s situation awareness must constantly change or be rendered outdated and thus
inaccurate. In highly dynamic environments, this forces the human operator to adapt many
cognitive strategies for maintaining SA (as will be discussed later). Adams, Tenney and
Pew (1995) emphasize the importance of dynamics of both situations and cognitive
processes in their model of SA. Sarter and Woods (1991) also discuss the importance of
temporal aspects of the situation for SA.
SA and Decision Making
The Endsley model, shown in Figure 2, shows situation awareness as a stage
separate from decision making and performance. Situation awareness is depicted as the
operator’s internal model of the state of the environment. Based on that representation,
operators can decide what to do about the situation and carry out any necessary actions.
Situation awareness therefore is represented as the main precursor to decision making; however, many other factors also come into play in turning good situation awareness into successful performance.
SA is clearly indicated as a separate stage in this model rather than as a single
combined process. This is for several reasons. First, it is entirely possible to have perfect
SA, yet make an incorrect decision. For example, the battle commander may understand
where the enemy is and the enemy’s capabilities, yet select a poor or inappropriate strategy
for launching an attack. They may have inadequate strategies or tactics guiding their
decision processes. They may be limited in decision choices due to organizational or
technical constraints. They may lack the experience or training to have good, well-
developed plans of actions for the situation. Individual personality factors (such as
impulsiveness, indecisiveness or riskiness) may also make some individuals prone to poor
decisions. A recent study of human error in aircraft accidents found that 26.6% involved
situations where there was poor decision making even though the aircrew appeared to have
adequate situation awareness for the decision (Endsley, 1995b). Conversely, it is also
possible to make good decisions even with poor SA, if only by luck.
This characterization is not meant to dispute the important role of SA in the decision
making process or the integral link between SA and decision making in many instances,
particularly where experienced decision makers are involved. Klein’s work in the area of
recognition-primed decision making shows strong evidence of a direct link between
situation recognition/classification and associated action selection (Klein, 1989; Klein,
Calderwood, & Clinton-Cirocco, 1986). Where such learned linkages exist, they
undoubtedly may be activated frequently in the decision process. Adams et al. (1995) and
Smith and Hancock (1994) also discuss the integral relationship between SA and decision
making. Decisions are formed by SA and SA is formed by decisions. These are certainly
views with which I agree. I nevertheless feel it is important to recognize that SA and decision making need not be coupled as one process and in practice frequently are not.
The human operator has conscious choice in the decision to implement the linked
recognition-primed decision action plan or to devise a new one. This behavior can be seen
in combat tasks, for instance, where people often wish to be unpredictable. In the many
instances where no action plan is readily linked to the recognized situation, a separate
decision as to what to do must take place. Just as easily as SA and decision making can be
linked in practice, they can also be unlinked. While this distinction may be purely
theoretical, it is made here for the purpose of clarity of discussion. SA is not decision
making and decision making is not SA. This distinction has implications for the
measurement of SA.
Furthermore, the link between human decision making and overall performance is
indirect in many environments. A desired action may be mis-executed due to physical error,
other workload, inadequate training or system problems. The system’s capabilities may
limit overall performance. In some environments, such as the tactical aircraft domain, the
action of external agents (e.g. enemy aircraft) may also create poor performance outcomes
from essentially good decisions (and vice-versa). Therefore, SA, decision making and
performance can be seen for theoretical purposes to be distinct stages that can each affect
the other in a circular ongoing cycle, yet which can be decoupled through various other factors.
Who Needs SA?
The earliest discussions of SA undoubtedly derive from the pilot community, going
back as far as World War I (Press, 1986). Spick (1988) provides a fascinating fighter
pilot’s view of the importance of SA in combat aircraft. He traces SA to one of
Clausewitz’s four main ‘frictions’ of war.
Although an emphasis on SA in systems design and early scientific work on SA and
its measurement originated in the fighter aircraft domain (Arback, Swartz, & Kuperman,
1987; Endsley, 1987; 1988; Fracker, 1988; IC3, 1986; Marshak, Kuperman, Ramsey, &
Wilson, 1987; Taylor, 1990), the use and application of the term has rapidly spread to other
domains. The most prominent driver of this trend has been technology. The growth and
complexity of electronic systems and automation has driven designers to seek new
methodological frameworks and tools for dealing effectively with these changes in domains
from driving to space operations to medicine.
It should be clearly noted, however, that technological systems do not provide SA in
and of themselves. It takes a human operator to perceive information to make it useful. Nor
do they create the need for SA. No matter how a person does his or her job, the need for
SA has always existed, even in what would not normally be considered heavily cognitive or
technological fields. In sports, such as football or hockey, for instance, the importance of
SA in selecting and running plays is readily apparent. The highlighted emphasis on SA in
current system design has occurred because (a) we can now do more to help provide good
SA through decision aids and system interfaces, and (b) we are far more able to actually
hinder SA through these same efforts if, as designers, we fail in adequately addressing the
SA needs of the operator.
How do we get SA?
In its simplest terms, SA is derived from all of our various sources of information,
shown in Figure 3 (Endsley, 1995c; Endsley, 1997). Cues may be received through visual,
aural, tactile, olfactory or taste receptors. Some cues may be overt (e.g. a system alarm) and some quite subtle (e.g. the slight change in the hum of an engine), registered only subconsciously. As we move towards the instantiation of remote operators in many
domains (e.g. unmanned air vehicles, remote maintenance, etc.), a major challenge will be providing sufficient information through a remote interface to compensate for the cues once available through direct presence in the environment.
It is important to note that while as system designers we tend to focus on the
information provided through the system and its operator interface, this is not the only
source of SA. In many domains, operators may be able to directly view and hear
information from the environment itself (e.g. pilots, drivers, machine operators, ATC tower
controllers), although in some cases they may not (e.g. an enroute ATC controller who only
has remotely transmitted information). It is important, therefore, that analyses of the SA provided by system designs take into account the information operators also derive through other means. That is, what is the value added (or subtracted) by a given system
taking into account what one already gets via other means and what it may cost in terms of
interference with that information? For instance, the U.S. Army is currently considering
providing helmet mounted displays to foot soldiers. Any analysis of this technology needs
to consider not only the SA provided by the new system, but also the SA the system may be interfering with (information on the local environment) and the SA the soldier is already able to achieve without the new technology.
In addition to directly perceived information, as shown in Figure 3, the system’s
sensors (or its datalink/internet components) collect some subset of all available information
from the system’s environment and internal system parameters. Of all the data the system
possesses, some portion is displayed to the operator via its user interface. Of this
information, the operator perceives and interprets some portion, resulting in SA.
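This successive narrowing, from the total environment down to what the operator actually perceives, can be sketched in a few lines of code. This is purely an illustrative sketch of the filtering stages described above; the element names and subset contents are hypothetical, not from the chapter or from Figure 3.

```python
# Illustrative sketch: SA as successive filtering of environmental data.
# All set names and contents are hypothetical examples.

# Everything present in the environment and the system's internal state.
environment = {"altitude", "heading", "fuel", "traffic", "weather", "winds"}

# The system's sensors (or datalink) capture only a subset of what exists.
sensed = {"altitude", "heading", "fuel", "traffic"}

# The user interface displays only a subset of what is sensed.
displayed = {"altitude", "heading", "traffic"}

# The operator perceives and interprets only a subset of what is displayed.
perceived = {"altitude", "traffic"}

# Each stage can only narrow, never widen, the information that survives.
assert perceived <= displayed <= sensed <= environment
```

The subset chain makes the point of the information gap concrete: losses at any stage (sensing, display, or perception) propagate forward, and the operator's active choices (menu selection, sensor tasking, attention) determine which subsets these are.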
It is critical to note that this is not a passive process of receiving displayed
information, but one in which the operator may be very actively involved. For instance, the
operator in many systems can control which information is displayed (e.g. through menu
selection) and which information is attended to. They may also be able to control which
information the system collects by sending out commands to get certain information from
linked systems or by setting the direction and coverage of the sensors, for example. People
are therefore very active participants in the situation assessment process, with SA guiding
the process and the process resulting in the SA.
Figure 3. Sources of SA Information (from Endsley, 1995c, 1997)
The role of others in the process of developing SA has also received attention.
Verbal and non-verbal communication with others (including radio communication, hand
signals and “wing-tipping” by other pilots) has historically been found to be an important
source of SA information. Even in situations with restricted visual cues, air traffic
controllers report that they get a great deal of information just from the voice qualities of the
pilot’s radio communications, deriving information on experience level, stress, familiarity
with English instructions, level of understanding of clearances and need for assistance.
Operators also rely on each other for confirming their own SA. Jentsch, Barnett and Bowers (1997) and Gibson et al. (1997), for instance, both found that it was much rarer for
a single pilot to lose SA than for the whole crew to lose SA when accidents and incidents
occurred. While these results undoubtedly reflect the fact that the second crewmember with
SA would most likely prevent the situation from becoming an accident in the first place, it
also speaks to the effectiveness of the process. Thus, SA is derived from a combination of
the environment, the system’s displays and others, as integrated and interpreted by the individual.
THEORIES OF SA
Simply describing the many sources of SA information does little to convey the
intricate complexities of how people pick and choose information, weave it together and
interpret it in an ongoing and ever-changing fashion as both situations change and operator
goal states change. Despite the claims by some operators that SA is derived through
instinct (Spick, 1988) or extra-sensory perception (Klein, 1995), I believe that we can shed some light on the cognitive processes that come into play in acquiring and maintaining SA.
Several researchers have put forth theoretical formulations for depicting the role of
numerous cognitive processes and constructs on SA (Adams, et al., 1995; Endsley, 1988;
1995c; Fracker, 1988; Smith & Hancock, 1995; Taylor, 1990; Taylor & Selcon, 1994;
Tenney, Adams, Pew, Huggins, & Rogers, 1992). There are many commonalities in these
efforts pointing to essential mechanisms that are important for SA and which have a direct
bearing on the appropriateness of proposed measures of SA. The key points will be discussed here; however, the reader is directed to these papers for more details on each.
Endsley (1988; 1990b; 1995c) proposed a framework model based on information processing theory (Wickens, 1992). The model will be used as an umbrella for discussing
other theories and more recent research relevant to SA theory. The cognitive mechanisms
that are important for the development of SA are shown in Figure 4.
Figure 4. Mechanisms and Processes Involved in SA (factors shown include stress, workload, interface complexity, and automation)
Working Memory and Attention
Several factors will influence the accuracy and completeness of situation awareness
that individual operators derive from their environment. First, humans are limited by
working memory and attention. The way in which attention is employed in a complex
environment with multiple competing cues is essential in determining which aspects of the
situation will be processed to form situation awareness. Once taken in, information must be
integrated with other information, compared to goal states and projected into the future
— all heavily demanding of working memory.
Much can be said about the importance of attention on SA. How people direct their
attention has a fundamental impact on which portions of the environment are incorporated
into their SA. In this light, both the perceptual salience of environmental cues (i.e. the
degree to which they draw attention), and the meaningful direction of attention of the
individual are important. Numerous factors will dictate how people direct their attention in
acquiring information, including learned scan patterns and information sampling strategies,
goals, expectations and other information already processed.
Several recent studies have confirmed the role of attention in situation awareness.
Endsley and Smith (1996) found that fighter pilots’ attention to targets on a tactical
situation display was directly related to the importance of those targets in their tactical tasks.
Endsley and Rodgers (1998) showed that air traffic controllers tend to shed attention to less
important information as taskload (numbers of aircraft) increases. Gugerty (1998) found
that drivers paid more attention to cars in front of and near to them than to those behind or
farther away, also reflecting distribution of attention based on the operational importance of
the vehicles. Clearly, the experienced operators in these three tasks deployed their attention
in ways that are consistent with operational goals.
Attention to information is prioritized based on how important that information is
perceived to be. It should also be pointed out, however, that even experienced operators can
err in this process, neglecting to attend to certain information over other information.
Adams et al. (1995) describe the challenges aircrews face in dynamically juggling many competing tasks and pieces of information and the role of attention in managing this
challenge. Jones and Endsley (1996) found that the single most frequent causal factor
associated with SA errors involved situations where all the needed information was present,
but was not attended to by the operator (35% of total SA errors). This was most often
associated with distraction due to other tasks. Correctly prioritizing information in a
dynamic environment remains a challenging aspect of SA. Good SA requires enough
awareness of what is going on across a wide range of SA requirements (global SA) to be
able to determine where to best focus one’s attention for more detailed information (local
SA). Attentional narrowing is a well-known example of failures that can befall this process.
The limits of working memory also pose a constraint on SA (Endsley, 1988;
Fracker, 1988). Novice decision makers and those in novel situations must combine
information, interpret it and strive to make projections, all in limited working memory. (This
model, however, shows that in practice, experienced decision makers have many
mechanisms for overcoming this bottleneck, described subsequently.) Jones and Endsley
(1996) found that working memory losses (where information was initially perceived and
then forgotten) explained 8.4% of SA errors (frequently associated with task distractions
and interruptions). Many other failures in the ability to form Level 2 and 3 SA may also involve working memory limitations.
Durso and Gronlund (in press) report four strategies actively used by operators to
reduce the working memory load associated with SA, including information prioritization,
chunking, “gistification” of information (such as encoding only relative values of
information if possible), and restructuring the environment to provide external memory
cues. Despite these workarounds, Gugerty and Tirre (1997) found strong evidence of the
importance of working memory in discriminating between people with higher and lower
levels of SA. As Adams et al. (1995) point out, even experienced operators may be faced
with so much information that attention and working memory constraints can be an issue.
Long-term Memory & Working Memory Connection
To view SA as either a function of working memory or long-term memory would
probably be erroneous. Endsley (1990a; 1995a), for instance, showed that experienced
pilots could report on relevant SA information for five to six minutes following freezes in an
aircraft simulation without the memory decay that would be expected from information
stored in working memory. This was hypothesized to support a model of cognition that
shows working memory to be an activated subset of long-term memory (Cowan, 1988), as
shown in Figure 5. In this model, information proceeds directly from sensory memory to
long-term memory, which is necessary for pattern-recognition and coding. Those portions
of the environment that are salient remain in working memory as a highlighted subset of
long-term memory through either localized attention or automatic activation. In this way,
information from the environment may be processed and stored in terms of the activated
mental model or schema (i.e. it provides the current situation values of these more abstract
models). Thus activated, these schema provide a rich source of data for bringing to bear on
the situation including mechanisms for processing the data (i.e. forming Level 2 and Level 3
SA) and default values for filling in missing information (e.g. a Cessna is probably going
about a certain speed).
Durso and Gronlund (in press) come to a similar conclusion, drawing on a model by Ericsson and Kintsch (1995) in which pointers in working memory point to information
stored in long-term memory. Adams et al. (1995) also discuss pointers from working
memory to long-term memory. Sarter and Woods (1991) emphasize the importance of
information that can be activated from long-term memory to support limited working
memory. In this sense, SA is a unique product of external information acquired, working
memory processes and the internal long-term memory stores activated and brought to bear
on the formation of the internal representation.
Figure 5. Working Memory as an Activated Subset of Long-Term Memory
(based on Cowan, 1988)
Long-term Memory, Mental Models and SA
While long-term memory stores may take many forms, the idea of mental models
has received much support. Long-term memory stores in the form of mental models or
schema are hypothesized to play a major role in dealing with the limitations of working
memory (Endsley, 1988; 1995c). With experience, operators develop internal models of the
systems they operate and the environments in which they operate. These models serve to
help direct limited attention in efficient ways, provide a means of integrating information
without loading working memory, and provide a mechanism for generating projection of
future system states.
Associated with these models may be schema of prototypical system states, as
shown in Figure 6. Critical cues in the environment may be matched to such schema to
indicate prototypical situations that provide instant situation classification and
comprehension. Scripts of the proper actions to take may be attached to these situation
prototypes, simplifying decision making as well. Schema of prototypical situations are
incorporated in this process and in many instances may also be associated with scripts to
produce single-step retrieval of actions from memory, thus providing for very rapid decision
making such as has been noted by Klein (1989). The use of mental models in achieving SA
is considered to be dependent on the ability of the individual to pattern match between
critical cues in the environment and elements in the mental model.
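The cue-to-schema matching described above can be made concrete with a small sketch. This is a toy illustration of the mechanism, not anything from the chapter: the schemata, their critical cues, and the attached action scripts are invented examples, loosely in the spirit of recognition-primed decision making (Klein, 1989).

```python
# Toy sketch of schema matching: prototypical situations (schemata) are
# defined by sets of critical cues, each with an attached action script.
# All schema names, cues, and scripts here are hypothetical examples.

schemata = {
    "engine_fire": {"cues": {"fire_warning", "rising_egt"},
                    "script": "shut down engine, run fire checklist"},
    "fuel_leak":   {"cues": {"fuel_imbalance", "falling_fuel_qty"},
                    "script": "crossfeed, divert to nearest airport"},
}

def classify(observed_cues):
    """Return the first schema whose critical cues are all present, if any."""
    for name, schema in schemata.items():
        if schema["cues"] <= observed_cues:
            # Pattern match succeeded: instant situation classification,
            # with a ready-made action script attached.
            return name, schema["script"]
    # No match: a separate, slower decision process would be needed.
    return None, None

name, script = classify({"fire_warning", "rising_egt", "smoke"})
```

When the observed cues contain all of a schema's critical cues, classification and action retrieval happen in a single step; when no schema matches, the sketch returns nothing, corresponding to the case where comprehension and decision making must be worked out from scratch in limited working memory.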
Figure 6. Role of Goals and Mental Models in SA (SA guides selection of the active goal; the active goal selects the mental model, which frames the current situation model)
In this sense, SA, or the situation model, is the current state of the mental model.
For example, one can have a mental model of a car engine in general, but the situation model
(SA) is the state it is believed to be in now (e.g. it has a given RPM, it has a given level of
fuel, it is at a given temperature, etc.). This situational model captures not only the person’s
representation of the various parameters of the system, but also a representation of how they
relate in terms of system form and function to create a meaningful synthesis — a gestalt
comprehension of the system state. So in addition to containing the current value of engine
temperature and its rate of change, the situation model also includes an understanding of the
impact of that state on the system and on projected events (e.g. it will overheat).
The concept of a mental model is useful in that it provides (a) a mechanism for guiding attention to relevant aspects of the situation, (b) a means of integrating perceived information to form an understanding of its meaning, and (c) a mechanism for projecting
future states of the system based on its current state and an understanding of its dynamics.
Without such a device, integration and projection would be particularly onerous tasks, yet
experts appear to be able to perform them with ease. Endsley (1995c) provides a more
detailed description of how mental models and schema may be developed and formed
through experience based on work by Holland et al. (1986).
Other researchers also have posited a strong relationship between SA and mental
models. Sarter and Woods (1991) state that adequate mental models are a pre-requisite for
achieving SA. Mogford (1997) discusses the relationship between mental models and SA.
He states that a mental model is the underlying knowledge that is the basis for SA.
“Information in the mental model influences and structures the data held in SA and directs
attention” (p. 333). As information moves from the mental model into SA, he states, it
increases in awareness, although its retention may decrease.
The use of mental models is not all positive for SA, however. Fracker (1988) states
that while schema may be very useful for facilitating situation assessment by providing a
reduction in working memory demands, they can also significantly bias the selection and
interpretation of information, creating errors in situation
awareness. Supporting this position, Jones and Endsley (1996) found that approximately
7% of SA errors could be traced to a poor or insufficient mental model. In addition, they
found that 6.5% of SA errors involved the use of the incorrect mental model to process
information, thus arriving at an incorrect understanding of the situation. Another 4.6% of
SA errors involved over-reliance on default values in the mental model. Together, these three
problems with the use of mental models accounted for approximately 18% of SA errors,
constituting most of the 20.3% of cases classified as Level 2 SA errors.
Jones (1997) further investigated the impact of mental models on the development
of Level 2 SA. Specifically, she examined a type of Level 2 SA problem that has been
labeled a representational error in which incoming cues and information are misinterpreted
based on an incorrect mental model. She manipulated the mental model of air traffic
controllers by first introducing certain erroneous information. She then provided cues that
should have indicated that the earlier information could not be correct. These cues were of
four types: “bizarre” (not possible under the mental model), “irrelevant” (possible
under the mental model, although unlikely and more attuned to an alternate
model), “unexpected” (a cue that was not expected to occur under the mental model), and
“absence of expected” (a cue anticipated by the mental model did not occur).
Interestingly, only 35% of the cues overall resulted in the controller detecting the error in the
mental model. The absence of expected cues was the most successful at overcoming the
representational error (56%), demonstrating the real depth of information provided by
mental models. Overall, however, the study primarily demonstrated the resilience of a
person’s initial assessment of a situation in guiding subsequent interpretations of later
information. While this strategy can be effective (Mosier & Chidester, 1991), it has also
been related to errors in process control (Carmino, Idee, Larchier Boulanger, & Morlat,
1988) and medicine (Klein, 1993). Anchoring and confirmation bias undoubtedly come into
play in such cases.
Building on the relationship between SA and mental models, Endsley, English and
Sundararajan (1997) demonstrated that measures of SA (as momentary snapshots of the
mental model) could be reconstructed to form a computer model of the mental model of
expert fighter pilots. While it is difficult to say that any computer model is identical to a
human’s internal model, the resultant computer model performed with an accuracy rate
equivalent to the experts, demonstrating the utility of exploiting such a relationship. The
problems involved in building good computer and analytic models of mental models have
been well aired (Wilson & Rutherford, 1989). Overall, mental models and schema are
believed to be central mechanisms for overcoming limited working memory constraints to
form SA under very challenging conditions.
Pattern Matching and Other Cognitive Processes
Pattern matching goes hand-in-hand with schema and mental models in facilitating
the development of SA. There is considerable evidence that experienced decision makers
use pattern matching to recognize perceived information as a particular exemplar of a known
class of situations (Dreyfus, 1981; Hinsley, Hayes, & Simon, 1977; Kaempf, Wolf, &
Miller, 1993; Klein, 1989; Klein, et al., 1986). This pattern matching process can occur
almost instantly (Hinsley, et al., 1977; Kaempf, et al., 1993; Secrist & Hartman, 1993).
Hartman and Secrist (1991) point to near-threshold processing of environmental cues as
an important basis for pattern matching.
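As a loose illustration of this kind of recognition, perceived cues can be matched against stored situation prototypes and classified as an exemplar of the nearest class. The situation classes and cue features (climb rate, turn rate, alarm level) below are invented for the sketch:

```python
# Illustrative nearest-prototype matcher: perceived cues are recognized as
# an exemplar of the closest known class of situations. The classes and
# cue features are hypothetical examples, not from the chapter.

PROTOTYPES = {
    "routine climb": (0.9, 0.1, 0.0),    # (climb rate, turn rate, alarm level)
    "engine failure": (0.0, 0.0, 1.0),
}

def match_situation(cues):
    """Return the stored situation class nearest the perceived cue pattern."""
    def distance(prototype):
        return sum((a - b) ** 2 for a, b in zip(cues, prototype))
    return min(PROTOTYPES, key=lambda name: distance(PROTOTYPES[name]))

print(match_situation((0.1, 0.0, 0.9)))  # engine failure
```

A lookup of this kind is nearly instantaneous, which is consistent with the speed of expert recognition noted above; it also shows why an impoverished prototype set can misclassify a novel situation.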
Federico (1995) found that situations tended to be classified based on both surface
features and deeper structural features. Amongst the naval operators in his study, surface
features tended to include contact information, critical cues, and similar platforms, patterns
and sequences. Deep structures included importance attributed to similar contexts and other
background information. Federico found that experts attended more to the issue of context
than did novices. He did not find evidence for differences between novices and experts in
the number of schema or accessibility to them.
Endsley and Bolstad (1994) found evidence for the importance of pattern matching
skills in distinguishing between fighter pilots with higher and lower levels of SA. It should
be noted, however, that SA is not solely dependent on pattern-matching. Cohen, Freeman
and Wolf (1996) present a good discussion of the role of certain meta-cognitive skills in
augmenting pattern matching and in forming the basis for developing situation assessment
in cases where pattern-matching fails. Kaempf, et al (1996) found that pattern matching to
situation prototypes accounted for 87% of decisions by tactical commanders, with another
12% employing a story building process (most likely made possible through mental
models). Commanders also tended to evaluate courses of action through mental simulation
18% of the time, actively developing level 3 SA. A combination of pattern-matching,
conscious analysis, story building, mental simulation, and meta-cognitive processes all may
be used by operators at various times to form SA.
Goals are central to the development of situation awareness. Essentially, human
information processing in operating complex systems is seen as alternating between data
driven (bottom-up) and goal driven (top-down) processing. This process is viewed as
critical in the formation of SA (Endsley, 1988, 1995c). In goal driven processing, attention
is directed across the environment in accordance with active goals. The operator actively
seeks information needed for goal attainment and the goals simultaneously act as a filter in
interpreting the information that is perceived. In data driven processing, perceived
environmental cues may indicate new goals that need to be active. Dynamic switching
between these two processing modes is important for successful performance in many
environments.
In defining SA as “adaptive, externally directed consciousness”, Smith and
Hancock (1995) take the view that SA is purposeful behavior that is directed toward
achieving a goal in a specific task environment. They furthermore point out that SA is
therefore dependent on a normative definition of task performance and goals that are
appropriate in the specific environment.
Within any environment, operators typically will have multiple goals that may shift
in importance. At any one time, only a subset of these goals may be actively pursued. In the
model shown in Figure 6, the individual’s goals serve a number of critical functions:
1. First, the active goals direct the selection of the mental model. For example, the
goal of ‘diagnose the warning light’ would activate the mental model associated
with that particular system and diagnostic behavior.
2. That goal and its associated mental model are used to direct attention in selecting
information from the environment. They serve to direct scan patterns and
information acquisition activities. For this reason, the selection of the correct
goal(s) is extremely critical for achieving SA. If the individual is pursuing the
wrong goal (or a less important goal), critical information may be missed or not
attended to.
3. Goals and their associated mental models are used to interpret and integrate the
information to achieve comprehension. The goal determines the “so what” of
the information. For example, a flight level of 10,000 feet can only be
interpreted in light of the goal of reaching a desired altitude (e.g. the clearance altitude).
The associated mental model is also critical in that the selection of the wrong
mental model may lead to the misinterpretation of the information.
This is essentially a top-down, goal-driven process in which goals actively guide
information selection and processing. Simultaneously, a bottom-up process occurs in which
information perceived is processed to form SA, and a given situation assessment can trigger
the selection of a new goal. So while a pilot may be busily engaged in the goal of
navigation, the chime of an engine emergency warning will signal that a new goal should be
made active. This then directs the selection of a new mental model and a new focus of
attention.
This alternation between bottom-up and top-down processing is one of the most
important mechanisms underlying SA. Failures in this process can be particularly
damaging to SA, including failures to process information needed to assure proper goal
selection and the resultant inattention to needed information (e.g. attentional narrowing) or
misinterpretation of information perceived (Level 2 SA errors).
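The alternation described above can be caricatured as a simple processing loop. The goal names, cue triggers, and information mappings below are hypothetical examples, not part of the model itself:

```python
# Illustrative sketch of alternating goal-driven (top-down) and
# data-driven (bottom-up) processing. Goal names, cue triggers, and
# mappings are invented examples.

GOAL_MODELS = {
    "navigate": ["heading", "waypoint distance"],
    "handle engine emergency": ["engine temperature", "oil pressure"],
}

# Bottom-up: salient cues that should activate a new goal.
CUE_TRIGGERS = {"engine warning chime": "handle engine emergency"}

def process_cycle(cues, active_goal):
    """One cycle: cues may switch the active goal (data driven); the goal's
    mental model then directs which information is attended to (goal driven)."""
    for cue in cues:
        active_goal = CUE_TRIGGERS.get(cue, active_goal)
    return active_goal, GOAL_MODELS[active_goal]

goal, attended = process_cycle(["engine warning chime"], "navigate")
print(goal)      # handle engine emergency
print(attended)  # ['engine temperature', 'oil pressure']
```

The failure modes in the text map onto this sketch: a missing entry in the cue-trigger table corresponds to failing to process information needed for goal selection, and looking up the wrong goal's model corresponds to attentional narrowing on the wrong information.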
Adams et al (1995) point to a similar cycle of SA directing perception and perception
leading to SA based on Neisser’s model of the perceptual cycle. They do not, however,
specifically address the role of goals and goal juggling in this process. I feel a recognition
of goals is key to understanding SA. Without understanding operator goals, the
information in the environment has no meaning. In the words of Flach (1996), the construct
of SA essentially allows researchers and designers to address the issue of meaning,
something that has heretofore been all too frequently lacking in our science.
In addition, preconceptions or expectations influence the formation of situation
awareness. People may have certain expectations about what they will see or hear in a
particular environment. This may be due to mental models, prior experiences, instructions, or
other communications. These expectations will influence how attention is deployed and the
actual perception of information taken in. That is, there is a tendency for people to see what
they expect to see (or hear).
Expectations may be formulated based on the active mental model and prior
experience. They also may be developed through instructions or other communications.
Pilots, for example, frequently develop strong expectations based on the pre-flight briefing.
These expectations can provide a very efficient mechanism for filtering the data in the
environment to a subset that is expected to be useful. Adams et al (1995) state that
“anticipation reflects the focal schema’s preparedness for certain information” (p. 89).
Adelman et al (1996) report that contextual information, provided by early
information, acts to trigger different information processing strategies, thus influencing
which information is attended to and how conflicting information is explained. This
description is consistent with the model presented here and with the findings of Jones
(1997) who found that even highly experienced air traffic controllers developed quite
elaborate stories to explain away conflicting information that was inconsistent with the
mental model created by earlier information. Jones and Endsley (1996) found that in
approximately half of the cases where incorrect mental models had been the source of a SA
error, information was mismatched to the false expectation created by the incorrect mental
model and thereby misinterpreted.
Taylor, Endsley and Henderson (1996) demonstrated that expectations can form a
two-edged sword. Teams who were given a well-developed set of expectations based on a
pre-mission video briefing tended to fall into traps easily when events and situations did not
unfold as expected. They also tended not to develop the same type of group skills for
active problem solving as teams who were not given the video briefing. This
demonstrated that while expectations can be a useful mechanism for SA, they also can
contain certain dangers.
Finally, automaticity is another mechanism developed with experience that can
influence situation awareness. With experience, the pattern-recognition/action-selection
sequence can become highly routinized and developed to a level of automaticity (Logan,
1988). This mechanism provides good performance with a very low level of attention
demand in certain well-understood environments. In this sense, automaticity can positively
affect situation awareness by reducing demands on limited attention resources, particularly
for demanding physical tasks. Situation awareness can be negatively impacted by
automaticity of cognitive processes, however, due to a reduction in responsiveness to novel
stimuli. Information that is outside the routinized sequence may not be attended to. Thus,
situation awareness may suffer when that information is important.
An issue of discussion has centered on whether or not people acting with
automaticity really need to have a high level of conscious SA. The example would be a
person driving over a familiar route without really thinking about it, turning and stopping
where appropriate yet unable to consciously remember the past few minutes of the trip. As
opposed to the view of performance presented here, where environmental cues are
interpreted to form SA, produce decisions and then actions (Figure 7), automaticity would
be seen to short-circuit that process. A more direct stimulus-response pairing would be
elicited and conscious awareness and decision making would thus be very low. While the
advantage of automaticity is the low level of attention required by the task, it also has
significant hazards for SA and performance. Namely, such direct stimulus-response pairing
(a) does not allow for novel integration of information, (b) people under automaticity tend to
be non-receptive to novel events, and (c) errors tend to occur when there must be a change in
the learned pattern.
While people may indeed perform many tasks with a fair degree of automaticity, I
think it can be argued that in the highly cognitive, complex systems that we are typically
interested in, the need for SA remains. In most real-world domains, people must perform
based on more than just stimulus-response. They need to combine information and
anticipate events that are beyond their experience. They must be proactive, not just reactive.
They must be goal driven and not just data driven. These aspects of successful performance
require SA and are at odds with automaticity of cognitive behaviors.
Figure 7. Automaticity in Cognitive Processes
Processes vs. Product: Situation Assessment and Situation Awareness
Figure 4 differentiates between cognitive processes involved in achieving SA
(situation assessment) and the resultant state of knowledge about the situation (situation
awareness). It is widely recognized that each affects the other in an ongoing cycle. In the
Endsley model, the alternation of data-driven and goal-driven processes is seen as a central
mechanism in this process. The goals, expectations, mental model and current situation
model all act to affect the processes employed (see Figure 4). The processes determine the
resulting SA. Adams, et al. (1995) stress the importance of the inter-relationship between
one’s state of knowledge, or SA, and the processes used to achieve that knowledge. Framed
in terms of Neisser’s (1976) model of perception and cognition, they make the point that
one’s current knowledge affects the process of acquiring and interpreting new knowledge in
an ongoing cycle. This agrees with Sarter and Woods (1991) statement that SA is “the
accessibility of a comprehensive and coherent situation representation which is continuously
being updated in accordance with the results of recurrent situation assessments”. Smith
and Hancock (1994) further support this proposition by stating that “SA is up-to-the-minute
comprehension of task relevant information that enables appropriate decision
making under stress. As cognition-in-action (Lave, 1988), SA fashions behavior in
anticipation of the task-specific consequences of alternative actions” (p. 59). For purposes
of clarity, however, situation assessment, as an active process of seeking information from
the environment, is defined separately from situation awareness as the resultant of that
process.
Measurement of SA
Why Measure SA?
Some people have feared that SA will be a concept limited in utility because we
cannot measure it; perhaps important to operators, but ultimately circular from a scientific
standpoint. I believe the opposite is true. SA is a beneficial concept precisely because we
can measure it. The direct measurement of SA provides a great insight into how operators
piece together the vast array of available information to form a coherent operational picture.
The measurement of SA provides a useful index for evaluating system design and training
techniques and for better understanding human cognition. As an intervening variable
between stimulus and response, the measurement of SA provides far greater diagnosticity
and sensitivity than is typically available from performance measures.
One of the chief reasons for measuring SA has been for the purpose of evaluating
new system and interface designs. To determine the degree to which new technologies or
design concepts actually improve (or degrade) operator SA, it is necessary to systematically
evaluate them based on a measure of SA, thus providing a determination of which ideas have
merit and which have unforeseen negative consequences. Explicit measurement of SA
during design testing determines the degree to which design objectives have been met.
For many systems, operator situation awareness and workload can be directly
measured during design testing in addition to operator performance measures. High level
performance measures (as collected during the limited conditions of simulation testing) are
often not sufficiently granular or diagnostic of differences in system designs. Thus, even
when one system design concept is superior to another in providing the operator with
needed information in a format that is easier to assimilate, the benefits may go unnoticed,
either because of the limited conditions of simulation testing or because operators expend
extra effort to compensate for a design concept’s deficiencies.
If situation awareness is measured directly, it should be possible to select design
concepts that promote situation awareness, and thus increase the probability that operators
will make effective decisions and avoid poor ones. Problems with situation awareness,
frequently brought on by data overload, non-integrated data, automation, complex systems
that are poorly understood, excess attention demands, and many other factors, can be
detected early in the design process and corrective changes made to improve the design.
Evaluation of Training Techniques
In addition to evaluating design concepts, a measure of SA may also be useful for
evaluating the impact of training techniques on SA. Although relatively little work has been
conducted to date on the effects of different training approaches on operator SA, in theory
the same principles apply as those discussed for evaluating design concepts.
Investigations of the Situation Awareness Construct
The measurement of SA is essential for conducting studies to empirically examine
factors that may affect SA, such as individual abilities and skills, or the effectiveness of
different processes and strategies for acquiring SA, and for investigating the nature of the
SA construct itself. The theories discussed here can only be expanded upon and developed
further through such efforts.
Requirements for SA Measures
To adequately address these goals, however, the veracity of available SA measures
needs to be established. Ultimately, validity and reliability must be established for any SA
measurement technique that is used. It is necessary to establish that a metric (a) indeed
measures the construct it claims to measure and is not a reflection of other processes, (b)
provides the required insight in the form of sensitivity and diagnosticity, and (c) does not
substantially alter the construct in the process, thereby producing biased data and altered behavior.
In addition, it can be useful to establish the existence of a relationship between the measure
and other constructs as would be predicted by theory. To this end, the implications of SA
information for SA measurement approaches will be discussed.
Implications of SA Theory for Measurement of SA
Several conclusions can be drawn from the theoretical viewpoints of SA that have
significant implications for developing and selecting measures of situation awareness.
Processes vs. States
First, situation awareness as defined here is a state of knowledge about a dynamic
environment. This is different from the processes used to achieve that knowledge. Different
individuals may use different processes (information acquisition methods) to arrive at the
same state of knowledge, or may arrive at different states of knowledge based on the same
processes due to differences in comprehension and projection of acquired information or
different mental models, schema, etc. Measures that tap into situation assessment
processes, therefore, may provide information of interest in understanding how people
acquire information; however, they will only provide partial and indirect information
regarding a person’s level of situation awareness. While there may be experimental need for
both types of measures, great care should be taken in attempting to infer one from the other.
Situation Awareness, Decision Making, and Performance Disconnect
Secondly, just as there may be a disconnect between the processes used and the
resultant situation awareness, there may also be a disconnect between situation awareness
and the decisions made. With high levels of expertise in well-understood environments,
there may be a direct situation awareness-decision link, whereby understanding what the
situation is leads directly to selection of an appropriate action from memory. This is not
always the case, however. Individuals can still make poor decisions with good situation
awareness.
The relation between situation awareness and performance, therefore, can be viewed
as a probabilistic link. Good situation awareness should increase the probability of good
decisions and good performance, but does not guarantee it. Conversely, poor situation
awareness increases the probability of poor performance, although in many cases it does not
lead to a serious error. For instance, being disoriented in an aircraft is more likely to lead to
an accident when flying at low altitude than when flying at high altitude. Lack of situation
awareness about one’s opponent in a fighter aircraft may not be a problem if the opponent
also lacks situation awareness. In relation to situation awareness measurement, these issues
indicate that behavior and performance measures are only indirect indices of operator SA.
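The probabilistic link between SA and performance can be illustrated with a toy simulation. The probabilities used here are invented purely for illustration, not empirical values:

```python
import random

# Toy simulation: good SA raises the probability of good performance but
# guarantees nothing. The probabilities (0.9, 0.3) are invented, not empirical.

def performs_well(sa_good: bool, rng: random.Random) -> bool:
    p = 0.9 if sa_good else 0.3   # hypothetical success probabilities
    return rng.random() < p

rng = random.Random(42)
trials = 10_000
good_sa_rate = sum(performs_well(True, rng) for _ in range(trials)) / trials
poor_sa_rate = sum(performs_well(False, rng) for _ in range(trials)) / trials
print(f"good SA: {good_sa_rate:.2f}, poor SA: {poor_sa_rate:.2f}")  # approx 0.90 vs 0.30
```

On any individual trial either outcome can occur, which is exactly why behavior and performance measures are only indirect indices of SA: a single good outcome does not certify good SA, and vice versa.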
The way in which a person deploys his or her attention in acquiring and processing
information has a fundamental impact on situation awareness. Particularly in complex
environments where multiple sources of information compete for attention, which
information people attend to has a substantial influence on their situation awareness.
Design changes that influence attention distribution (intentionally or inadvertently) therefore
can have a big impact on situation awareness. Similarly, measurement techniques that
artificially influence attention distribution should be avoided, as they may well change the
construct that is being measured in the process.
Direct measures of situation awareness tap into a person’s knowledge of the state of
the dynamic environment. This information may be resident in working memory for a short
period of time or in long-term memory to some degree and under certain circumstances. A
significant issue for measures that attempt to tap into memory is to what degree people can
report on mental processes to make this information accessible.
Automaticity may influence memory recall. With automaticity, there is very little
awareness of the processes used. A careful review of literature regarding automatic
processing, however, reveals that while people may not be able to accurately report
processes used in decision making, they are usually aware of the situation itself at the time
(Endsley, 1995c). A low level of attention may make this information difficult to obtain
from memory after the fact, however.
Time also affects the ability of people to report information from memory. With
time there is a rapid decay of information in working memory, thus only long-term memory
access may be available. Nisbett and Wilson (1977) demonstrate that recall of mental
processes after the fact tends to be over-generalized, over-summarized, and over-rationalized,
and thus may not be an accurate view of the actual situation awareness possessed in a
dynamic sense. Real-time, immediate access of information from memory can also be
difficult, however, as this process may influence ongoing performance and decision
processes and situation awareness itself. Direct access of a person’s memory stores can be
problematic, therefore, and indicates that careful strategies for obtaining this information
must be employed.
The relationship between SA and workload has also been theorized to be important.
Taylor (1990) includes a consideration of supply and demand of resources as central to
situation awareness. Adams, et al. (1995) also discuss the task management problem,
involving prioritizing, updating task status, and servicing tasks in a queue, as central to SA.
Endsley (1993a), however, shows that for a large range of the spectrum, SA and workload
can vary independently, diverging based on numerous factors. Only when workload
demands exceed maximum human capacity is SA necessarily at risk. SA problems may
also occur under low workload (due to vigilance problems) or when workload is in some
moderate region. Situation awareness and workload, although inter-related in certain
circumstances, are essentially independent constructs in many ways.
As people can make tradeoffs between the level of effort expended and how much
they feel they need to know, it is important that both SA and workload be measured
independently in the evaluation of a design concept. A particular design may improve (or
diminish) SA, yet workload may remain stable. That is, operators may be putting forth the
same amount of effort, yet getting more (or less) in return in terms of the SA achieved.
With other designs, it may be that operators are able to maintain the same level of SA, yet
may have to work much harder. In order to get a complete understanding of the effects of a
particular design concept, therefore, both situation awareness and workload need to be
measured during design testing.
How much SA is enough?
This is perhaps the hardest question to answer regarding SA. As SA and
performance are only linked probabilistically, there is really no set threshold of SA that can
guarantee a given level of performance. That is, it may be possible (through luck) to
perform well on occasion with only a very low level of SA. Conversely, a person with even
a high level of SA will not always perform well. It is probably better to think of SA in
relative terms. As the level of SA possessed by an individual increases, the probability of
making good decisions and performing well also increases. Therefore, in evaluating new
design concepts or training programs, in general it will be necessary to make relative
comparisons. We are left with the question of whether a given design increases operator
SA as compared to a previous design, or whether it provides better SA than that possessed
by rivals (e.g. enemy forces in a military context).
In making such assessments, it is also fair to say that there is no such thing as too
much SA. More SA is always better. As SA is operationally defined as that which one
really needs to know, gaining more information and understanding of these elements is
always better and increases the likelihood of effective decision making and performance.
(While a person can have too much extraneous information that interferes with accessing
and processing needed information, extraneous information, by definition, is not part of SA.)
In making this statement, however, it should be carefully noted that one should
ensure that SA on one aspect of the situation is not gained at the cost of another, equally
important aspect. If ideal SA is perfect knowledge on all relevant aspects of the situation,
then this establishes a yardstick against which to measure. People rarely will reach this
level; however, it allows us to determine the degree to which different system designs allow
the operator to approach the ideal level of SA. This may vary for different elements of the
situation, as SA has generally been found to be a multi-dimensional construct and not a
unitary one (Endsley, 1990b; Endsley & Rodgers, 1998; Endsley, Selcon, Hardiman, &
Croft, 1998b). Therefore, as designers, we constantly seek to raise operator SA across a
wide range of SA requirements.
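Treating perfect knowledge as the yardstick, comparison across designs can be made per element of the situation rather than as a single aggregate score. A minimal sketch, with hypothetical elements and invented scores:

```python
# Hypothetical per-element SA accuracy scores (proportion of SA queries
# answered correctly), compared against the ideal of perfect knowledge (1.0).
# The elements and values are invented for illustration.

design_a = {"altitude": 0.9, "traffic": 0.5, "weather": 0.7}
design_b = {"altitude": 0.8, "traffic": 0.8, "weather": 0.75}

def sa_shortfall(scores, ideal=1.0):
    """Gap from ideal SA, reported separately for each element of the situation."""
    return {element: round(ideal - score, 2) for element, score in scores.items()}

print(sa_shortfall(design_a))  # {'altitude': 0.1, 'traffic': 0.5, 'weather': 0.3}
print(sa_shortfall(design_b))  # {'altitude': 0.2, 'traffic': 0.2, 'weather': 0.25}
```

Reporting the shortfall per element rather than as one number reflects the multi-dimensional character of SA noted above: a design may move the operator closer to the ideal on one element while moving further from it on another.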
Numerous approaches to measuring situation awareness have been proposed and
will be reviewed in subsequent chapters. Ultimately, each class of measures may have
certain advantages and disadvantages in terms of the degree to which the measure provides
an index of situation awareness. In addition, the objectives of the researcher and the
constraints of the testing situation will also have a considerable impact on the
appropriateness of a given measure of SA. Certain classes of measures may be highly
suitable for qualitative investigations of SA processes, for instance, yet be inadequate for
design testing, and vice versa. Regardless, it is vital that the veracity of any measure used
for measuring SA be established, so that informed research and design testing can take
place.

References
Adams, M. J., Tenney, Y. J., & Pew, R. W. (1995). Situation awareness and the cognitive
management of complex systems. Human Factors, 37(1).
Adelman, L., Bresnick, T., Black, P. K., Marvin, F. F., & Sak, S. G. (1996). Research with
Patriot air defense officers: Examining information order effects. Human Factors,
Arback, C. J., Swartz, N., & Kuperman, G. (1987). Evaluating the panoramic cockpit
controls and displays system. In Proceedings of the Fourth International Symposium
on Aviation Psychology (pp. 30-36). Columbus, OH: The Ohio State University.
Carmino, A., Idee, E., Larchier Boulanger, J., & Morlat, G. (1988). Representational errors:
Why some may be termed diabolical. In L. P. Goodstein, H. B. Anderson, & S. E.
Olsen (Eds.), Tasks, errors and mental models (pp. 240-250). London: Taylor and
Francis.
Cohen, M. S., Freeman, J. T., & Wolf, S. (1996). Metarecognition in time-stressed decision
making: Recognizing, critiquing, and correcting. Human Factors, 38(2), 206-219.
Cowan, N. (1988). Evolving conceptions of memory storage, selective attention, and their
mutual constraints within the human information processing system. Psychological
Bulletin, 104(2), 163-191.
Dominguez, C. (1994). Can SA be defined? In M. Vidulich, C. Dominguez, E. Vogel, & G.
McMillan (Eds.), Situation awareness: Papers and annotated bibliography (AL/CF-TR-
1994-0085) (pp. 5-15). Wright-Patterson AFB, OH: Armstrong Laboratory.
Dreyfus, S. E. (1981). Formal models vs. human situational understanding: Inherent
limitations on the modeling of business expertise (ORC 81-3). Berkeley: Operations
Research Center, University of California.
Durso, F. T., & Gronlund, S. D. (in press). Situation awareness. In F. T. Durso, R.
Nickerson, R. Schvaneveldt, S. Dumais, M. Chi, & S. Lindsay (Eds.), Handbook of
applied cognition. New York: Wiley.
Endsley, M. R. (1987). SAGAT: A methodology for the measurement of situation
awareness (NOR DOC 87-83). Hawthorne, CA: Northrop Corporation.
Endsley, M. R. (1988). Design and evaluation for situation awareness enhancement. In
Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 97-101). Santa
Monica, CA: Human Factors Society.
Endsley, M. R. (1990a, March). Objective evaluation of situation awareness for dynamic
decision makers in teleoperations. Paper presented at the Engineering Foundation
Conference on Human-Machine Interfaces for Teleoperators and Virtual
Environments, Santa Barbara, CA.
Endsley, M. R. (1990b). Situation awareness in dynamic human decision making: Theory
and measurement. Unpublished doctoral dissertation, University of Southern
California, Los Angeles, CA.
Endsley, M. R. (1993a). Situation awareness and workload: Flip sides of the same coin. In
R. S. Jensen & D. Neumeister (Eds.), Proceedings of the Seventh International
Symposium on Aviation Psychology (pp. 906-911). Columbus, OH: Department of
Aviation, The Ohio State University.
Endsley, M. R. (1993b). A survey of situation awareness requirements in air-to-air combat
fighters. International Journal of Aviation Psychology, 3(2), 157-168.
Endsley, M. R. (1994). Situation awareness in FAA Airway Facilities Maintenance Control
Centers (MCC): Final Report. Lubbock, TX: Texas Tech University.
Endsley, M. R. (1995a). Measurement of situation awareness in dynamic systems. Human
Factors, 37(1), 65-84.
Endsley, M. R. (1995b). A taxonomy of situation awareness errors. In R. Fuller, N.
Johnston, & N. McDonald (Eds.), Human Factors in Aviation Operations (pp. 287-
292). Aldershot, England: Avebury Aviation, Ashgate Publishing Ltd.
Endsley, M. R. (1995c). Toward a theory of situation awareness in dynamic systems.
Human Factors, 37(1), 32-64.
Endsley, M. R. (1996). Situation awareness measurement in test and evaluation. In T. G.
O'Brien & S. G. Charlton (Eds.), Handbook of human factors testing & evaluation
(pp. 159-180). Mahwah, NJ: Lawrence Erlbaum.
Endsley, M. R. (1997, April). Communication and situation awareness in the aviation
system. Paper presented at the Aviation Communication: A Multi-Cultural Forum,
Endsley, M. R., & Bolstad, C. A. (1994). Individual differences in pilot situation awareness.
International Journal of Aviation Psychology, 4(3), 241-264.
Endsley, M. R., English, T. M., & Sundararajan, M. (1997). The modeling of expertise: The
use of situation models for knowledge engineering. International Journal of Cognitive
Ergonomics, 1(2), 119-136.
Endsley, M. R., Farley, T. C., Jones, W. M., Midkiff, A. H., & Hansman, R. J. (1998a).
Situation awareness information requirements for commercial airline pilots (ICAT-98-
1). Cambridge, MA: Massachusetts Institute of Technology International Center for
Air Transportation.
Endsley, M. R., & Kiris, E. O. (1995). The out-of-the-loop performance problem and level
of control in automation. Human Factors, 37(2), 381-394.
Endsley, M. R., & Robertson, M. M. (1996). Team situation awareness in aviation
maintenance. In Proceedings of the 40th Annual Meeting of the Human Factors and
Ergonomics Society (pp. 1077-1081). Santa Monica, CA: Human Factors and
Ergonomics Society.
Endsley, M. R., & Rodgers, M. D. (1994). Situation awareness information requirements
for en route air traffic control. In Proceedings of the Human Factors and Ergonomics
Society 38th Annual Meeting (pp. 71-75). Santa Monica, CA: Human Factors and
Ergonomics Society.
Endsley, M. R., & Rodgers, M. D. (1998). Distribution of attention, situation awareness,
and workload in a passive air traffic control task: Implications for operational errors
and automation. Air Traffic Control Quarterly, 6(1), 21-44.
Endsley, M. R., Selcon, S. J., Hardiman, T. D., & Croft, D. G. (1998b). A comparative
evaluation of SAGAT and SART for evaluations of situation awareness. In
Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 82-
86). Santa Monica, CA: Human Factors and Ergonomics Society.
Endsley, M. R., & Smith, R. P. (1996). Attention distribution and decision making in
tactical air combat. Human Factors, 38(2), 232-249.
Ericsson, K. A., & Kintsch, W. (1995). Long-term working memory. Psychological
Review, 102(2), 211-245.
Federico, P. A. (1995). Expert and novice recognition of similar situations. Human Factors,
37(1), 105-122.
Flach, J. M. (1995). Situation awareness: Proceed with caution. Human Factors, 37(1),
149-157.
Flach, J. M. (1996). Situation awareness: In search of meaning. CSERIAC Gateway, VI(6), 1-
Fracker, M. L. (1988). A theory of situation assessment: Implications for measuring
situation awareness. In Proceedings of the Human Factors Society 32nd Annual
Meeting (pp. 102-106). Santa Monica, CA: Human Factors Society.
Gibson, J., Orasanu, J., Villeda, E., & Nygren, T. E. (1997). Loss of situation awareness:
Causes and consequences. In R. S. Jensen & R. L. A. (Eds.), Proceedings of the Eighth
International Symposium on Aviation Psychology (pp. 1417-1421). Columbus, OH:
The Ohio State University.
Gugerty, L. (1998). Evidence from a partial report task for forgetting in dynamic spatial
memory. Human Factors, 40(3), 498-508.
Gugerty, L., & Tirre, W. (1997). Situation awareness: A validation study and investigation
of individual differences. In Proceedings of the Human Factors and Ergonomics Society
40th Annual Meeting (pp. 564-568). Santa Monica, CA: Human Factors and
Ergonomics Society.
Hartman, B. O., & Secrist, G. E. (1991). Situational awareness is more than exceptional
vision. Aviation, Space and Environmental Medicine, 62, 1084-1089.
Hinsley, D., Hayes, J. R., & Simon, H. A. (1977). From words to equations. In P.
Carpenter & M. Just (Eds.), Cognitive processes in comprehension. Hillsdale, NJ:
Lawrence Erlbaum.
Holland, J. H., Holyoak, K. J., Nisbett, R. E., & Thagard, P. R. (1986). Induction:
Processes of inference, learning and discovery. Cambridge, MA: MIT Press.
IC3 (1986). Intraflight Command, Control and Communications Symposium Final Report.
Jentsch, F., Barnett, J., & Bowers, C. (1997). Loss of aircrew situation awareness: A cross-
validation. In Proceedings of the Human Factors and Ergonomics Society 41st Annual
Meeting (p. 1379). Santa Monica, CA: Human Factors and Ergonomics Society.
Jones, D. G. (1997). Reducing situation awareness errors in air traffic control. In
Proceedings of the Human Factors and Ergonomics Society 41st Annual Meeting (pp.
230-233). Santa Monica, CA: Human Factors and Ergonomics Society.
Jones, D. G., & Endsley, M. R. (1996). Sources of situation awareness errors in aviation.
Aviation, Space and Environmental Medicine, 67(6), 507-512.
Kaempf, G. L., Klein, G. A., Thordsen, M. L., & Wolf, S. (1996). Decision making in
complex naval command and control environments. Human Factors, 38(2), 220-231.
Kaempf, G. L., Wolf, S., & Miller, T. E. (1993). Decision making in the AEGIS combat
information center. In Proceedings of the Human Factors and Ergonomics Society
37th Annual Meeting (pp. 1107-1111). Santa Monica, CA: Human Factors and
Ergonomics Society.
Klein, G. (1995, November). Studying situation awareness in the context of decision-
making incidents. Paper presented at the Experimental Analysis and Measurement of
Situation Awareness, Daytona Beach, FL.
Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in
man-machine systems research (Vol. 5, pp. 47-92). Greenwich, CT: JAI Press.
Klein, G. A. (1993). Sources of error in naturalistic decision making tasks. In Proceedings
of the Human Factors and Ergonomics Society 37th Annual Meeting (pp. 368-371).
Santa Monica, CA: Human Factors and Ergonomics Society.
Klein, G. A., Calderwood, R., & Clinton-Cirocco, A. (1986). Rapid decision making on the
fire ground. In Proceedings of the Human Factors Society 30th Annual Meeting (pp.
576-580). Santa Monica, CA: Human Factors Society.
Marshak, W. P., Kuperman, G., Ramsey, E. G., & Wilson, D. (1987). Situational awareness
in map displays. In Proceedings of the Human Factors Society 31st Annual Meeting
(pp. 533-535). Santa Monica, CA: Human Factors Society.
Mogford, R. H. (1997). Mental models and situation awareness in air traffic control.
International Journal of Aviation Psychology, 7(4), 331-342.
Mosier, K. L., & Chidester, T. R. (1991). Situation assessment and situation awareness in a
team setting. In Y. Queinnec & F. Daniellou (Eds.), Designing for Everyone (pp. 798-
800). London: Taylor and Francis.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on
mental processes. Psychological Review, 84(3), 231-259.
Press, M. (1986). Situation awareness: Let's get serious about the clue-bird. Unpublished
manuscript.
Sarter, N. B., & Woods, D. D. (1991). Situation awareness: A critical but ill-defined
phenomenon. The International Journal of Aviation Psychology, 1(1), 45-57.
Sarter, N. B., & Woods, D. D. (1995). "How in the world did I ever get into that mode":
Mode error and awareness in supervisory control. Human Factors, 37(1), 5-19.
Secrist, G. E., & Hartman, B. O. (1993). Situational awareness: The trainability of near-
threshold information acquisition dimension. Aviation, Space and Environmental
Medicine, 64, 885-892.
Smith, K., & Hancock, P. A. (1994). Situation awareness is adaptive, externally-directed
consciousness. In R. D. Gilson, D. J. Garland, & J. M. Koonce (Eds.), Situational
awareness in complex systems (pp. 59-68). Daytona Beach, FL: Embry-Riddle
Aeronautical University Press.
Smith, K., & Hancock, P. A. (1995). Situation awareness is adaptive, externally directed
consciousness. Human Factors, 37(1), 137-148.
Spick, M. (1988). The ace factor: Air combat and the role of situational awareness.
Annapolis, MD: Naval Institute Press.
Taylor, R. M. (1990). Situational awareness rating technique (SART): The development of a
tool for aircrew systems design. In Situational Awareness in Aerospace Operations
(AGARD-CP-478) (pp. 3/1 - 3/17). Neuilly Sur Seine, France: NATO - AGARD.
Taylor, R. M., Endsley, M. R., & Henderson, S. (1996). Situational awareness workshop
report. In B. J. Hayward & A. R. Lowe (Eds.), Applied aviation psychology:
Achievement, change and challenge (pp. 447-454). Aldershot, UK: Ashgate Publishing Ltd.
Taylor, R. M., & Selcon, S. J. (1994). Situation in mind: Theory, application and
measurement of situational awareness. In R. D. Gilson, D. J. Garland, & J. M. Koonce
(Eds.), Situational awareness in complex settings (pp. 69-78). Daytona Beach, FL:
Embry-Riddle Aeronautical University Press.
Tenney, Y. J., Adams, M. J., Pew, R. W., Huggins, A. W. F., & Rogers, W. H. (1992). A
principled approach to the measurement of situation awareness in commercial aviation
(NASA Contractor Report 4451). Langley, VA: NASA Langley Research Center.
Wickens, C. D. (1992). Engineering Psychology and Human Performance (2nd ed.). New
York: Harper Collins.
Wilson, J. R., & Rutherford, A. (1989). Mental models: Theory and application in human
factors. Human Factors, 31(6), 617-634.