PROCEEDINGS of the HUMAN FACTORS SOCIETY 34th ANNUAL MEETING, 1990
NAVIGATING THROUGH LARGE DISPLAY NETWORKS IN DYNAMIC CONTROL APPLICATIONS

David D. Woods
Cognitive Systems Engineering Laboratory, The Ohio State University, Columbus, OH

Emilie M. Roth, William F. Stubler, Randall J. Mumaw
Westinghouse Science and Technology Center, Pittsburgh, PA
ABSTRACT

There is an increasing trend to use computer display systems as the primary "window" by which users see and interact with complex dynamic processes (e.g., air traffic control; computerized control rooms for process control). These kinds of applications offer special challenges to the design of computer based display systems. In particular, the large scope of these applications necessitates large display structures involving thousands of displays. Further, the dynamic nature of the tasks means that users need to be able to move rapidly through the display structure to keep pace with temporally evolving situations and to be able to respond to new events as they occur. As a result, special display navigation challenges arise in computer based display systems for monitoring and controlling dynamic processes.
TRENDS IN INFORMATION TECHNOLOGY
A set of technological changes is in motion with regard to information technology in dynamic control applications such as nuclear power control rooms, industrial process control, surgical operating rooms (e.g., Cook, Woods and Howie, 1990), process control systems supporting space missions, and commercial aircraft flightdecks. First, information technology is moving from separated, physically parallel devices to integrated computer based systems. Second, computer based display technology has shifted from character-oriented graphics systems to pixel-addressable systems and now to bit-mapped, multi-windowed workstations. Third, we are moving toward greater intelligence in systems, especially the combination of intelligent processing of data and powerful display technologies. Fourth, we are moving from hybrid systems, where hybrid means a mixture of hardwired and computer based instrumentation (for example, the current glass cockpits in commercial aviation are hybrid systems; Wiener, 1989), to completely computer based systems. This is happening on a variety of scales and timetables; for example, Electricite de France has implemented the first fully computer based nuclear power control room (Sun, 1990).
New Possibilities
The technological changes related to information systems also open up new possibilities. Instead of having each piece of plant data in one home (and in one form), with primary system pressure over there, volume control tank level over here, and so on, and instead of the operator, pilot, or physician going to the data, we can use the power of the computer to put each piece of plant data in a context that makes it meaningful for a particular mode of system operation (Woods, in press). We can bring the data the practitioners need to them and display the data in different ways corresponding to the practitioners' task.
Another trend is integrated smart displays. This type of
display puts together several or even tens of individual data
points into one integrated display that shows higher order
properties of the system, and it requires the use of either
heuristic or algorithmic processing power behind the surface
picture (for examples cf., Beltracchi, 1987; Woods et al.,
1987; Woods and Elias, 1988).
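To make the idea of an integrated smart display concrete, here is a minimal sketch in Python; the subcooling-margin example, the toy saturation table, and all names are illustrative assumptions of ours rather than a display described in the paper.

```python
# Illustrative sketch only: several raw data points are reduced, behind the
# surface picture, to one higher-order property that the display then shows.
# The saturation table holds rough, toy values; a real system would use
# validated steam tables and interpolation.

SATURATION_TEMP_C = {7.0: 286.0, 10.0: 311.0, 15.5: 345.0}  # pressure (MPa) -> sat. temp (deg C)

def subcooling_margin(pressure_mpa: float, coolant_temp_c: float) -> float:
    """Derive one higher-order property from two raw sensor points:
    how far the coolant temperature sits below saturation."""
    nearest = min(SATURATION_TEMP_C, key=lambda p: abs(p - pressure_mpa))
    return SATURATION_TEMP_C[nearest] - coolant_temp_c

# The integrated display renders the single derived value (and its trend)
# instead of asking the operator to combine the raw readings mentally.
print(subcooling_margin(15.5, 320.0))  # -> 25.0 deg C of margin
```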
A shift going on that may not be as apparent is that we are moving beyond the classic hierarchical organization of computer based displays. The hierarchical display system could work when we were dealing with a backfit system of as many as 30 or 40 individual computer displays. But when we are dealing with a fully computerized control center that includes thousands of displays, hierarchical organization is inadequate and does not provide the navigational tools needed to deal with the huge space of data display that is possible. Today people are beginning to use multi-windowed display systems. The real question with window managers is whether they allow the designer to address how different kinds of display frames are coordinated into a coherent workspace (Henderson and Card, 1987; Woods, in press). Guidelines for computer based display of data, while extremely weak at the level of designing specific graphic forms (Woods and Eastman, 1989), are virtually non-existent for the design of a computer based workspace rather than the physical workspace of a spatially distributed hardwired control room (but cf., Woods, 1984).
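One way to see the difference between a 30-40 page hierarchy and a full display network is a small structural sketch (Python, purely our own illustration; the display names are hypothetical): in a network, any display may link to any other, so orientation has to come from deliberate workspace design rather than from the tree structure itself.

```python
# Hypothetical sketch: a display network modelled as a directed graph.
# Unlike a strict hierarchy (one parent per display), cross-links and cycles
# accumulate as the system grows to thousands of displays, and nothing in
# the structure itself tells the user where they are or how to get back.

from collections import defaultdict

class DisplayNetwork:
    def __init__(self) -> None:
        self.links = defaultdict(set)   # display id -> display ids reachable from it

    def add_link(self, src: str, dst: str) -> None:
        self.links[src].add(dst)

    def next_options(self, current: str) -> set:
        """Displays the user can call up from the current one."""
        return self.links[current]

net = DisplayNetwork()
net.add_link("plant-overview", "feedwater-summary")
net.add_link("feedwater-summary", "pump-A-detail")
net.add_link("pump-A-detail", "pump-A-alarms")     # cross-link into the alarm views
net.add_link("pump-A-alarms", "plant-overview")    # cycle back to the top

print(net.next_options("feedwater-summary"))       # {'pump-A-detail'}
```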
Technological change is making the concept of a display page wither away, both because of windows and because of intelligent processing that customizes the display to the current data or system conditions. Furthermore, technology is pushing into the area of automatic display assembly/creation instead of laborious crafting of each individual display (e.g., Mackinlay, 1986; Roth and Mattis, 1990). Obviously, this becomes an important issue as the scope of such systems expands to an entire control center.
New Difficulties
As all of these changes are occurring, we need to ask if we are better off. The shift to more computerization in control centers, where the computer can do much more processing, doesn't eliminate all of the hard problems in control center design.
Look at the difference between a hardwired control room and a fully computer based control room. In the hardwired case, much of the design work is directly visible in the layout of controls, displays, status panels and annunciators in the physically available space. But look closely at a picture of a proposed fully computerized control room. You can see the arrangement of the computer screens and workstations, but the real design action and potential complexity is behind the screens, in the thousands of displays that an operator could call up (for example, the Electricite de France computerized control room has well over ten thousand computer displays, and a new operating room integrated patient monitoring system has well over 150 menu screens).
Before, people had to navigate a large array of spatially dedicated, physically separated displays; now they will have to navigate in a virtual space of thousands of computer displays. Note how large display networks create new HCI design challenges. The critical design bottleneck shifts from individual displays to the system of displays. Design errors that occur at the level of the interaction across displays create new types of human performance problems, including getting lost in large display networks, tunnel vision onto only a narrow subset of displays, display thrashing, and new types of mental overhead related to managing the display of data (e.g., Woods, 1984; Elm and Woods, 1985; Henderson and Card, 1987; Cook et al., 1990).
Our experience with computer based display of data in control centers comes from backfits of computers into hardwired control centers, what we referred to earlier as hybrid control centers. However, a system of thousands of displays is not a simple evolution from a 30 display page system organized in a simple hierarchy; it is a radical step-change relative to information management. The challenges in designing large display networks revolve around how to avoid users getting lost in the large space of possibilities, and how to avoid tunnel vision, or keyhole effects, where users focus in on only a small portion of the display space and are unaware of important changes in plant status that are indicated in other parts of the display space where they are not looking (Woods, in press). Large scale display systems can place new mental burdens on the operator related to information management (for specific studies showing this see Moray, 1986; Cook et al., 1990). Given that one of the problems in existing control centers is data overload in rapidly changing circumstances, the shift to more computer based systems can exacerbate this problem as well as mitigate it (Wiener, 1989; Cook et al., 1990).
The use of designer aids that support rapid prototyping of
displays, as is necessary when one is developing large scale
display systems, can lead to a proliferation of displays
without adequate consideration of across display
organization and navigation issues. In part, this can occur
because, when it is technologically easy to create and add a
new display to the system, the solution to every problem can
end up being a new computer display (this is in part why
there is such a huge number of computer displays in the new
French nuclear control room). This is one case that illustrates
a potential for misuse of rapid prototyping: with rapid
prototyping techniques, you can make the same mistakes,
only more quickly or on a larger scale.
WORKSPACE DESIGN
At the workspace level, design activities are concerned with grouping, organizing and coordinating forms to create a workspace: the set of viewports and classes of display chunks (content) that can be seen together in parallel or in series. In the extreme, only one viewport is available and each display chunk takes up the entire viewport (the historical default of a "display page"). The workspace can include multiple windows and/or multiple VDUs as viewports. The workspace includes the classes of display chunks that are available and their inter-relationships. Design of the workspace requires specification of how the classes of display chunks are mapped into the available viewports, i.e., a set of coordinated viewports/display classes. Total flexibility, i.e., any display chunk can appear in any viewport as the observer chooses, represents a failure to design the workspace (e.g., Moray, 1986; Cook et al., 1990).
A viewport is defined as any screen real estate that serves as a unit where display chunks can appear. A viewport can be a region within a single VDU (i.e., a window) or a whole VDU screen. A set of viewports (i.e., a workspace) can consist of multiple window viewports, multiple VDU viewports, and any combination of the two (Norman, Weldon, & Shneiderman, 1986).
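A minimal sketch (our own illustration, not the authors' formalism) of what specifying a workspace amounts to: an explicit mapping from viewports to the classes of display chunks allowed there. Every viewport and class name below is hypothetical.

```python
# Hypothetical workspace specification: each viewport (a window or a whole
# VDU) is restricted to certain classes of display chunks. "Total
# flexibility" would correspond to every class being allowed everywhere,
# i.e. no workspace design at all.

WORKSPACE = {
    # viewport id          -> classes of display chunks allowed there
    "wall-board":           {"plant-overview"},
    "alarm-vdu":            {"alarm-summary", "alarm-detail"},
    "working-window-left":  {"system-mimic", "trend"},
    "working-window-right": {"procedure", "trend"},
}

def can_open(chunk_class: str, viewport: str) -> bool:
    """Enforce the designer's viewport / display-class mapping."""
    return chunk_class in WORKSPACE.get(viewport, set())

assert can_open("alarm-summary", "alarm-vdu")
assert not can_open("procedure", "wall-board")   # overview real estate stays protected
```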
One of the critical forcing functions in the design of data displays for the computer medium is that the set of potentially observable display units or chunks is very much larger than the available viewports (physical display area or real estate). This characteristic of computer based display systems creates the danger of the keyhole effect, where the user is unable to maintain a broad overview, becomes disoriented, fixated, or "lost" in the display structure (Woods, 1984; Elm and Woods, 1985; Nielsen & Lyngboek, 1990). The main challenge in developing computer based display systems is how to capitalize on the computational power and display flexibility of computers while still supporting the critical monitoring functions that require the kind of rapid access to particular pieces of information that traditional control center design supports at least to some degree.
PARADIGMATIC COGNITIVE FUNCTIONS RELATED TO DISPLAY NAVIGATION
There are a number of paradigmatic user cognitive activities that arise in dynamic process control applications and that can be used to test an interface design with respect to navigation. How the control center/workspace design supports or fails to support these paradigmatic cognitive activities will determine the degree of navigational trouble and whether any keyhole effects occur. These activities constitute a set of "test cases" that can provide a guide in designing or evaluating the viability of any display navigation mechanism for dynamic control applications.
Consider the traditional control center. When you enter one, even from the back of the room, among the first things that you notice is that there are annunciator panels and/or large screen displays ("big boards") that provide some indication of overall system status to a knowledgeable observer (therefore let us refer to the displays that support this function as 'overview displays'). Stripping away the physical implementation, there are several cognitive functions that are being supported by overview displays. To understand some of these it is important to remember that control centers are almost always multi-agent settings (e.g., 2 or 3 pilots on a flightdeck plus autopilot functions; 15-20 people in the mission control center at the Johnson Space Center during a mission).
First, the overview displays provide a common frame of reference for multi-agent problem solving (Roth and Woods, 1989). The representation of the state of the process in existing control centers is held in common by virtue of its extension in a physical space. The multiple people in the setting use this property of the system representation to share data, to coordinate their activities across interacting scopes of responsibility, and to collaborate in solving problems; in other words, the representation provides support as an informational medium for collaborative work (i.e., the current interest in computer-mediated collaborative work, e.g., Stefik et al., 1987; Olson, 1989).
Second, overview displays support a rapid overall assessment of system state, what we will refer to as an orienting function. For example, when a new event changes the system (such as a safety system activation in process control), human operators quickly scan or walk the control board to update, revise and evaluate their situation assessment. During this scan the human practitioner checks automatic responses, builds partial diagnoses about what factors are at work in the system, and detects abnormalities. Note that during this period the skilled practitioner is able to detect abnormalities that he/she is not specifically alerted to (data-driven search) or looking for (knowledge-driven search). This can be termed incidental detection of abnormalities because detection occurs in the course of doing another activity. This occurs in traditional control centers because, in moving from one display to another in a physical space, one must scan the displays in between, allowing for the possibility of incidental detection of unexpected or abnormal states. Also consider a common multi-agent aspect of fault management in control centers in various domains. When trouble occurs, very often new people enter the scene to provide additional monitoring, control, or diagnostic resources. An effective representation should support the integration of these newcomers by providing a mechanism for them to size up the situation and integrate smoothly into the team, without interference in the ongoing activities of the original personnel on the scene.
A third cognitive function is the ability to rapidly shift views to track a dynamically evolving event, what we call the attentional control function. This addresses the need for attentional control in a multi-signal, interleaved task situation where new signals may interrupt ongoing lines of reasoning and activity (e.g., Miyata and Norman, 1986; Woods, in press). Consider the case where some trouble has occurred in the system; the human practitioner has detected and focused in on evaluating the source of the trouble and possible corrective actions. However, there is a danger of becoming too focused on that particular trouble spot and failing to monitor the rest of the system for potential trouble which may require a shift in attentional and action focus (for example, the L-1011 Everglades crash). An effective representation should support the ability to step back from the current detailed focus of attention and quickly size up the entire situation in a mentally economical way that does not disrupt the ongoing line of reasoning.
A fourth cognitive function is concerned with the problem of how the observer decides where to look next. A bane of today's information habitats is data overload. Technological change tends to exacerbate rather than relieve the problem. Representation design needs to support the domain practitioner in filtering irrelevant data and focusing in on relevant data given the current context. Many commentators and much research on complex dynamic control domains point out that data overload is a critical limiting factor on human performance in traditional control centers, and that the success of new technology is strongly tied to whether it helps practitioners cope with overload or exacerbates it (e.g., Woods, in press). Hence, this is a critical issue to measure in defining the success of a system in avoiding the keyhole effect.
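The four paradigmatic functions can be read as an evaluation checklist. The sketch below (Python, our own paraphrase and scoring scheme, not an instrument from the paper) shows one way such test cases might be recorded against a candidate workspace design.

```python
# Hypothetical checklist: the four cognitive functions recast as navigation
# "test cases" to apply to a candidate workspace design. The wording is a
# paraphrase of the text; the pass/fail scheme is only illustrative.

NAVIGATION_TEST_CASES = {
    "common frame of reference":
        "Can multiple agents share and point to the same process picture?",
    "orienting function":
        "After a new event, can the crew rapidly size up overall state and "
        "incidentally detect abnormalities they were not looking for?",
    "attentional control":
        "Can the user step back from a detailed focus and reassess the whole "
        "situation without losing the ongoing line of reasoning?",
    "where to look next":
        "Does the workspace filter irrelevant data and direct attention to "
        "what matters in the current context?",
}

def evaluate(design_name: str, judgements: dict) -> None:
    """Print a supported / at-risk verdict for each test case."""
    for function, question in NAVIGATION_TEST_CASES.items():
        verdict = "supported" if judgements.get(function, False) else "AT RISK"
        print(f"{design_name}: {function:27s} {verdict:9s} ({question})")

evaluate("prototype workspace A",
         {"orienting function": True, "attentional control": False})
```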
CONCLUSIONS
It is very important to note that the cognitive functions in monitoring complex dynamic systems identified above are not necessarily well supported by existing control center designs. The support that does exist for these functions in general was not a deliberate or conscious act of individual design teams but a serendipitous property of the medium of representation (physically parallel displays) and the result of a historical process of human adaptation to the interacting constraints imposed by the representation and by the task demands (e.g., for an historical analysis of one case, see Cook et al., 1990).
However, the technological shift to a computer medium is a double-edged sword. While it provides new representational power for supporting cognitive work, it also undermines some partially successful adaptations that have been worked out for the previous medium. It provides the capability to create much worse control centers (e.g., Elm and Woods, 1985), as well as much better ones (e.g., Stefik et al., 1987; Olson, 1989), than our previous baseline. The research challenge is to better understand the nature of aided human information processing and computer-mediated work. This will require integrating results and generalizing across different specific domains and different specific technological systems to identify a deeper underlying structure of concepts.
ACKNOWLEDGMENTS
The ideas reported here have been stimulated in part by research sponsored by NASA Ames Research Center to the first author on human-automation interaction in aerospace systems, and in part by research sponsored by Westinghouse Electric Corporation to the Human Sciences group of their Science and Technology Center on the design of computer based control centers. Portions of this work were presented by the first author at a National Science Foundation panel meeting on European Research and Development on Nuclear Power Instrumentation, Controls, and Safety Technology, December 1989.
REFERENCES
L. Beltracchi. A direct manipulation interface for heat engines based upon the Rankine cycle. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17:478-487, 1987.

R. I. Cook, D. D. Woods, and M. B. Howie. The natural history of introducing new information technology into a dynamic high-risk environment. In Proceedings of the Human Factors Society, 34th Annual Meeting, 1990.

W. C. Elm and D. D. Woods. Getting lost: A case study in interface design. In Proceedings of the Human Factors Society, 29th Annual Meeting, 1985.

A. Henderson and S. Card. Rooms: The use of multiple virtual workspaces to reduce space contention in a window-based graphical interface. ACM Transactions on Graphics, 5:211-243, 1987.

Y. Miyata and D. A. Norman. Psychological issues in support of multiple activities. In D. A. Norman and S. W. Draper, editors, User Centered System Design: New Perspectives on Human-Computer Interaction, Erlbaum, Hillsdale, NJ, 1986.

N. Moray. Modelling cognitive activities: Human limitations in relation to computer aids. In E. Hollnagel, G. Mancini, and D. D. Woods, editors, Intelligent Decision Support, Springer-Verlag, New York, 1986.

J. Nielsen and U. Lyngboek. Two field studies of hypermedia usability. In C. Green and R. McAleese, editors, Hypertext: Theory Into Practice II, INTELLECT, 1990.

K. L. Norman, L. J. Weldon, and B. Shneiderman. Cognitive layouts of windows and multiple screens for user interfaces. International Journal of Man-Machine Studies, 25:229-248, 1986.

M. H. Olson, editor. Technological Support for Work Group Collaboration. Erlbaum, Hillsdale, NJ, 1989.

E. M. Roth and D. D. Woods. Cognitive task analysis: An approach to knowledge acquisition for intelligent system design. In G. Guida and C. Tasso, editors, Topics in Expert System Design, North-Holland, New York, 1989.

M. Stefik, G. Foster, D. Bobrow, K. Kahn, S. Lanning, and L. Suchman. Beyond the chalkboard: Computer support for collaboration and problem solving in meetings. Communications of the ACM, 30:32-47, 1987.

W. Sun. A summary status report of French nuclear power plants information and control systems. In J. Sackett and P. Planchon, editors, Assessment of European Nuclear Instrumentation, Control and Safety Technology, National Science Foundation, 1990.

E. L. Wiener. Human Factors of Advanced Technology ("Glass Cockpit") Transport Aircraft. Technical Report 177528, NASA, 1989.

D. D. Woods. Visual momentum: A concept to improve the cognitive coupling of person and computer. International Journal of Man-Machine Studies, 21:229-244, 1984.

D. D. Woods. The cognitive engineering of problem representations. In G. R. S. Weir and J. L. Alty, editors, Human-Computer Interaction and Complex Systems, Academic Press, London, in press.

D. D. Woods and M. C. Eastman. Integrating principles for human-computer interaction into the design process. In IEEE International Conference on Systems, Man, and Cybernetics, IEEE, 1989.

D. D. Woods and G. Elias. Significance messages: An integral display concept. In Proceedings of the Human Factors Society, 32nd Annual Meeting, 1988.

D. D. Woods, J. O'Brien, and L. F. Hanes. Human factors challenges in process control: The case of nuclear power plants. In G. Salvendy, editor, Handbook of Human Factors/Ergonomics, Wiley, New York, 1987, 1724-1770.