
Abstract

Technological advances in avionics systems and components have facilitated the introduction of progressively more integrated and automated Human-Machine Interfaces and Interactions (HMI²) on-board civil and military aircraft. A detailed review of these HMI² evolutions is presented, addressing both manned aircraft (fixed and rotary wing) and Remotely Piloted Aircraft System (RPAS) specificities for the most fundamental flight tasks: aviate, navigate, communicate and manage. Due to the large variability in mission requirements, greater emphasis is given to safety-critical displays, command and control functions as well as associated technology developments. Additionally, a top-level definition of RPAS mission-essential functionalities is provided, addressing planning and real-time decision support for single and multi-aircraft operations. While current displays are able to integrate and fuse information from several sources to perform a range of different functions, these displays have limited adaptability. Further development to increase HMI² adaptiveness has significant potential to enhance the human operator's effectiveness, thereby contributing to safer and more efficient operations. The adaptive HMI² concepts in the literature contain three common elements. These elements comprise the ability to assess the system and environmental states; the ability to assess the operator states; and the ability to adapt the HMI² according to the first two elements. While still an emerging area of research, HMI² adaptation driven by human performance and cognition has the potential to greatly enhance human-machine teaming through varying the system support according to the user's needs. However, one of the outstanding challenges in the design of such adaptive systems is the development of suitable models and algorithms to describe human performance and cognitive states based on real-time sensor measurements. After reviewing the state of research in human performance assessment and adaptation techniques, detailed recommendations are provided to support the integration of such techniques in the HMI² of future Communication, Navigation, Surveillance (CNS), Air Traffic Management (CNS/ATM) and Avionics (CNS+A) systems.
_________
* Corresponding author: Roberto Sabatini
E-mail address: roberto.sabatini@rmit.edu.au
Approved for Public Release #18-0766. Distribution Unlimited. Date Approved: 04/20/18
This is the author uncorrected pre-publication version. This paper does not include the changes arising from the revision, formatting and
publishing process. The final paper (available at https://doi.org/10.1016/j.paerosci.2018.05.002) that should be used for referencing is:
Y. Lim, A. Gardi, R. Sabatini, S. Ramasamy, T. Kistan, N. Ezer, J. Vince and R. Bolia, “Avionics Human-Machine Interfaces and
Interactions for Manned and Unmanned Aircraft.” Progress in Aerospace Sciences, 2018. DOI: 10.1016/j.paerosci.2018.05.002
Avionics Human-Machine Interfaces and Interactions
for Manned and Unmanned Aircraft
Yixiang Lim a, Alessandro Gardi a, Roberto Sabatini a,*, Subramanian Ramasamy a, Trevor Kistan a,b, Neta Ezer c, Julian Vince d and Robert Bolia d
a RMIT University, Melbourne, Victoria 3001, Australia
b THALES Australia, World Trade Center, Melbourne, Victoria 3005, Australia
c Northrop Grumman Corporation, 1550 W. Nursery Rd, Linthicum Heights, MD 21090, USA
d Defence Science and Technology Group, Fishermans Bend, Melbourne, Victoria 3207, Australia
Keywords:
Adaptive Systems
Avionics
Cognitive Ergonomics
Human Factors Engineering
Human-Machine Interface and
Interaction
Human Performance Assessment
Unmanned Aerial Vehicle
Unmanned Aircraft System
Trusted Autonomy
Remotely Piloted Aircraft
Remotely Piloted Aircraft System
Contents
1. Introduction
2. Developments in Manned and Unmanned HMI2
2.1. Classic Flight Decks
2.2. First Generation Flight Decks
2.3. Second Generation Flight Decks
2.4. Third Generation Flight Decks
2.5. Military Cockpits
2.6. Single-Pilot Operations
2.7. Rotorcraft Cockpits
2.8. RPAS Ground Segment
3. Human Factors Engineering Considerations
3.1. Physical Characteristics
3.2. Display Elements: Colour
3.3. Display Elements: Symbology
3.4. Display Elements: Organization of Information
3.5. Alerting
3.6. User Controls and Inputs
4. Adaptive HMI2 Concepts
4.1. Super Cockpit
4.2. Pilot's Associate
4.3. Pilot/Vehicle Interface
4.4. Cockpit Assistant System
4.5. Cognitive Cockpit
4.6. Augmented Cognition
4.7. Cognitive Avionics Toolset
4.8. Alerting and Reasoning Management System
4.9. Intelligent Adaptive Interface
4.10. All Condition Operations and Innovative Cockpit Infrastructure
4.11. Advanced Cockpit for Reduction of Stress and Workload
4.12. Cognitive Man-Machine Interface
4.13. Cognitive Pilot Aircraft Interface
5. Driving Adaptations with Human Performance
5.1. Subjective Techniques
5.2. Task-based Measures
5.3. Analytical Techniques
5.4. Psycho-physiological Measures
5.5. Adaptation Strategies
6. Systems Design for Adaptive HMI2
6.1. System Elements
6.2. Assurance Considerations
7. HMI2 in the CNS+A Context
7.1. Four-Dimensional Trajectory-Based Operations
7.2. Performance-Based Operations
7.3. System Wide Information Management and Collaborative Decision Making
7.4. Dynamic Airspace Management
7.5. Separation Assurance and Collision Avoidance for Manned and Unmanned/Remotely Piloted Aircraft
8. Conclusions
9. Future Research
10. Acknowledgements
References
1. Introduction
Ongoing developments in avionics have introduced a number
of new systems on-board civil aircraft, such as terrain and traffic
alerting systems, engine and system monitoring and alerting
systems, flight planning and management systems, data-link
communication systems, as well as electronic information
management and flight instrumentation systems. These
technological innovations have supported higher degrees of
automation, allowing a shift from manual control towards
supervisory management in the flight deck. Increasingly,
machine intelligence and autonomy are propelling the next
generation of technological advances in the flight deck,
particularly in the domain of unmanned/remotely piloted aircraft.
While automated systems are characterised by a set of predefined
responses to planned events, autonomous systems are able to
sense, learn and adapt to changes in the environment. This
paradigm shift represents an evolution in the human-machine
interaction: human-machine interactions with automated systems
are typically limited to top-down supervisory control, but human-
machine interactions with autonomous systems will emphasize
collaboration through human-machine teaming. Autonomous
systems have the potential to contribute to improvements in
operational safety, efficiency and effectiveness but have also
introduced additional Human Factors Engineering (HFE)
considerations to the design of the associated Human-Machine
Interfaces and Interactions (HMI2). The HMI2 needs to be user-
centric by providing the human user with appropriate and timely
information and support, while avoiding overloading the human
user with excessive clutter and information. At the same time,
excessive automation can lead to underloading of the human
user, leading to automation misuse, complacency and loss of
situational awareness. More importantly, appropriate design of
the HMI2 can help to establish trust between human users and
automated systems, which promotes more effective human-
machine teaming.
Ongoing research concepts envisage the use of associate
systems, which enhance operator capabilities by appropriate
adaptations of the HMI2. Associate systems are able to recognize
situations when the human operator requires assistance and
provide the necessary support. Basic associate systems are
typically composed of task management, operator assessment and
interface adaptation modules. The task management module
monitors environmental and system conditions to determine what
actions are required by human operators. The operator
assessment module monitors the functional state and
performance of the operator to determine if additional support is
required. Information from the two modules is communicated to
the interface adaptation module, which reconfigures the HMI2
according to predetermined decision logics.
Although there exists a substantial volume of literature on
operator state assessment, there are still significant challenges
towards implementing associate systems that are able to assess
the operator functional state reliably within the operational
environment. Much of the literature points towards the use of
human performance assessment techniques for assessing certain
cognitive states of the human operator correlated to human
performance. The use of cognitive states to drive adaptation in
HMI2 (termed as Cognitive HMI2, or CHMI2) has been
demonstrated to be feasible and opens up many avenues in the
area of human-machine teaming.
In addition to reviewing current HMI2 developments, this
article provides the state-of-the-art in CHMI2 techniques,
identifying the main challenges and opportunities in this field of
research. In particular, the application of CHMI2 in the
Communication, Navigation, Surveillance (CNS), Air Traffic
Management (CNS/ATM) and Avionics (CNS+A) context has
the potential to support a number of emerging operational
concepts. These include the
management of complex trajectories, the continuous monitoring
of system performance, increased air-ground collaboration, the
inclusion of unmanned or Remotely Piloted Aircraft Systems
(RPAS) in non-segregated airspace, as well as the command and
control of multiple unmanned platforms.
The term “Unmanned Aircraft System” (UAS) refers to the
combination of an uninhabited aircraft and its ground control
elements [1]. The UAS is differentiated from the actual aircraft,
which is itself a component of the UAS and is often referred to as
the "Unmanned Aerial Vehicle" (UAV). More recently, the terms
“Remotely Piloted Aircraft System” (RPAS) and “Remotely
Piloted Aircraft” (RPA) have been introduced to provide a more
accurate description of such systems [2, 3], since the aircraft is
typically not completely unmanned, but is controlled by
remote crew members. However, at present, the terms UAS and
RPAS are used interchangeably in the literature. For the sake of
consistency, the acronym RPAS will be used in this article to
describe the system in its entirety, while RPA will be used to
describe the aircraft platform.
2. Developments in Manned and Unmanned HMI2
This section presents an overview of the HMI2 for manned
and unmanned/remotely piloted aircraft. The evolution of civil
flight decks is first presented, followed by a brief overview of
military cockpits, and finally some emerging concepts for Single-
Pilot Operations (SPO) as well as for the control and
coordination of RPA platforms. Early human factors research
originated between WWI and WWII, and was characterised by
the development of methods for pilot selection and training, as
well as in aerospace medicine and physiology. Developments in
flight deck automation between the late-1930s to the 1960s
shifted the focus of human factors research towards physical
ergonomics and aviation psychology, which helped to guide the
design, configuration and layout of the controls and display
instrumentation in crew stations. In the late-1970s, Crew
Resource Management (CRM) was introduced to aviation human
factors, targeted at reducing human error and improving flight
crew performance. With the introduction of glass cockpits in the
late-1980s, human factors research has increasingly focused on
cognitive ergonomics, particularly on the cognitive processes
involved in higher-level information processing, decision making
and automation management. As next generation flight decks
trend towards human-machine interactions with autonomous and
unmanned platforms, human factors research will need to address
the important challenges surrounding trusted autonomy and
human autonomy teaming. Figure 1 illustrates this historical
development across the three flight deck eras (mechanical, electro-mechanical and electro-optical) as described by Jukes [4], Jacobsen et al. [5], Moir et al. [6] and Abbott [7], along with
the shift in the focus of human factors research [8]. This review
focuses on the HMI design in “classic” electro-mechanical flight
decks, as well as the first, second and third generations of electro-
optical (“glass”) flight decks.
Fig. 1. Historical evolution of civil flight decks.
2.1. Classic Flight Decks
The evolution of flight decks can be traced back to the classic
flight decks of the 1960s, such as the Boeing 727-200 (Fig. 2).
These flight decks feature electro-mechanical instrumentation
requiring the operation of three to five crew members
(comprising the pilot, co-pilot, flight engineer, as well as
navigator and radio operator). HMI2 on classic flight decks are
characterised by low levels of information integration and low
levels of automation. Early warning systems introduced in the
1970s, such as the Ground Proximity Warning System (GPWS),
provided limited automated monitoring functionalities. Gradual
advances in flight deck autonomy have led to de-crewing, with
the navigator and radio operator roles being taken over by
advanced functions in the late-1970s.
Fig. 2. Boeing 727 classic flight deck.
Basic Six
The basic six flight instruments are meant to support pilots in maintaining awareness of their aircraft's essential flight states and are depicted in Fig. 3. These instruments have traditionally been either gyroscopic or air pressure-based and comprise:
- Airspeed Indicator (ASI), which provides information on the indicated airspeed (in knots) of the aircraft. Markings are used to provide indications of critical airspeed limits such as take-off, stall, cruise and maximum speeds.
- Attitude Indicator, which indicates the aircraft's pitch and roll. Information is displayed as an artificial horizon (coloured to represent the sky, ground and horizon) with markings to indicate the pitch and bank angles.
- Altimeter, which functions as a barometer to indicate the altitude (feet) of the aircraft. To account for variations in atmospheric pressure, the pilot calibrates the altimeter by using an adjustment knob to set the local pressure, which is displayed on a Kollsman Window. The local pressure is obtained through advisories from Air Traffic Control (ATC) or weather reporting stations (a worked sketch of this barometric relationship is given after Fig. 3).
- Turn and Slip Indicator, which provides information on the roll and yaw of the aircraft and is used to perform a coordinated turn.
- Heading Indicator, which provides heading information independent of the magnetic compass. Errors caused by precession due to friction lead to heading drift, and the rotation of the earth also leads to wander in the gyroscope. The error can be corrected by slaving the indicator to a magnetic sensor, which allows the heading to be constantly corrected; otherwise, pilots need to manually realign the indicator once every ten to fifteen minutes.
- Vertical Speed Indicator (VSI), which indicates the rate-of-climb (feet per minute) of the aircraft.
Fig. 3. Basic six flight instruments [9].
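As a concrete illustration of the altimeter calibration described above, the following minimal sketch (in Python) applies the ICAO standard-atmosphere pressure-altitude relation, referenced to the local pressure setting dialled into the Kollsman window. The function name and the single-formula treatment of the altimeter setting are illustrative simplifications, not a description of certified altimetry.

```python
# Minimal sketch of barometric altimetry using the ICAO standard atmosphere.
# Illustrative only: real altimeters realise this relation mechanically or
# digitally, and certified QNH handling is more involved than this one-liner.

def indicated_altitude_ft(static_pressure_hpa: float,
                          altimeter_setting_hpa: float = 1013.25) -> float:
    """Indicated altitude from static pressure and the Kollsman-window setting.

    The setting (e.g., QNH from ATC or a weather station) replaces the
    standard sea-level pressure as the reference in the standard-atmosphere
    pressure-altitude relation.
    """
    return 145366.45 * (1.0 - (static_pressure_hpa / altimeter_setting_hpa) ** 0.190284)

# Example: 850 hPa at the static port with QNH 1020 hPa set.
print(round(indicated_altitude_ft(850.0, 1020.0)))  # roughly 5,000 ft (4,957)
```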
Attitude Directional Indicator and Horizontal Situation Indicator
The Attitude Directional Indicator (ADI) as illustrated in Fig.
4 is an evolution of the attitude indicator. Besides providing basic
attitude information, the ADI incorporates flight director overlays
to provide instruction on intercepting and maintaining the desired
flight path. To support Instrument Landing System (ILS)
approaches the ADI also displays glideslope, localizer, speed
deviation and decision height information. The flight director
display is connected to a flight director computer, which
computes the necessary guidance instructions from altitude,
airspeed, attitude, heading, navigation, navigation aid (NAVAID)
data (e.g., from VOR/DME) as well as autopilot mode inputs.
Inclinometers, used in the Turn and Slip Indicator, are an
additional component of the ADI to assist the pilot in coordinating
turns.
Fig. 4. Attitude directional indicator.
The Horizontal Situation Indicator (HSI), depicted in Fig. 5,
is an evolution of the heading indicator. In addition to providing
basic heading information, the HSI also includes a Course
Deviation Indicator (CDI) overlay to support radio-based
navigation. When the aircraft’s Very High Frequency (VHF)
Omnidirectional Range (VOR) receiver is tuned to the frequency
of a selected VOR station, the course select knob on the HSI can
be tuned to intercept a chosen radial from the VOR station. The
selected course is indicated on the course select pointer. The
TO/FROM indicator is used to show if the aircraft is flying
towards (if pointing in the same direction as the course select
pointer) or away from (if pointing in the opposite direction to the
course select pointer) the VOR station. The course deviation bar
indicates left/right deviations from the selected course; each
interval on the course deviation scale corresponds to a deviation
of 2 degrees. The HSI is also used in ILS approaches, but instead
of indicating VOR information, the CDI overlay now shows the
course deviation on the localizer. Glideslope pointers are also
displayed on the HSI during ILS approaches.
Fig. 5. Horizontal Situation Indicator [10].
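To make the CDI behaviour described above concrete, the sketch below converts a selected course and the measured bearing to the tuned VOR station into a TO/FROM flag and a deviation-bar displacement, using the 2-degrees-per-interval scaling mentioned in the text. The helper is hypothetical and the sign conventions are simplified for illustration.

```python
# Illustrative CDI logic for VOR navigation; an assumed helper, not avionics code.

def cdi_state(selected_course_deg: float, bearing_to_station_deg: float):
    """Return (to_from, deviation_intervals) for an HSI course deviation bar.

    One interval on the course deviation scale corresponds to 2 degrees of
    deviation, as noted in the text.
    """
    # Angular difference wrapped to [-180, 180) degrees.
    diff = (bearing_to_station_deg - selected_course_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= 90.0:
        to_from, deviation_deg = "TO", diff          # station ahead on this course
    else:
        to_from = "FROM"                             # station behind
        deviation_deg = diff - 180.0 if diff > 0 else diff + 180.0
    return to_from, deviation_deg / 2.0

print(cdi_state(90.0, 94.0))   # ('TO', 2.0): two intervals of deviation
print(cdi_state(90.0, 272.0))  # ('FROM', 1.0)
```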
Fig. 6. Evolution of flight deck displays (classic, first-, second- and third-generation display elements and input modalities).
Table 1. GPWS alerts, adapted from [11]. Warnings are accompanied by a red blinking "master" warning light, cautions by an amber blinking light.
- Mode 1 (excessive diving speed): caution "Sink rate"; warning "Whoop whoop, pull up".
- Mode 2A (excessive rate of closure, flaps retracted): caution "Terrain, terrain"; warning "Whoop whoop, pull up"; recovery "Terrain, terrain".
- Mode 2B (excessive rate of closure, flaps extended): "Terrain, terrain".
- Mode 3 (loss of altitude following take-off): "Don't sink".
- Mode 4A (insufficient terrain clearance when the aircraft is not in the proper landing configuration; landing gear retracted): "Too low, gear" at low airspeed (e.g., < 190 kts); "Too low, terrain" at high airspeed (e.g., > 190 kts).
- Mode 4B (landing gear extended but flaps not in landing position): "Too low, flaps" at low airspeed (e.g., < 154 kts); "Too low, terrain" at high airspeed (e.g., > 154 kts).
- Mode 5 (excessive deviation below the glideslope): "Glide slope".
Ground Proximity Warning System
The Ground Proximity Warning System (GPWS) is an
implementation of the Terrain Awareness and Warning System
(TAWS), which is meant to provide pilots with predictive
cautions and warnings to avoid Controlled Flight Into Terrain
(CFIT) incidents. GPWS primarily uses a low range radio
altimeter to calculate the altitude of the aircraft above ground
level. Other inputs to GPWS may include air data (altitude,
altitude rate, speed) or glideslope deviation. Table 1 describes the
five operating modes of GPWS and the associated audio and
display elements.
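Because the text above names the main GPWS inputs (radio altitude and air data such as altitude rate), a Mode 1 style check can be sketched as below. The envelope thresholds are hypothetical placeholders; certified GPWS envelopes are equipment-specific curves of descent rate versus radio altitude.

```python
# Illustrative Mode 1 (excessive descent rate) check with hypothetical
# thresholds; real GPWS envelopes are certified, equipment-specific curves.

def mode1_alert(radio_altitude_ft: float, descent_rate_fpm: float) -> str | None:
    """Return 'caution' ("Sink rate"), 'warning' ("Whoop whoop, pull up") or None."""
    if radio_altitude_ft > 2450.0:          # assume the mode arms close to terrain
        return None
    # Hypothetical envelope: the permitted descent rate shrinks near the ground.
    caution_limit_fpm = 1600.0 + 2.0 * radio_altitude_ft
    warning_limit_fpm = 1.4 * caution_limit_fpm
    if descent_rate_fpm >= warning_limit_fpm:
        return "warning"                    # red blinking "master" warning light
    if descent_rate_fpm >= caution_limit_fpm:
        return "caution"                    # amber blinking light
    return None

print(mode1_alert(500.0, 3000.0))  # 'caution' under these assumed limits
```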
2.2. First Generation Flight Decks
Avionics developments in the mid-1970s allowed significant
digitalisation of flight deck systems, leading to the consolidation
of a number of legacy interfaces into combined electronic
displays. Figure 6 illustrates the historical evolution of first,
second and third generation electro-optical deck displays, which
included the Electronic Attitude Director Indicator (EADI) and
Electronic Horizontal Situation Indicator (EHSI); Electronic
Centralised Aircraft Monitor (ECAM) or Engine-Indicating and
Crew-Alerting System (EICAS); as well as Multipurpose Control
and Display Unit (MCDU) of the Flight Management System
(FMS). The Boeing 757 (shown in Fig. 7) and Airbus A310 were
among the first aircraft to feature these electronic displays.
Fig. 7. Boeing 757, first generation flight deck [12].
Traffic Collision Avoidance Systems
Traffic Collision Avoidance Systems (TCAS) belong to a
family of airborne systems providing traffic advisories and
collision avoidance protection independently from ground-based
ATC. TCAS I only provides Traffic Advisories (TA) to alert
pilots of nearby aircraft, whereas TCAS II also provides
Resolution Advisories (RA) in addition to TA and is used by the
majority of commercial aviation aircraft. A TCAS display shows
the location of proximate traffic. TA provide cautions and
warnings to indicate possible conflicts with proximate aircraft,
while RA recommend vertical manoeuvers (either a climb or
descent) to the pilot for avoiding the conflict, which are
coordinated between the two TCAS II-equipped aircraft when
possible. The major components of TCAS II are: the TCAS
computer unit, Mode S Transponder as well as the TCAS control
panel and the cockpit display [13]. Figure 8 shows a typical
TCAS display containing both TA and RA information. Other
target aircraft are depicted using geometric symbols. The type of
symbol used depends on the threat status:
- Unfilled cyan or white diamond: non-threat traffic;
- Filled cyan or white diamond: proximate traffic within 6 nmi and ±1200 ft from own aircraft;
- Amber or yellow circle: intruders triggering a TA;
- Filled red square: intruders triggering a RA.
Additional information is provided for each target aircraft:
- Relative altitude of the target aircraft to own aircraft, given in hundreds of feet. If the target aircraft is above own aircraft, the relative altitude is preceded by a '+' (plus) sign. If the target aircraft is below own aircraft, the relative altitude is preceded by a '−' (minus) sign.
- Up (↑) or down (↓) arrows to indicate if the target aircraft is climbing or descending at more than 500 fpm.
A vertical speed tape is used to indicate the current vertical
speed (by a vertical speed needle), as well as the vertical speeds
to be avoided (red band) and to be flown (green band). The
information displayed on the TCAS display is also typically
shown on other glass displays (e.g., pitch, altitude and climb rate
cues can be displayed on the EADI while traffic information can
be displayed on the EHSI).
Fig. 8. TCAS display.
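The symbology and annotation rules above can be summarised in a short sketch; the function and field names below are assumptions made for illustration and do not reflect any TCAS implementation.

```python
# Illustrative mapping from a TCAS target's threat status and relative motion
# to the display symbol and relative-altitude tag described in the text.

def target_annotation(threat_status: str, relative_altitude_ft: float,
                      vertical_rate_fpm: float) -> tuple[str, str]:
    symbols = {
        "non-threat": "unfilled cyan/white diamond",
        "proximate": "filled cyan/white diamond",  # within 6 nmi and +/-1200 ft
        "TA": "amber/yellow circle",
        "RA": "filled red square",
    }
    # Relative altitude in hundreds of feet: '+' if above ownship, '-' if below.
    hundreds = round(relative_altitude_ft / 100.0)
    tag = f"{'+' if hundreds >= 0 else '-'}{abs(hundreds):02d}"
    # Arrows only when climbing or descending at more than 500 fpm.
    if vertical_rate_fpm > 500.0:
        tag += " \u2191"
    elif vertical_rate_fpm < -500.0:
        tag += " \u2193"
    return symbols[threat_status], tag

print(target_annotation("TA", 800.0, -900.0))  # ('amber/yellow circle', '+08 ↓')
```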
A number of human factors concerns have emerged early on
in the use of TCAS, which include [14-16]:
- False alarms or nuisance alerts might lead to an erosion of trust, causing pilots to disregard future alerts;
- Advisories are issued under high time pressure, usually within a minute to collision (e.g., RA tau values range from 15 to 35 seconds), leading to high pilot stress and workload whenever advisories are triggered;
- RA are not shared with ATC or other (conflicting) aircraft (instead requiring pilots to manually report RA to ATC). This might result in decreased awareness on the ATC's part and lead to conflicting instructions;
- Incorrect response to weakening or negative RA (i.e., RA requiring a reduction in an existing vertical speed, such as "Reduce climb, reduce climb" or "Adjust vertical speed, adjust");
- Reversal RA (e.g., a descend RA reverses to a climb RA) and crossing RA (i.e., requiring own aircraft to pass through the altitude of the intruder aircraft, leading to a crossing of flight paths) are challenging to execute and contribute to pilot workload and stress.
A number of changes to the TCAS system over time have
supported the mitigation of these issues. The changes include:
- Improved TCAS logic (e.g., tau-based alerting leads to reduced false alarms at low altitudes);
- Additional TCAS advisories (e.g., RA reversals to prevent collisions in scenarios involving conflicting ATC instructions or unresponsive intruder aircraft);
- Modifications to the presentation of information (e.g., negative and weakening RA have been re-worded for increased clarity);
- Proposed downlinking or sharing of TCAS information (e.g., following the Überlingen mid-air collision accident, one of the recommendations made in the German Federal Bureau of Aircraft Accidents Investigation report was the down-linking of RA to ATC [17]).
Electronic ADI and Electronic HSI
The Electronic ADI (EADI) and Electronic HSI (EHSI) serve
as electronic replacements of the ADI and HSI. The EADI and
EHSI integrate salient flight information on a central display,
thereby reducing the need for the flight crew to cross scan
between multiple instruments. The EADI and EHSI also interface
with other avionic systems, such as the Mode Control
Panel/Flight Control Unit (MCP/FCU) or the FMS to provide
navigation and guidance information in addition to basic control
information.
In addition to ADI information, the EADI displays airspeed,
vertical speed, altitude and heading information, effectively
replacing the Basic Six as the primary source of flight
information. The EADI contains a flight mode annunciator to
display the flight and autopilot modes traditionally displayed on
the MCP/FCU. Depending on the phase of flight, EADI elements
are toggled on/off to reduce unnecessary clutter (e.g., the
decision height and glideslope are only toggled on during the
approach phase).
The EHSI combines the functionalities of the different
electromechanical navigation instruments used in classic flight
decks (e.g., HSI and RMI) with additional functionalities by
virtue of FMS integration. Depending on the information
required by the pilot, the EHSI can be switched between different
modes. The main modes include:
- MAP (or NAV): used for most phases of flight; different toggles can be used to display NAVAID, route, airport or waypoint information, as well as VNAV path deviation;
- APP (or ILS): used for approaches; the display depends on the type of approach being performed (e.g., localiser and glideslope information for an ILS approach);
- VOR: used for VOR navigation, displays the VOR indicator on top of a compass rose;
- PLN: used for flight planning in conjunction with the FMS.
The EADI and EHSI also integrate surveillance data from a
number of aircraft systems (e.g., TCAS, weather radar and
GPWS) to provide pilots with enhanced awareness of the
surrounding environment (e.g., traffic, weather and terrain).
Flight Management System
The Flight Management System (FMS) was first introduced
on the Airbus A310 and Boeing 767 in the 1980s and has become
a key avionic system on-board modern airliners. The FMS has
been described as the heart of an airplane’s flight planning and
navigation function [18], integrating data from a number of
subsystems including guidance, navigation, control, aircraft
performance, systems management as well as air-ground
communication. A typical FMS consists of one or more Flight
Management Guidance Computers (FMGC) and Control Display
Units (CDU). The tasks performed by the FMS include
performance calculations, trajectory prediction, flight planning
and optimisation, as well as vertical/lateral guidance. While early
functionality was limited to lateral navigation and vertical
guidance, later versions of the FMS have incorporated additional
functionalities including flight planning, wind and temperature
modelling, performance prediction, integration with Global
Navigation Satellite System (GNSS) data as well as Controller
Pilot Data Link Communications (CPDLC) capabilities [19].
Figure 9 illustrates a Boeing-style Multifunction Control Display
Unit (MCDU).
Fig. 9. Boeing-style MCDU.
The MCDU contains multiple pages, which allow the flight
crew to access different FMS functions. Pages are selected by
pressing the page keys. Line select and slew keys allow the user
to navigate within and between pages. The main pages on the
Airbus MCDU are:
- DIR: used to initiate a direct flight to a waypoint not in the programmed flight plan;
- PROG: provides flight information (e.g., optimum and maximum cruise flight levels) corresponding to the flight phase that is currently in progress;
- PERF: provides performance data associated with the active flight phase;
- INIT: used for pre-flight initialisation of the flight plan;
- DATA: aircraft and navigation sensor status, as well as FMGC databases and stored data;
- F-PLN: used to view and make revisions to the lateral and vertical elements of the flight plan;
- RAD NAV: displays the NAVAIDs tuned by the FMGC or selected by the pilot;
- FUEL PRED: used for fuel prediction and management;
- SEC F-PLN: used to access the secondary flight plan;
- ATC COMM: used for text-based communications between the flight crew and ATC.
The main pages on the Boeing Future Air Navigation System
(FANS) (second generation flight deck) MCDU are:
- INIT REF: used in initializing aircraft identification, position of the Inertial Reference System (IRS), performance, takeoff, approach and navigation settings, as well as providing reference data;
- RTE: used to view, input or change the origin, destination or route;
- DEP ARR: for viewing and changing departure and arrival procedures;
- ATC: used for text-based communications between the flight crew and ATC;
- VNAV: used to provide vertical performance guidance through different phases of flight;
- FIX: used to create fixes on the map from known waypoints, using radials and distances from the waypoint;
- LEGS: used to set lateral and vertical route data;
- HOLD: used to create holding patterns and add holding procedures to the active flight plan;
- FMC COMM: used for text-based communications between the flight crew and the company's airline operations centre;
- PROG: shows dynamic flight information (e.g., time, distance or fuel consumption) of the flight progress;
- N1 LIMIT: for viewing the N1 thrust limit as controlled by the FMGC, or selecting the N1 limit from a number of options (including go-around, maximum continuous, climb and cruise).
In line with the concepts originally envisioned by the
Advisory Group for Aerospace Research and Development
(AGARD) and by the FANS committee of the International Civil
Aviation Organization (ICAO), future evolutions of the FMS are
expected to allow aircraft to generate optimal 4-dimensional (4D)
trajectories, negotiate these trajectories with Air Traffic
Management (ATM) operators, and also support Separation
Assurance and Collision Avoidance (SA&CA) functionalities
[20].
Crew Alerting Systems
The 1980s also saw the introduction of Crew Alerting
Systems (CAS) such as the Electronic Centralized Aircraft
Monitor (ECAM), shown in Fig. 10, and the Engine-Indicating
and Crew-Alerting System (EICAS). The ECAM was developed
by Airbus and first used on the A310 while the EICAS was
developed by Boeing and first used on-board the 767. While
there are slight differences between the two displays, both the
EICAS and ECAM essentially serve the same purpose of
monitoring multiple aircraft systems, consolidating monitoring
data into integrated displays, alerting the flight crew of any
abnormal conditions, as well as providing relevant advisories.
Indications are classified in increasing importance: as memos
(used to recall normal or automatic selection of functions),
advisories (used when a monitored parameter drifts out of its
normal operational range), cautions (used for events requiring
crew awareness but not immediate action) or warnings (used for
events requiring immediate crew action). In the event of multiple
faults or failures, the relevant indications are prioritised
according to their level of importance.
Fig. 10. ECAM status pages for different systems, image courtesy of FAA [21].
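The prioritisation behaviour described above can be sketched compactly: pending indications are ordered by their alerting class so that, in the event of multiple faults, the most important appear first. The class ranking follows the text; the data structure and names are illustrative assumptions, not an avionics API.

```python
# Illustrative prioritisation of crew-alerting indications by importance class
# (warning > caution > advisory > memo), as described in the text.

PRIORITY = {"warning": 0, "caution": 1, "advisory": 2, "memo": 3}

def prioritise(indications: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Sort (class, message) pairs with the most important first."""
    return sorted(indications, key=lambda ind: PRIORITY[ind[0]])

pending = [
    ("memo", "SEAT BELTS"),
    ("caution", "FUEL IMBALANCE"),
    ("warning", "ENG 1 FIRE"),
    ("advisory", "HYD QTY LO"),
]
for cls, msg in prioritise(pending):
    print(cls.upper(), msg)  # ENG 1 FIRE is listed first
```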
Typically, EICAS/ECAM displays are situated in the middle
of the flight deck and are composed of an upper and lower
display. The top display provides information relevant to primary
systems (primarily engine parameters), such as:
- Primary engine parameters (thrust limit mode, N1, exhaust gas temperature, N2, fuel flow, etc.);
- Total and static air temperature;
- Quantity of fuel-on-board;
- Slat and flap position;
- Landing gear position;
- Summary messages or remedial instructions.
The lower unit typically contains secondary engine data, the
status of secondary systems in the aircraft, or remedial
procedures in the event of system malfunctions or failures.
Information is displayed on multiple pages, which can include:
- Secondary engine parameters (fuel used, oil quantity, oil pressure, oil temperature, engine vibration, engine bleed pressure, etc.);
- System status and synoptics (AC/DC electrical power, auxiliary power unit, bleed air, cabin pressurisation and oxygen, flight control systems, hydraulics, doors, landing gear, etc.);
- Remedial procedures in the form of electronic checklists.
EICAS/ECAM reduces the need for pilots to constantly
monitor system parameters and individual alerts, while providing
decision support in abnormal and emergency situations by
presenting appropriate caution, warning and advisory messages
in the event of system failures or malfunctions. EICAS/ECAM
has effectively superseded the flight engineer’s function and
enabled the transition from previous three-crew to current two-
pilot flight decks.
2.3. Second Generation Flight Decks
Second generation flight decks, such as the A320 and Boeing
747-400 (Fig. 11), are characterised by additional integration of
avionics systems, allowing further consolidation of the displays
used in first generation decks. The Electronic Flight Instrument
System (EFIS) is a general term referring to the set of all
electronic display systems used on-board second generation
flight decks. EFIS displays typically include the Primary Flight
Display (PFD) and Navigation Display (ND) but can also refer to
ECAM/EICAS displays, the MCDU, as well as the Electronic
Flight Bag (EFB). While the terms EADI/PFD, as well as the
terms EHSI/ND are used interchangeably, PFD and ND typically
refer to the displays used on newer aircraft (such as the Airbus
A350-XWB [22] or the Boeing 747-400 [23]). As the EFIS uses
standard display units, each display unit is inherently
reconfigurable and interchangeable between different display
modes. Such displays can perform multiple functions and are
therefore known as Multi-Functional Displays (MFD).
Fig. 11. Boeing 747-400, second generation flight deck [24].
Enhanced GPWS
The Enhanced GPWS (EGPWS) was introduced in 1996,
nearly two decades after the development of the classic GPWS
[25]. The EGPWS augments the classic GPWS with an internal
database comprising terrain, obstacle and airport runways as well
as the capability to relate the aircraft position to these databases,
thereby providing predictive alerting and display functionalities.
The added functionalities of the EGPWS include two additional operating modes (Modes 6 and 7) as well as a number of
enhanced functions (envelope modulation, terrain display, terrain
look ahead alerting, terrain clearance floor) [26, 27].
Mode 6 (advisory callouts) provides altitude and bank angle
callouts. Altitude callouts include decision height-based callouts
as well as altitude-based callouts; a Smart 500 Foot callout is also
available during non-precision approaches. Bank angle callouts
provide excessive bank angle advisories based on a bank angle
envelope; different envelopes are defined for different classes of
aircraft. Some of the EGPWS callouts (such as the decision
height-based callouts) have been traditionally made by the Pilot
Not Flying (PNF). Automating these callouts enhances flight
crew awareness and allows the PNF to concentrate on other tasks
during the approach and take-off phases.
Mode 7 (windshear) provides windshear alerts during take-off
and approach. An envelope is defined based on a combination of
head/tailwind as well as up/downdraft conditions. Windshear
cautions (“Caution, windshear”) are given if an increasing
headwind (or decreasing tailwind) and/or a severe updraft is
detected to exceed the defined envelope. Windshear warnings (an
aural siren followed by “Windshear, windshear, windshear”) are
given if a decreasing headwind (or increasing tailwind) and/or a
severe downdraft is detected to exceed the defined envelope.
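The caution/warning split described above lends itself to a compact sketch. The way head/tailwind change and up/downdraft are aggregated into a single index, and the envelope limit itself, are invented for illustration; certified Mode 7 envelopes are equipment- and aircraft-class-specific.

```python
# Illustrative Mode 7 windshear alerting. The shear index aggregation and the
# envelope limit are hypothetical; certified envelopes differ by equipment.

def windshear_alert(headwind_change_kts: float, updraft_fps: float,
                    envelope_limit: float = 20.0) -> str | None:
    # Positive inputs: increasing headwind / updraft (performance-increasing).
    # Negative inputs: decreasing headwind (increasing tailwind) / downdraft.
    shear_index = headwind_change_kts + 0.5 * updraft_fps
    if shear_index > envelope_limit:
        return "caution"   # "Caution, windshear"
    if shear_index < -envelope_limit:
        return "warning"   # aural siren + "Windshear, windshear, windshear"
    return None

print(windshear_alert(-25.0, -10.0))  # 'warning' under these assumed values
```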
Envelope modulation is an enhanced function to modulate the
EGPWS envelope for special approach and landing cases. When
using the classic GPWS, terrain features near specific airports
around the world have resulted in nuisance or missed alerts
during approach or radar vectoring situations. To circumvent this
issue, EGPWS stores a database of these problem areas and
adjusts the alerting envelope when operating at these locations.
For example, envelope modulation desensitises Modes 1, 2 and 4
to minimize nuisance alerts due to unusual terrain or approach
procedures.
Terrain display is an enhanced function for depicting
surrounding terrain on compatible and enabled displays (e.g.,
EFIS displays). In the plan view display, relative terrain elevation
is represented by various colours and intensities (primarily black,
green, yellow and red).
Terrain look ahead alerting is an enhanced function, which
provides more time for the flight crew to react to alerts, which are
issued typically 60 seconds (cautions) or 30 seconds (warnings)
prior to a predicted terrain conflict. The EGPWS compares the
aircraft’s position, flight path angle, track and speed against an
internal terrain database to determine possible conflicts.
Additionally, the ability to vary the look-ahead region as a
function of the flight path angle, track and altitude prevents
undesired alerts when taking off or landing. Different alerts are
also given for terrain and obstacle conflicts.
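A minimal sketch of the look-ahead principle follows: the aircraft state is projected along its track and flight path angle and compared against a terrain database, with the 60 s caution and 30 s warning horizons taken from the text. The flat-earth projection and the terrain lookup callable are simplifying assumptions for illustration.

```python
import math

# Illustrative terrain look-ahead check; the 60 s / 30 s horizons come from
# the text, while the projection and terrain lookup are simplified.

def look_ahead_alert(x_ft: float, y_ft: float, alt_ft: float, track_deg: float,
                     ground_speed_fps: float, fpa_deg: float,
                     terrain_elevation_ft) -> str | None:
    """terrain_elevation_ft: callable (x_ft, y_ft) -> local terrain elevation."""
    for horizon_s, alert in ((30.0, "warning"), (60.0, "caution")):
        dist_ft = ground_speed_fps * horizon_s
        # Project position along track and altitude along the flight path angle.
        px = x_ft + dist_ft * math.sin(math.radians(track_deg))
        py = y_ft + dist_ft * math.cos(math.radians(track_deg))
        predicted_alt_ft = alt_ft + dist_ft * math.tan(math.radians(fpa_deg))
        if predicted_alt_ft <= terrain_elevation_ft(px, py):
            return alert
    return None

# Example: descending towards a constant 2,000 ft ridge.
print(look_ahead_alert(0, 0, 2500, 90, 250, -3, lambda x, y: 2000.0))  # 'caution'
```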
Terrain clearance floor is an enhanced function, which alerts
pilots of possible premature descent during non-precision
approaches. Using runway position data, a protective envelope is
first defined around all runways. During approach, if the aircraft
descends below these envelopes, the voice alert “Too low
terrain” is triggered.
Cockpit Display of Traffic Information
The Cockpit Display of Traffic Information (CDTI) is an
evolution of the TCAS and navigational displays for providing
flight crew with greater situational awareness of surrounding
traffic, potentially supporting future self-separation concepts.
CDTI displays are designed to integrate and display traffic
information from a range of different sources, such as Automatic
Dependent Surveillance-Broadcast (ADS-B) and Traffic
Information Service Broadcast (TIS-B), and can also depict
terrain or weather information. While TCAS displays provide the
relative position, altitude and climb/descent information of
proximate traffic, as well as the associated TA and RA, CDTI
displays are expected to provide additional information such as
aircraft callsign and type, range from ownship, closure rate,
ownship/traffic trajectories as well as other forms of spatio-
temporal information. As presented in Fig. 12, CDTI information
can be displayed using two views, a plan-view Traffic Situation Display (TSD) and a side-view Vertical Situation Display (VSD), improving flight crew awareness of proximate traffic and
facilitating the execution of more complex lateral/vertical
avoidance manoeuvers.
Fig. 12. A CDTI display containing both plan and vertical traffic displays.
Multi-Functional Displays
Multi-Functional Displays (MFD) feature multiple pages,
with each page typically serving a specific function. MFD allow
pages to be switched either automatically or manually and for
information to be layered on a single page, thus allowing the
presentation of data from many sources. Emerging MFD in the
business jet market also provide multi-touch capabilities (such as
the Garmin G3X Touch and the Astronautics NEXIS Flight-
Intelligence System) and can also serve as Combined Vision
Systems (CVS) by overlaying Enhanced Vision Systems (EVS)
images onto synthetically generated terrain [28]. The FAA has
provided guidelines for evaluating the human factors associated
with MFD [29], which include the presentation and organization
of information and controls to prevent clutter or mode confusion.
The HFE considerations for designing MFD are discussed in
greater detail in Section 3 but briefly include [29-31]:
- Accessibility and sensitivity of control and input elements (e.g., push buttons and touch screens);
- Colour and symbology usage;
- Organization of information elements (e.g., menus, windows, overlays);
- Clutter management;
- Sharing and switching between different functions.
2.4. Third Generation Flight Decks
Third generation flight decks (Fig. 13) are likely to see a
consolidation of second generation displays and interfaces.
Advances in display technology will pave the way for larger
MFD, providing fully integrated and interchangeable displays.
Synthetic Vision Systems (SVS) are already in use in some
business jets such as the Bombardier Challenger 650 and are
likely to be integrated into the displays of third generation civil
flight decks. EVS such as Heads-Up Displays (HUD) are being
offered on some civil aircraft such as the Boeing 787 and the
Airbus A320. Both SVS and EVS provide increased safety for
flight in low visibility conditions such as darkness, smoke, fog or
rain. SVS and EVS concepts are currently being evaluated by
NASA [32, 33] for future operations.
Fig. 13. Evolution of current second generation flight decks: conceptual image from ALICIA (All Condition Operations and Innovative Cockpit Infrastructure), a research and development project co-funded by the European Commission under the Seventh Framework Programme [34], courtesy of Airbus and Deep Blue.
Synthetic/Enhanced/Combined Vision Systems
The intended function of Synthetic/Enhanced/Combined
Vision Systems (SVS/EVS/CVS) is to provide a supplemental
view of the external scene to provide the flight crew with an
awareness of terrain, obstacles and relevant cultural features
(which may include the runway and airport environment) [35,
36].
- SVS provides a computer-generated image of the external scene using navigation data (attitude, altitude, position) and an internal reference database of terrain, obstacles and other relevant features. Advanced guidance information, such as pathways-in-the-sky (Fig. 14), can be displayed on such systems and is an ongoing area of research. The quality of SVS images depends on the accuracy and precision of navigation data as well as the validity of the database.
- EVS uses imaging sensors (such as forward looking infrared, millimetre wave radar and/or low light level image intensifying) to provide the flight crew with a sensor-derived or enhanced image of the external scene. As such, the quality of EVS images very much depends on the type of sensors used.
- CVS combines information from synthetic and enhanced vision systems in a single integrated display.
Fig. 14. An EVS concept using the pathway-in-the-sky representation to provide approach guidance, image used with permission of the author [37].
SVS/EVS/CVS have the capability to enhance the flight
crew’s situation awareness during normal and abnormal
operations, contributing to a number of safety benefits [38, 39].
These safety benefits include:
- Enhanced vertical/lateral path awareness;
- Enhanced terrain and traffic awareness;
- Improved attitude/upset or missed approach recognition and recovery (e.g., the recovery path is depicted for additional clarity);
- Reduced runway incursions;
- Supported transition from instrument to visual flight;
- Improved compliance with ATC clearances;
- Reduced possibility of spatial disorientation.
SVS/EVS/CVS can potentially support low-visibility and closely-spaced operations. A number of potential operational benefits have been identified, including [38, 39]:
Intuitive depiction of information (e.g., ATC clearances,
airspace, traffic and weather hazards, flight path guidance,
RNP compliance, etc.);
Support for enhanced surface operations (e.g., rollout, turn
off and hold short, taxi, etc.);
Support for enhanced departure and arrival operations (e.g.,
noise abatement operations, independent operations on
parallel runways, reduced inter-arrival separation, reduced
minimums, CAT III approaches, non-ILS approaches, etc.);
Support for operations in low-visibility conditions;
Support for 4D trajectory planning, navigation and monitoring;
Possible reductions in training requirements.
The human factors considerations associated with these
vision systems include [38, 40-43]:
Image quality (e.g., field-of-view, display size, clutter,
symbology, brightness, contrast, data inaccuracy, noise, lag,
jitter, etc.);
Information integration (e.g., information presentation,
information organization, systems and display integration,
operator-related cognitive tunnelling, complacency,
workload demand, skill retention, etc.);
Operational concepts (e.g., display transitions, crew
interaction, procedural changes, failure modes, essential
information, crew trust, resource management, etc.).
Airborne Collision Avoidance System X
The next generation of collision avoidance systems is being developed to support future operational concepts. Airborne
Collision Avoidance System (ACAS) X [44, 45] is a proposed
concept that has already undergone a number of flight tests [46].
The decision logic of ACAS X is based on a probabilistic
approach, different from the rule-based logic of legacy TCAS
systems. The approach allows for uncertainties in aircraft state
[47], as well as pilot response [48] to be taken into account in the
collision avoidance decision logic, offering greater operational
flexibility and pilot acceptance in addition to enhanced safety.
For example, ACAS X decision logic can be tailored to meet
specific performance requirements for different classes (e.g.,
manned and unmanned) of aircraft, for different (e.g., reduced
separation) procedures, or optimised to reduce reversal or altitude crossing alert rates [49]. ACAS X is expected to reduce overall pilot workload, accommodate new surveillance inputs in addition to the transponders currently used by TCAS, improve operational efficiency, minimize interruptions to normal air traffic flow, and reduce costs associated with system implementation and upgrades. There are four variants of ACAS X: ACAS Xa (active), the general purpose variant that will replace TCAS II; ACAS Xo (operation), designed for particular operations where ACAS Xa is unsuitable; ACAS Xp (passive), intended for low-performance general aviation aircraft lacking certified collision avoidance protection; and ACAS Xu (unmanned), designed for both RPAS and rotorcraft platforms.
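The probabilistic principle can be illustrated with a toy sketch (illustrative costs and thresholds; the operational ACAS X logic is compiled offline into optimised lookup tables rather than evaluated by Monte Carlo sampling in flight). The code selects the advisory that minimises expected cost over sampled encounter outcomes, accounting for uncertainty in intruder vertical rate and pilot response delay:

```python
import random

ADVISORIES = {"none": 0.0, "climb": 1500.0, "descend": -1500.0}  # ft/min
NMC_COST, ADVISORY_COST = 1.0, 0.01  # near mid-air collision vs. nuisance alert
LOOKAHEAD_S, SAMPLES = 30.0, 1000
PILOT_DELAY_MEAN_S = 5.0             # assumed mean pilot response delay

def expected_cost(advisory, rel_alt_ft, rate_mean_fpm, rate_std_fpm):
    """Monte Carlo estimate of the expected cost of one advisory, accounting
    for uncertainty in intruder vertical rate and pilot response delay."""
    own_rate = ADVISORIES[advisory]
    cost = ADVISORY_COST if advisory != "none" else 0.0
    nmc = 0
    for _ in range(SAMPLES):
        intruder_rate = random.gauss(rate_mean_fpm, rate_std_fpm)
        delay = random.expovariate(1.0 / PILOT_DELAY_MEAN_S)
        fly_time = max(0.0, LOOKAHEAD_S - delay)   # advisory flown after delay
        own_alt = own_rate * fly_time / 60.0
        intruder_alt = rel_alt_ft + intruder_rate * LOOKAHEAD_S / 60.0
        if abs(intruder_alt - own_alt) < 100.0:    # vertical miss < 100 ft
            nmc += 1
    return cost + NMC_COST * nmc / SAMPLES

def select_advisory(rel_alt_ft, rate_mean_fpm, rate_std_fpm):
    """Choose the advisory minimising expected cost."""
    return min(ADVISORIES,
               key=lambda a: expected_cost(a, rel_alt_ft,
                                           rate_mean_fpm, rate_std_fpm))

# Intruder 300 ft below, climbing about 500 fpm: a climb advisory should win
print(select_advisory(rel_alt_ft=-300.0, rate_mean_fpm=500.0,
                      rate_std_fpm=300.0))
```

The small advisory cost models pilot acceptance: an alert is only issued when the reduction in collision risk outweighs the nuisance of the advisory.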
Single Cockpit Displays
Research in Europe includes the 7th Framework Programme
(FP7) One Display for a Cockpit Interactive Solution (ODICIS)
project, which presented the concept of a projection-based single
cockpit display for enhancing system architecture flexibility,
customizability of displays and continuity of information [50].
All Condition Operations and Innovative Cockpit Infrastructure
(ALICIA) was another FP7 project examining cockpit solutions
in degraded flight conditions, such as the conceptual cockpit
depicted in Fig. 13. ALICIA explored multi-modal cockpit
displays such as tactile cueing, 3-dimensional audio, touch screen
and SVS/EVS technologies [51, 52].
Speech Recognition
Speech recognition is an emerging concept that is being
considered for implementation in third generation flight decks.
Speech recognition technology can be used to augment current
input methods, has significant synergies with other multi-modal
input methods and can increase overall flight deck efficiency.
Potential applications of speech technology include [53, 54]:
Voice-based FMS inputs;
Voice-based tuning of radio frequencies;
Calling up and interacting with voice-based checklists;
Supporting cross-check and read-back during ATC-based
communications;
Synthesis of multi-modal interactions (e.g., touch screen,
gesture recognition, eye tracking) to support context-aware
inputs;
Using voice authentication to augment cyber-security;
Reduction of cultural bias or language misunderstanding
through language-neutral cockpits;
Emotion, stress and workload identification.
The three main types of speech recognition systems are
speaker dependent, speaker independent and speaker adaptive
[55]. Speaker dependent systems offer high accuracy as they are designed to be used by a single user, but
need to be trained to the user’s speech patterns, typically
requiring many hours of speech. Speaker independent systems
are designed to recognise general speech by any users and do not
require any training of the system. However, speaker independent
systems generally offer lower accuracy (or have more limited
vocabularies) than speaker dependent systems. Speaker adaptive
systems are a hybrid of the dependent and independent systems.
Speaker adaptive systems begin as a speaker dependent system
and adapt to the speaker incrementally over time, thereby
reducing the need for initial training while allowing for improved
performance over time.
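The trade-off between vocabulary size and accuracy, noted in the challenges below, can be illustrated with a minimal, hypothetical command matcher that constrains recognition hypotheses to a small flight deck grammar and rejects low-confidence results (string similarity stands in for an acoustic confidence score):

```python
import difflib

# Hypothetical constrained grammar of accepted cockpit commands
COMMANDS = [
    "tune com one one eight decimal five",
    "select flaps two",
    "display approach checklist",
    "set heading two seven zero",
]

def match_command(transcript: str, threshold: float = 0.8):
    """Return the closest in-grammar command, or None if confidence is low.

    A real speaker-adaptive recogniser would rescore acoustic hypotheses
    against this grammar; here string similarity stands in for that score.
    """
    best = difflib.get_close_matches(transcript, COMMANDS, n=1, cutoff=0.0)
    if not best:
        return None
    score = difflib.SequenceMatcher(None, transcript, best[0]).ratio()
    return best[0] if score >= threshold else None

print(match_command("selekt flaps too"))     # -> "select flaps two"
print(match_command("open the cargo door"))  # -> None (out of grammar)
```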
However, a number of challenges remain before speech
recognition technology can be successfully implemented on-
board civil aircraft. These challenges include:
Accidental/inadvertent triggering of the system leading to
unintended inputs;
Accuracy of the speech recognition system, which is
affected by a number of factors, including aircraft
background noise, user differences (e.g., tone, pitch,
accents) as well as the type of system used;
Changes in a user’s voice under different operational (e.g.,
high/low workload) conditions, which might affect system
accuracy;
Reductions in system vocabulary, which might improve
system accuracy at the cost of limiting the number of
possible applications of the system;
Training time required for speaker-adaptive systems to
adapt to the user;
User acceptance: some pilots prefer a sterile cockpit without small talk.
Enhanced Audio
Multi-modal concepts are exploring enhancements to current audio technologies to increase the effectiveness of visual displays and the situational awareness of human
operators. In current flight decks, audio is used to provide
secondary warnings, or in critical cases, to issue instructions for
evasive action (as in the case of TCAS and GPWS advisories).
Enhanced auditory displays can potentially reduce operator head
down time on visual displays, while providing an additional
channel to convey information to operators. Research in enhanced audio is focused on a number of key areas [56-58]:
3D audio for conveying spatial information (e.g., traffic
proximity and location);
Spatial separation of different auditory sources (i.e., the
cocktail party effect) to facilitate user localisation of
different types of cues;
Sonification to indicate changes in data;
Synthetic speech for voice narration of data link messages.
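As a simple illustration of the first of these areas, the sketch below spatialises a mono traffic alert using interaural time and level differences (a basic Woodworth-style approximation with assumed head dimensions; real 3D audio systems apply measured head-related transfer functions):

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.09       # m, approximate

def spatialise(mono: np.ndarray, bearing_deg: float, fs: int = 44100):
    """Pan a mono alert tone to a relative bearing (0 = ahead, +90 = right).

    Uses interaural time and level differences; a real 3D audio system
    would apply measured head-related transfer functions instead.
    """
    az = np.radians(bearing_deg)
    itd = HEAD_RADIUS * (az + np.sin(az)) / SPEED_OF_SOUND  # Woodworth model
    delay = int(round(abs(itd) * fs))                       # in samples
    gain_r = 0.5 * (1.0 + np.sin(az))                       # level difference
    gain_l = 1.0 - gain_r
    # Delay the ear farther from the source
    left = np.concatenate([np.zeros(delay if az > 0 else 0), gain_l * mono])
    right = np.concatenate([np.zeros(delay if az < 0 else 0), gain_r * mono])
    n = max(len(left), len(right))
    return np.pad(left, (0, n - len(left))), np.pad(right, (0, n - len(right)))

# 0.5 s, 1 kHz traffic alert tone, intruder at 60 degrees right
t = np.linspace(0.0, 0.5, 22050, endpoint=False)
tone = 0.3 * np.sin(2 * np.pi * 1000 * t)
left, right = spatialise(tone, bearing_deg=60.0)
```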
Enhanced audio concepts have also been explored in military
[59, 60] and RPAS applications [61-63], primarily for
representing spatial information, augmenting situational awareness and improving overall operator performance.
2.5. Military Cockpits
Similar to civil aircraft, military jet fighters can be classified
into different generations according to their capabilities as well as
the avionics technologies incorporated into each generation of
aircraft. The Australian Air Power Development Centre identifies
five generations of fighter aircraft, which are briefly summarized
in Fig. 15 [64]. In generations one to three, the displays used on-
board military cockpits were predominantly electro-mechanical,
while from generation four onwards, military cockpits began to
feature more advanced electro-optical displays. It is observed that
advances in military aviation technology are generally ten to
fifteen years ahead of their civil counterparts. Some examples
include:
The first fighter jets were introduced about a decade before
the start of the civil jet age in 1952.
Glass displays were used in third and fourth generation
military cockpits in the 1970s, approximately a decade
before the technology became available on civil flight decks
of the 1980s.
Fly-by-wire was first implemented in fourth generation
fighter aircraft such as the F-16 (introduced to service in
1978) before being adopted in civil aircraft such as the
Airbus A320 (introduced to service in 1988).
While first and second generations of fighter jets were
designed to maximise aircraft performance (e.g., speed, range,
payload capacity, manoeuvrability, etc.), the third generation of
fighter aircraft had enhanced capabilities, brought about by
advances in mission and avionics systems (e.g., stealth,
surveillance, weapons, etc.). These third generation fighters
incorporated precision-guided munitions systems, which
supported the transition to engagements beyond the visual range
and led to the development of increasingly advanced systems for
detection, acquisition, engagement and evasion of enemy aircraft.
The transition to the fourth generation of fighters saw further
advances in mission systems, allowing swing and multi-role
fighters capable of operating in air-to-air or air-to-ground roles,
or performing airborne reconnaissance, surveillance and support.
Currently, fifth generation fighters are being developed to
support network-centric warfare. Fifth generation fighter aircraft
are characterised by their inherent capability to network with
other aircraft and manage large amounts of data by intelligent
data fusion algorithms, thereby enhancing the fighter pilot’s
situational awareness and tactical decision making in increasingly
complex scenarios.
Fig. 15. Evolution of military fighter aircraft, 1940s to 2020s: subsonic (1st generation), supersonic (2nd generation), multi-role fighters (3rd generation), swing-role fighters (4th generation), multi-role missions (4.5 generation) and networked operations (5th generation), spanning the electro-mechanical and electro-optical eras through to next-generation flight decks, with the parallel evolution of civil flight decks shown for comparison (adapted from Air Power Development Centre, 2012) [64].
First to Third Generation Cockpits
Similar to flight decks of the electro-mechanical era, first to
third generation cockpits featured dedicated analogue displays
and instrumentation. Guided missiles were used on second and
third generation fighters with the support of radar and infrared
technologies. Late-second generation and third generation
cockpits also saw the use of the electro-optical radar scope. Third
generation fighters such as the F-4, as shown in Fig. 16, featured
more advanced radar systems such as the Doppler radars, which
supported “look-down, shoot-down” capabilities. The F-4 was
manned by two crew members, with the pilot sitting in the front
seat and the Radar Intercept Officer (RIO) managing the
advanced radar system in the back.
Fig. 16. F-4 Phantom II, a third generation fighter.
Fourth Generation Cockpits
Fourth generation aircraft such as the F-15A (shown in Fig.
17) saw the introduction of advanced flight control systems.
Fighters such as the F-16 and Mirage 2000 were designed with
increased responsiveness and manoeuvrability but were
inherently unstable. Fly-by-wire systems were designed to
compensate for this lack of stability by providing artificial
stability. The head-down analogue instruments used in first to
third generation fighters were gradually replaced with electro-
optical displays in fourth generation cockpits. The MFD that appeared in second generation civil flight decks of the 1990s had already featured in F-16 cockpits since the late 1970s. Similar to civil flight
decks, MFD on fighter cockpits consolidated the numerous
standalone instruments into glass displays, allowing fighter pilots
and Weapon Systems Officers (WSO) to switch between
different display configurations and functions. MFD supported
the transition of fighter aircraft with dedicated roles, to multi-role
and swing-role fighters. Typical MFD pages include:
Horizontal Situation Display (HSD);
Fire Control Radar (FCR);
Stores Management System (SMS);
Terrain Following Radar (TFR);
Forward Looking Infrared (FLIR);
Targeting Pod (TGP);
Flight Control System (FLCS);
Data Terminal Entry (DTE);
Weapon Systems (WPN).
In addition to the head down MFD, fourth generation
cockpits also feature HUD that are now beginning to appear in
modern flight decks. The HUD is considered to be the aircraft’s
PFD and displays primary flight data. Additionally, fighter HUD
typically include G readouts as well as weapon targeting
information. The HUD allows fighter pilots to maintain
situational awareness by providing pilots with crucial flight and
combat information that they would otherwise have to obtain by
shifting their gaze down toward the heads down displays.
Another feature introduced in fourth generation fighters is the
Hands on Throttle-and-Stick (HOTAS) controls. Buttons and
switches are located on the HOTAS, allowing pilots to perform
some tasks without taking their hands off the controls. Common
switches on the side-stick include the trigger, weapon release
switch and trim hat, as well as switches for managing the active
displays, targets, countermeasures, sensors and autopilot. The
throttle contains switches for managing communications,
weapons and sensors. Some civil aircraft manufacturers (such as Airbus, Bombardier and Embraer) favour the use of the side-stick over the yoke. The use of sidesticks in civil airliners frees up the space in front of the pilot, improving display visibility and allowing the area to be used for other purposes (such as incorporating alternative control devices like keyboards, trackballs, etc.). The
Gulfstream G500 is the first civil aircraft to use an active inceptor
system to provide pilots with tactile feedback.
Fig. 17. F-15A Eagle, a fourth generation fighter; image courtesy of the National Museum of the U.S. Air Force.
Four-and-a-Half Generation Cockpits
Slower aircraft development, driven by reductions in military spending, led to a half-generation increment from fourth generation fighter jets. Four-and-a-half generation fighters such
as the Eurofighter Typhoon (Fig. 18) included avionic
improvements such as the Active Electronically Scanned Array
(AESA) radar, integrated Infrared Search and Track (IRST)
systems and high capacity data-link, allowing for network centric
warfare. Four-and-a-half generation cockpits also feature
alternative input technologies such as Direct Voice Input (DVI)
and touchscreen displays.
Fig. 18. Eurofighter Typhoon, a four-and-a-half generation fighter.
Fifth Generation Cockpits
Fifth generation fighters are designed around the concept of
network centric warfare, with the ability to exchange and store information with other battlespace elements. Such aircraft are
characterised by advanced avionics systems capable of multi-
sensor data fusion, which integrates data from multispectral
and/or networked sensors to provide fighter pilots with a
consolidated view of the battlespace. Helmet Mounted Displays
(HMD) have been used since fourth generation fighters and are
an ongoing area of development in fifth generation jets. Standard
HMD provide a holographic display of aircraft data and target
information within the helmet’s visor. Images from on-board
sensors such as the Forward Looking Infrared (FLIR) and Night
Vision Imaging Systems (NVIS) can be fused to maintain pilot
situational awareness in degraded visual conditions. HMD cueing
systems allow pilots to designate and acquire targets as well as
aim sensors and weapons via head motion. Eye tracking technology is not featured on current HMD but offers the potential for even more accurate HMD cueing.
Fifth generation fighters such as the F-35 (shown in Fig. 19)
have completely replaced the HUD with HMD as the primary
flight display. The F-35’s HMD uses a distributed aperture
system to provide a 360-degree view of the aircraft’s
surroundings, supporting even greater situational awareness. The
F-35 also features a panoramic touchscreen MFD, similar to the
type of displays that will be featured in the next generation of civil
flight decks. Fifth generation fighters are also typically single-
seated, with the role of the Weapon Systems Officer (WSO) in
the back seat being replaced by other networked battlespace
support elements, as well as by higher levels of on-board
automation.
Fig. 19. F-35 Lightning II, a fifth generation fighter.
2.6. Single-Pilot Operations
SPO flight decks are currently used in the military, General Aviation (GA) and business jet domains but, given developments in flight deck autonomy and HMI² evolutions, SPO is expected to become a viable concept of operations for future commercial airliners. SPO flight decks contain HMI² elements of second and third generation two-pilot civil flight decks; business jets such as the Embraer Phenom 300 (Fig. 20) are certified for SPO and feature streamlined interfaces with wide-screen, high-resolution MFD. These MFD can incorporate SVS, touchscreen and satellite weather capabilities that are typical of third generation civil flight decks.
Fig. 20. Embraer EMB-505 Phenom 300 flight deck [65].
Most notably, NASA conducted a series of SPO-related
studies exploring a new concept of operations, whereby single pilots would be supported by a number of ground crew members
[66]. The ground crew would provide dispatch information and
communication support in nominal operations. In off-nominal
operations (such as single-pilot incapacitation), the ground crew
would serve as remote pilots executing an emergency landing.
The SPO ground station would resemble a remote pilot station,
incorporating some ATM elements (such as weather and traffic
monitoring functionalities) to support the ground crew member’s
role as dispatcher. The extension of SPO to commercial aviation
has also been investigated in a number of studies [67-70]. These
studies have suggested pilot monitoring functions to be
implemented on-board SPO flight decks in order to detect
periods of high pilot workload or pilot incapacitation.
Sufficiently intelligent on-board systems would be able to
provide adaptive and context-aware support in the case of
excessive pilot workload, or even coordinate an emergency
landing in the event of pilot incapacitation. These and similar
considerations underpin the design of cognitive monitoring
systems such as the ones described in sections 4 to 6.
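A minimal sketch of such a monitoring function is given below, with hypothetical thresholds and state measures; a deployable system would rely on validated human performance models of the kind described in Sections 4 to 6:

```python
from dataclasses import dataclass

@dataclass
class PilotState:
    workload: float    # 0..1, e.g., fused from task load and physiology
    responsive: bool   # e.g., recent control inputs or acknowledgements

def support_level(state: PilotState) -> str:
    """Map an estimated pilot state to a level of system support.

    Thresholds are illustrative only.
    """
    if not state.responsive:
        return "GROUND_TAKEOVER"      # coordinate a remote emergency landing
    if state.workload > 0.8:
        return "ADAPTIVE_AUTOMATION"  # offload tasks, declutter displays
    if state.workload > 0.5:
        return "DECISION_SUPPORT"     # offer suggestions, prioritise alerts
    return "NOMINAL"

print(support_level(PilotState(workload=0.9, responsive=True)))
```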
2.7. Rotorcraft Cockpits
Leishman [71] provides a concise history of rotorcraft flight,
beginning from the 1700s up to the 21st century. The early 1940s
marks the introduction of modern helicopters, followed by the
maturation of rotorcraft technologies in the next two decades. As
with most developments in aerospace, military applications were
the primary driver for these developments. In terms of cockpit
environment, the historical evolution of the rotorcraft cockpit
follows a similar trend as that of fixed wing aircraft, transitioning
from analogue instrumentation to glass displays as well as
featuring increasing levels of digitalisation and systems
integration. However, different operational needs served by fixed
wing aircraft and rotorcraft have led to a number of notable
divergences in their respective system and cockpit evolutions.
Rotorcraft cockpit instrumentation is largely similar to that found on fixed wing aircraft. The information required for basic
flight includes: attitude, altitude, airspeed, engine and rotor RPM,
turn-and-slip, vertical speed, manifold pressure and magnetic
heading. To support radio-based navigation, a radio control panel
and CDI/RMI are also used. Modern glass cockpits feature PFD,
ND and CAS displays, which are similar to fixed wing flight
decks. Flight control, however, is more challenging than in fixed wing aircraft. The rotorcraft pilot typically operates three
controls: the collective lever, cyclic stick and anti-torque pedals.
The collective lever is operated by the pilot’s left hand and
controls the vertical movement of the rotorcraft by adjusting
collective pitch of the main rotor. The cyclic stick is operated by
the pilot’s right hand. The combined longitudinal and lateral
inputs to the cyclic stick determine the orientation of the rotor
disk. Motion of the cyclic along the longitudinal axis (fore and
aft) controls rotorcraft pitch attitude, while motion along the
lateral axis (left and right) controls rotorcraft roll attitude. The
anti-torque pedals are operated by the pilot’s feet and control the
rotorcraft’s yaw by adjusting the pitch angle of the tail rotor
blades. As the three controls are coupled, pilots have to control
all three simultaneously, which is different from the decoupled
vertical and lateral controls used on fixed wing aircraft. Hovering
is particularly challenging, as pilots are required to continuously
make small corrections, accounting for the presence of external
disturbances, in order to keep the rotorcraft stable.
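The stabilisation burden can be illustrated with a toy single-axis hover-hold loop (made-up gains, longitudinal axis only; real rotorcraft dynamics are strongly coupled across all axes and demand far more elaborate control laws):

```python
def hover_hold_cyclic(position_err_m: float, velocity_mps: float,
                      pitch_rad: float, pitch_rate: float) -> float:
    """Toy longitudinal cyclic command for hover station-keeping.

    Cascaded PD loops: position error -> desired pitch -> cyclic input.
    Gains and limits are illustrative only.
    """
    KP_POS, KD_POS = 0.05, 0.12   # outer loop: position/velocity
    KP_ATT, KD_ATT = 2.0, 0.8     # inner loop: attitude/rate
    # Drifted forward (+err) or moving forward (+vel): demand nose-up pitch
    pitch_cmd = KP_POS * position_err_m + KD_POS * velocity_mps
    pitch_cmd = max(-0.15, min(0.15, pitch_cmd))   # limit to about 8.6 deg
    # Inner loop: positive output = aft cyclic, raising the nose
    return KP_ATT * (pitch_cmd - pitch_rad) - KD_ATT * pitch_rate

# Drifting 2 m forward at 0.5 m/s, slightly nose down: command aft cyclic
print(hover_hold_cyclic(2.0, 0.5, pitch_rad=-0.02, pitch_rate=0.0))
```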
Cockpit digitalisation
Rotorcraft avionics and cockpit digitalisation have traditionally lagged behind their fixed wing counterparts, due to a number of
factors including the rugged operational environment, a lack of
necessity for more advanced avionics, low demand, high relative
cost, increased system complexity as well as space and weight
considerations. In particular, owing to operations in degraded
visual environments and turbulent flight conditions, the avionics
systems used on-board helicopters require both higher levels of
software certification as well as more robust, vibration-tolerant
Line Replaceable Units (LRU), effectively hindering the
introduction of modern cockpit technology. Glass cockpits first
appeared around the mid-1980s on military rotorcraft, such as the
Kiowa OH-58D reconnaissance helicopter (Fig. 21). This was
approximately a decade after the adoption of glass displays in
fighter jets, and around the same time that glass cockpits first
appeared on civil fixed wing aircraft. The piloting stations of
modern helicopter cockpits typically feature two MFD for
horizontal or vertical situation information, along with a MCDU
to allow the pilot access to the FMS for flight or mission
planning (Fig. 22). Further avionics integration has supported the
inclusion of more advanced functionalities in rotorcraft MFD,
introducing additional modes supporting the display of additional
mission or flight information, which might include video
surveillance, target tracking and indication, as well as controls for
facilitating communications between air and ground elements.
In terms of flight handling, the automated flight control
systems used on-board helicopters were fairly advanced for their
time. Stability Augmentation Systems (SAS), force trim, flight path stabilisation and autopilot systems were developed between the early 1960s and the late 1970s. However,
possibly owing to increased costs and decreased demand, full
cockpit digitalisation was somewhat slower (Table 2). The first
fully Fly-By-Wire (FBW) transport helicopter was the NH90
[72], which went into production in the mid-2000s, at around the
same time that FBW appeared for business jets like the Falcon
7X. In contrast, fixed wing aircraft with full FBW had appeared
a number of decades prior to this, with the F-16 and A320 being
the first military and civil aircraft to be equipped with fully
digital FBW systems.
Table 2. Early aircraft equipped with fully digital FBW systems

Type                        Aircraft                         Year of Introduction
Fighter jets                F-16                             1978
                            Su-27                            1985
Civil fixed-wing aircraft   Concorde (hybrid analogue-FBW)   1976
                            A320                             1988
Business jets               Falcon 7X                        2005
Rotorcraft                  NH90                             2006
                            UH-60MU                          2011
Rotorcraft human factors research is primarily concerned
with introducing higher levels of safety through increased tactical
situational awareness. A study by the NLR identified 145
technologies for mitigating helicopter accidents, with the top 10
technologies being [73]:
Digital range image algorithms for visual guidance in low-
level flight;
EGPWS/TAWS;
Passive tower-based Obstacle Collision Avoidance Systems
(OCAS) comprising ground-based installations near power
lines that provide warnings to helicopters flying in its
vicinity [74];
New terrain following guidance algorithms for rotorcraft
displays (including SVS and EVS concepts);
LIDAR obstacle and terrain avoidance systems [75];
Predictive ground collision avoidance using digital terrain
referenced navigation (i.e., the flight path);
Full Authority Digital Engine Control (FADEC);
Engine backup systems;
Practical regime prediction approach for Health Usage
Monitoring System (HUMS) applications;
Helicopter pilot assistant systems featuring flight planning
and 4D-trajectory optimisation [76].
Fig. 21. Historical evolution of some military rotorcraft (H-1 variants, AH-64 Apache, H-60 Black Hawk, OH-58 Kiowa, V-22 Osprey and H-47 Chinook), with the dates when the different variants were first introduced into service and their transition from analogue to glass cockpits.
Fig. 22. Cockpits of the UH-1B (left) and UH-1Y (right; image courtesy of Northrop Grumman Corporation).
Degraded Visual Environments
Operations in Degraded Visual Environments (DVE) are a
significant contributing factor to rotary-wing accidents and also
lead to reduced operational effectiveness in missions involving
search and rescue, flight to remote locations and landing on
undeveloped sites. DVE operations encompass a range of
conditions, including:
Inadvertent entry into Instrument Meteorological
Conditions (IIMC), which refers to a situation where a pilot
originally planning to fly under Visual Flight Rules (VFR)
is prevented from doing so by deteriorating visibility caused by weather (e.g., heavy rain, cloud, fog, darkness, etc.).
IIMC may lead to spatial disorientation and a loss of visual
contact with the ground.
Whiteout and brownout, which are usually associated with
military operations and refer to the loss of visibility close to
the ground when the rotor wash blows up either snow or
dust. The sudden loss of visual cues might cause the loss of
situational awareness, leading to a misjudgement of the
appropriate landing manoeuvre.
Flat light conditions typically occur over uniform
landscapes such as calm water, desert or snowy terrain and
are caused by the diffusion of light by overhead clouds,
creating a shadowless surface environment. Flat light makes
it difficult for pilots to perceive depth, distance, altitude or
motion and can give the impression of ascending or
descending when actually flying level.
Traditionally, Night Vision Imaging Systems (NVIS) have
been extensively employed in rotorcraft operations allowing for
operations in low-light environments [77]. However, a number of
programs have recently emerged that are targeting
EVS/SVS/CVS solutions (e.g., Airbus’s Sferion system [78, 79],
BAE System’s BLAST technology [80], Defence Research and
Canada’s DVEST program [81], NATO [82]) for
mitigating/avoiding accidents in DVE conditions. The systems
comprise four main components:
Sensors such as Millimetre Wave (MMW) radar, LIDAR,
thermal and low-light cameras for detecting and
recognising surface features and texture, slope and
obstacles;
Software which processes the sensed data into a readily interpretable picture. This can include classification algorithms which allow the sensed objects to be visualised as conformal symbols (i.e., symbols which appear to overlie the objects they represent; see the projection sketch after this list);
Displays, including helmet-mounted, head-up and head-down displays, which depict the information using relevant 2D/3D symbology;
Automatic flight control systems with advanced flight
control laws utilising sensor data.
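The conformal rendering step mentioned above can be sketched with a basic pinhole projection from a local navigation frame to display coordinates (level flight and a boresight-aligned camera are assumed; real systems must account for full aircraft attitude, sensor alignment and latency):

```python
import numpy as np

def project_to_display(obstacle_ned: np.ndarray, aircraft_ned: np.ndarray,
                       heading_rad: float, f_px: float = 800.0,
                       centre=(512, 384)):
    """Project an obstacle position (North-East-Down, metres) onto a
    display so the symbol appears to overlie the real object."""
    rel = obstacle_ned - aircraft_ned
    # Rotate NED into a body-like frame (level flight assumed: yaw only)
    c, s = np.cos(heading_rad), np.sin(heading_rad)
    fwd = c * rel[0] + s * rel[1]      # along the boresight
    rgt = -s * rel[0] + c * rel[1]     # to the right
    dwn = rel[2]
    if fwd <= 0:
        return None                    # behind the aircraft: not drawable
    u = centre[0] + f_px * rgt / fwd   # pinhole projection
    v = centre[1] + f_px * dwn / fwd
    return (u, v)

# Pylon 500 m ahead, 50 m right, 20 m below the aircraft boresight
print(project_to_display(np.array([500.0, 50.0, 20.0]),
                         np.array([0.0, 0.0, 0.0]), heading_rad=0.0))
```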
Terrain Awareness and Warning Systems
Helicopter TAWS (HTAWS) and TCAS systems are similar
to the GPWS and TCAS systems installed on fixed wing aircraft,
providing relevant cautions, warnings and advisories. However,
due to helicopter flight specificities (which are in many ways
similar to those of RPA platforms), the terrain, obstacle and
traffic warning systems used by rotorcraft typically introduce
some modifications to the alerting logic, while relying to a greater
extent on an extensive terrain/obstacle database and exploiting
on-board sensor systems. The flight specificities can include:
Operating envelopes which are different to commercial
fixed-wing aircraft, but comparable with those of RPA and
light general aviation aircraft;
Increased vertical manoeuvrability, thereby requiring ‘look-
down’ capabilities in addition to the ‘look-ahead’ modes
employed on existing systems;
Use of additional sensors for surveillance purposes (e.g.,
MMW radar, LIDAR, FLIR, etc.);
Operations involving low level navigation and terrain
following;
Operations in environments with a variety of natural and/or
man-made obstacles;
Operations in areas with high traffic densities, which might
saturate sensor capacity;
Operations involving mobile or transient obstacles (such as,
in the case of offshore operations, large ships, construction
barges or installations).
HTAWS devices integrate terrain and obstacle databases with
position information from GNSS and other sensors to generate
display information, aural and visual alerts. The HTAWS
equipment may be a stand-alone system, or may be interfaced
with the weather radar, navigation displays, or other display
systems. An example of an obstacle warning system is the Lidar Obstacle Warning and Avoidance System (LOWAS) [75], as depicted in Fig.
23, which can be used for sense-and-avoid applications by both
rotorcraft and small-to-medium sized RPA platforms. According
to RTCA DO-309 [83], HTAWS is expected to provide cautions
and warnings for a number of conditions [84]:
Excessive descent rate;
Excessive terrain closure rate;
Excessive altitude loss after take-off or go-around;
Unsafe terrain clearance while not in landing configuration;
Excessive downward deviation below the instrument glide
path;
Checks on pressure altitude, excessive banking and pitch
angles;
Vortex ring state, which occurs when the rotorcraft descends into the downwash of its own main rotor.
A recent report on the use of HTAWS proposed a number of
modifications to current EGPWS systems to make them more
suitable for offshore environments [85]:
EGPWS Modes 1 and 4 envelopes to be modified such that
alerts are triggered at higher altitudes;
EGPWS Mode 3 envelope to monitor for loss of airspeed
rather than loss of altitude after take-off;
A new envelope to be introduced which monitors airspeed
vs. torque. At low speeds (usually < 80 kts), due to the
increased effects of induced drag, the required torque
typically increases with decreasing airspeed.
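Two of these envelope checks can be sketched as follows (illustrative thresholds only, not those of DO-309 or any certified EGPWS): a look-down excessive descent rate alert and the proposed low-airspeed torque monitor:

```python
def excessive_descent_alert(height_agl_ft: float, descent_fpm: float) -> str:
    """Mode-1-style check: the allowable sink rate shrinks as the ground
    nears. The linear envelope below is illustrative only."""
    limit_fpm = 500.0 + 1.5 * height_agl_ft   # e.g., 2000 fpm at 1000 ft AGL
    if descent_fpm > 1.2 * limit_fpm:
        return "WARNING: PULL UP"
    if descent_fpm > limit_fpm:
        return "CAUTION: SINK RATE"
    return "CLEAR"

def low_airspeed_torque_alert(airspeed_kts: float, torque_pct: float) -> str:
    """Offshore-profile check: below ~80 kts, the torque required rises as
    airspeed falls (induced drag); flag torque inconsistent with airspeed."""
    if airspeed_kts >= 80.0:
        return "CLEAR"
    expected_torque = 60.0 + (80.0 - airspeed_kts) * 0.5   # illustrative
    return "CAUTION: CHECK TORQUE" if torque_pct < expected_torque else "CLEAR"

print(excessive_descent_alert(height_agl_ft=200.0, descent_fpm=1500.0))
print(low_airspeed_torque_alert(airspeed_kts=50.0, torque_pct=55.0))
```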
Fig. 23. HMI formats for the Lidar Obstacle Warning and Avoidance System: a) visible image; b) planar display with detected obstacles and the ground profile represented as blue lines; c) enhanced format combining Lidar and FLIR imagery; d) synthetic display format with wire obstacles.
Collision Avoidance
While collision avoidance systems have been a requirement
for fixed wing aircraft since the 1980s, there is currently no
mandate for the installation of such systems on rotorcraft.
However, TCAS is still typically included as optional equipment
on medium and heavy commercial helicopters and certified based
on the TCAS II standard. A number of considerations regarding
the installation of such TCAS systems have been cited in past
studies [86-88]:
TCAS algorithms are not well-suited for the lower
operating airspeeds (< 100 kts) and rate-of-climb of
rotorcraft compared to fixed wing commercial aircraft,
instead being more similar to those of light general aviation aircraft
or RPA;
the lower airspeed of rotorcraft allows a substantially
smaller volume of protection as compared to fixed wing
aircraft, and also implies that most conflicts will result from
overtaking manoeuvres by high speed aircraft, approaching
from the critical quadrant (left/rear), thereby requiring a
protection volume optimised for such conflicts;
rotorcraft typically operate at low altitudes where the
vertical RAs provided by TCAS might not be as effective as
a horizontal RA;
rotorcraft might operate in high traffic density areas, which
might degrade the reliability of TCAS systems (TCAS II
provides reliable surveillance for traffic densities of up to
0.3 aircraft per NM2);
TCAS antennae performance can be susceptible to
interference from the rotorcraft’s main and tail rotors;
TCAS antennae are vertically polarised, which means that
there are blind spots situated directly above and below the
antennae. This presents an issue for rotorcraft, which are
able to manoeuver into these regions, thereby requiring
more accurate coverage of these areas;
using additional on-board sensors (e.g., LIDAR, MMW
radar, visual camera) for navigation and tracking can
enhance the performance of collision avoidance systems.
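The first two considerations follow directly from closing speed. A simple tau-based sketch (illustrative thresholds, not the TCAS II logic) shows how an alert range threshold shrinks for slow rotorcraft relative to fast fixed wing traffic:

```python
def alert_range_nm(own_speed_kts: float, intruder_speed_kts: float,
                   tau_s: float = 25.0, min_range_nm: float = 0.3) -> float:
    """Range at which a head-on threat reaches the tau threshold.

    tau = range / closing speed; values are illustrative, not TCAS II.
    """
    closing_kts = own_speed_kts + intruder_speed_kts   # worst case: head-on
    return max(min_range_nm, closing_kts * tau_s / 3600.0)

# 90-kt helicopter vs 90-kt traffic, compared with two 250-kt aircraft
print(alert_range_nm(90.0, 90.0))    # ~1.25 NM protection radius
print(alert_range_nm(250.0, 250.0))  # ~3.47 NM protection radius
```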
Flight and Mission Management
Following the introduction of glass displays, further avionics
integration in the 1990s supported the development of Mission
Management Systems (MMS) for special operation helicopters
such as the MH-47E and MH-60K [89], with similar systems
emerging for civil rotorcraft applications in the following decade.
Similar to the FMS of fixed wing aircraft, rotorcraft pilots
interact with the MMS through the CDU. Rotorcraft FMS/MMS
are however designed for more tactical applications, providing a
number of unique functionalities in addition to the flight
planning, navigation and guidance functions offered on fixed-
wing FMS, which can include:
Multi-sensor navigation (GPS, INS/GPS, VOR/DME/
TACAN, Doppler, SBAS, FLIR, Radar, etc.);
Centralised management of navigation and communication
radios and integration with tactical communications system;
Up/down link of tactical flight plans with external sources;
Automated flight pattern generation for search and rescue, surveillance and surveying applications, which can include: rising ladder, race track, expanding square, creeping line, parallel track, track crawl, sector search, orbit and border patrol [90, 91] (see the expanding-square sketch after this list);
Additional flight modes including terrain following and
transition to hover;
Centralised control of tactical sensors (e.g., FLIR, weather
radar, SAR, visual camera) and their associated video feeds;
Target acquisition, tracking and prediction;
Integration with existing Health and Usage Monitoring
Systems (HUMS);
Night vision compatible displays.
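Referring to the automated flight pattern item above, the following sketch generates expanding-square search waypoints (flat-earth approximation, hypothetical parameters; operational FMS/MMS pattern generators must account for wind, turn radius and datum geometry):

```python
import math

def expanding_square(start_lat, start_lon, leg_m: float, n_legs: int,
                     first_heading_deg: float = 0.0):
    """Generate waypoints for an expanding square search pattern.

    Leg length grows every two turns; a flat-earth approximation is
    used for brevity (adequate for short search legs only).
    """
    M_PER_DEG = 111_320.0
    lat, lon = start_lat, start_lon
    heading = math.radians(first_heading_deg)
    wps = [(lat, lon)]
    for i in range(n_legs):
        dist = leg_m * (i // 2 + 1)   # leg lengths: 1, 1, 2, 2, 3, 3, ...
        lat += dist * math.cos(heading) / M_PER_DEG
        lon += (dist * math.sin(heading)
                / (M_PER_DEG * math.cos(math.radians(lat))))
        wps.append((lat, lon))
        heading += math.pi / 2        # 90-degree turn after each leg
    return wps

for wp in expanding_square(-37.8, 144.9, leg_m=1000.0, n_legs=6):
    print(f"{wp[0]:.5f}, {wp[1]:.5f}")
```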
Additionally, ground-based mission planning systems, similar to those used by RPAS pilots, provide pre-flight planning, additional decision support and post-flight debriefing functionalities.
Manned/Unmanned Teaming
An emerging research focus area is Manned/Unmanned
Teaming (MUMT) between unmanned or remotely piloted
platforms and manned aircraft. Instead of being controlled from a
ground station, these autonomous platforms would be controlled
directly from the cockpit of a manned platform via tactical data
links. In particular, manned rotorcraft offer a number of
opportunities due to the operational synergies between the two
platforms. Most of the research and development efforts are
directed at military applications, ranging from Intelligence,
Surveillance and Reconnaissance (ISR) to cooperative targeting
and remote strike. A recent review surveyed the major testing and
demonstration programs in this area [92]. The different programs
are presented in Fig. 24 and were successful in demonstrating a
number of technological concepts for supporting higher levels of
RPAS interoperability, in accordance with the levels as described
in the NATO Standard Agreement (STANAG) 4586 [93]:
Level 1 interoperability: Indirect receipt/transmission of
RPA-related payload data;
Level 2 interoperability: Direct receipt of Intelligence,
Surveillance and Reconnaissance (ISR) data, where "direct"
covers reception of the RPA payload data by the remote
control system when it has direct communication with the
RPA;
Level 3 interoperability: Control and monitoring of the
RPA payload in addition to direct receipt of ISR and other
data;
Level 4 interoperability: Control and monitoring of the
RPA, less launch and recovery;
Level 5 interoperability: Control and monitoring of the
RPA, plus launch and recovery.
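These level definitions lend themselves to a simple authorisation check. The sketch below (a hypothetical structure, not the STANAG 4586 message set) gates commands issued from a manned cockpit according to the interoperability level supported by the data link:

```python
from enum import IntEnum

class LOI(IntEnum):
    """STANAG 4586 levels of interoperability (summarised above)."""
    INDIRECT_DATA = 1
    DIRECT_ISR = 2
    PAYLOAD_CONTROL = 3
    FLIGHT_CONTROL = 4
    FULL_CONTROL = 5

# Minimum level required for each command category (illustrative mapping)
REQUIRED_LOI = {
    "receive_payload_data": LOI.INDIRECT_DATA,
    "receive_isr_direct": LOI.DIRECT_ISR,
    "steer_sensor": LOI.PAYLOAD_CONTROL,
    "set_waypoint": LOI.FLIGHT_CONTROL,
    "launch_recover": LOI.FULL_CONTROL,
}

def authorised(command: str, link_level: LOI) -> bool:
    """A command is permitted only if the link supports its minimum LOI."""
    return link_level >= REQUIRED_LOI[command]

print(authorised("steer_sensor", LOI.FLIGHT_CONTROL))    # True at Level 4
print(authorised("launch_recover", LOI.FLIGHT_CONTROL))  # False: needs Level 5
```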
Most of the programs assessed technologies supporting Level
3 and 4 RPA interoperability, which require human operators to
assume some form of control of the RPA platforms while on-
board the rotorcraft. The key enabling technologies supporting
such operations include:
Tactical data links allowing for different degrees of
communication between manned and unmanned platforms;
Software architectures supporting interoperability between
heterogeneous platforms or coalition groups;
On-board functionalities including data fusion, sense-and-
avoid, as well as autonomous tactical decision making,
planning and replanning;
HMI allowing for more efficient command and control;
Models of crew behaviour, workload and situation
awareness associated with manned-unmanned operations.
Fig. 24. Key programs investigating MUMT technologies and concepts, 1998 to 2018, including SEC, FCS, AMUST, HSKT, UCAR, MCAP, AVUTI, Empire Challenge, SU AV (E), Apache Block III, MUSIC, ScanEagle, MUM-TK and MUMO, together with the STANAG 4586 interoperability level demonstrated by each; based on the review from [92].
2.8. RPAS Ground Segment
In general, RPAS possess increased automation and
autonomy, which compensate for the drawbacks associated with
RPAS operators being physically removed from the flight deck.
RPAS are designed with higher levels of autonomy and decision
authority (e.g., in the Guidance, Navigation and Control (GNC)
loop [94]) relative to their manned counterparts. The forms of
RPA control and coordination can range from remote control in
the Visual Line of Sight (VLOS), to managing and coordinating
multiple platforms via Beyond Line of Sight (BLOS) data links.
There are different sets of challenges associated with the design
of RPAS HMI², especially for more complex operations requiring
human operators to work collaboratively with other human and
autonomous agents. Some HFE considerations include [95-98]:
Lack of physical sensory (vestibular, haptic, auditory) cues;
Data link latency and performance;
Detection and recovery from abnormal situations;
‘Out-of-the-loop’ effects and loss of situational