Proceedings of the 29th European Safety and Reliability Conference.
Edited by Michael Beer and Enrico Zio
Copyright ©2019 by ESREL2019 Organizers. Published by Research Publishing, Singapore
ISBN: 981-973-0000-00-0 :: doi: 10.3850/981-973-0000-00-0 esrel2019-paper
Lessons learned from increased automation in aviation: The paradox related to the
high degree of safety and implications for future research
T.E. Evjemo, S.O. Johnsen,
SINTEF, Trondheim, Norway. E-mail: Torerik.evjemo@sintef.no
For transportation systems to meet future environmental, productivity and safety requirements new technology is
imperative. Technologies that involve automated and autonomous transport solutions are key drivers. Within the
transport sector, autonomous transport at sea, on road and on rail has already made its entry. However, in research on
autonomous transport systems, traditional aviation has become somewhat "forgotten". Focus is often to describe
fully autonomous systems, while important transition phase(s) where more and more automated functions are
introduced receive less attention. Aviation holds a lot of experience that can benefit future autonomous transport
systems across domains especially considering the high safety level in aviation. Technological developments have
reduced the accident rate substantially in aviation; however, a new type of aviation accident associated with loss of
control is seen as a result of increasing degrees of automation. This paper presents a systematic literature review to
identify safety experiences relating to aviation automation focusing particularly on human-automation interaction.
The association between increasing degrees of automation and implications for safety is composite - however,
increased aviation safety is seen in connection with reduction of human error, increased decision support and
simplified communications. Challenges associated with safety often concern issues of human-automation
interaction, seen in the Air France 447 and Boeing 737 MAX accidents where automation came into conflict with
the pilots' actions. One can thus talk about a paradox associated with automation, a safety paradox informing us that
more research is needed on the role of the human operator when automation increases, which can also support the
actual design of future human-automation systems beyond aviation per se.
Keywords: Transport, safety, aviation, technology, automation, Human Factors
1. Introduction
1.1 The high safety level in aviation
Aviation is an international industry influenced by
various global trends and technological
development. At the same time, it is a national and
political goal in Norwegian transport policy to
provide safe, efficient, accessible and
environmentally friendly transport systems -
systems that must be safe in the sense of avoiding
injuries and fatalities (The Government,
2017). In the field of aviation, the Norwegian civil
aviation authority (CAA) points out that targets
related to aviation safety in the coming years will
involve maintaining the current positive safety
statistics based on forecasts showing increasing
air traffic (Civil Aviation Authority, 2016).
When looking at developments in the accident
statistics in aviation after World War II, there is a
significant safety improvement from the early
sixties onwards (ICAO, 1998), partly explained
by the introduction of new technology including
aviation automation. However, the accident rate
increased again in the seventies, which
was attributed to "loss of situational awareness",
while accidents in the last decade are often seen
in conjunction with "loss of control" (Chialastri,
2012). Loss of control is seen in conjunction with
automation and how pilots are unable to regain
control when automation fails - for example,
because they are incapable of monitoring
automated flight systems, combined with a
subsequent lack of manual
flying skills. According to IATA (2015), aviation
accidents are generally declining, and Fiorino
(2008) shows that technological developments
have reduced the rate of fatal accidents to 0.01
per 1 million flight hours. However, this does not
apply to accidents associated with loss of control
- often seen as a result of increased degrees of
automation. Consequently, the idea of human-
centred design for the end user becomes important,
meaning that pilots are involved in the early
stages of design processes, acknowledging the
relevance of, for example, understanding pilots'
mental patterns when working with automation
technology.
1.2 The history of aviation automation
Automation by itself is nothing new in today's
aviation industry. Various degrees of automation
have followed aviation all the way from its timid
beginnings over a hundred years ago: "... Even
before the first powered flight in 1903, aircraft
designers had recognized the instability of their
machines and had begun to work providing pilots
with assistance in controlling the vehicles"
(Billings, 1991:8). For example, Orville Wright
received the Collier Trophy in 1913 when he
demonstrated hands-off flight by way of an
automatic stabilizer, and in the 1930s autopilots
were essential for long-distance flying. The
introduction of retractable undercarriage also led
to demands for a warning system.
According to Orlady and Orlady (1999) one
major reason for introducing automation within
aircraft cockpits was the desire to reduce the
pilots' workload during flights and reduce the
need for intervention by pilots. Thus, automation
is designed to take over all or some of the pilots'
subtasks (Orlady & Orlady, 1999). Chialastri
(2012) notes two main reasons for adopting
automation in aircraft: the desire to remove
human error, and financial gains. Chialastri
(2012) identifies three main types of aircraft
automation systems: mechanical, electrical and
electronic. The latter refers to the electronic
revolution of the mid-80s, when electronic LCD
monitors replaced mechanical round-dial gauges
enabling a more analytic view of multiple
parameters within a limited field of vision.
Examples of system automation in modern
aircraft are autopilots (auto-land), auto-throttle
(controlling engine parameters during different
phases of flight) and Flight Management Systems
(FMS) holding navigational routes and aircraft
performance capabilities.
However, with increased automation there
are also concerns related to pilots being "out of the
loop" due to complex automation systems.
Chialastri (2012) refers to the Airbus A320
having around 190 computers located around the
fuselage interacting in ways the pilots are
unaware of. "... On a modern aircraft, the
electronic echo-system is mostly not known by its
users" (Chialastri, 2012:87). Thus, some aspects
of automation can lead to increased system
complexity. Consequently, aviation possesses a
lot of competence that can be of benefit to future
transport systems also beyond aviation.
1.3 Automation and autonomy
For transportation systems to meet future
environmental, productivity and safety
requirements new technology is imperative.
Technologies that involve automated and
autonomous transport solutions are therefore key
drivers. Within the transport sector, autonomous
transport at sea, on road and on rail has already
begun to make its entry, such as the Copenhagen
metro system. In the maritime sector, autonomous
vessels such as self-propelled sea-going robots
are already in use. In selected cities, one has started
testing self-driven buses while Google has been
on the road for several years in the US with its
self-driven cars. Traditional aviation thus lags
behind the other transport sectors when it comes
to autonomous transport systems. Developments
within drones, however, have gone further than
traditional aviation, as illustrated by Amazon's
plans to transport goods using remote-controlled
and unmanned drones. Furthermore, research on
autonomous transport systems often involves
describing fully autonomous systems, while
important transition phases in which more and
more automated functions are introduced receive
less focus.
The terms automation and autonomy are often
overlapping when describing systems involving
human operators, which can be confusing.
However, taxonomies are in use within the
various transport sectors to visualize different
phases where automated functions take over tasks
previously performed by humans. For example,
within rail (metro), the International Association
of Public Transport (UITP) has published a
taxonomy involving degrees of automation going
from one to four, where GoA 1 (Grade of
Automation) involves Automatic Train Protection
(ATP): tasks related to starting and stopping the
train, as well as handling operational disturbances,
are all performed by train drivers. GoA 4 means that the
above tasks are fully automated without a driver
on board the train.
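The GoA scale above can be captured in a few lines of code. The descriptions of GoA 2 and GoA 3 below are not from the text but follow the standard UITP scheme, and the helper function is a simplification introduced purely for illustration:

```python
from enum import IntEnum

class GoA(IntEnum):
    """Grades of Automation for metro rail (UITP taxonomy)."""
    GOA1 = 1  # ATP in place; driver starts/stops the train, handles disturbances
    GOA2 = 2  # semi-automated operation; driver supervises from the cab
    GOA3 = 3  # driverless; an attendant on board for emergencies
    GOA4 = 4  # unattended: all the above tasks fully automated, no staff on board

def driver_required(goa: GoA) -> bool:
    """Simplification: a driver sits in the cab only below GoA 3."""
    return goa < GoA.GOA3
```

Because `IntEnum` members compare as integers, the grades order naturally along the axis from human-performed to fully automated operation.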
The National Academy of Sciences (NAS) has
looked at autonomy versus automation (NAS,
2014) for aviation. An autonomous aircraft
implies no pilots in the cockpit. "... A fully
autonomous aircraft would not require a pilot; it
would be able to operate independently within
civil airspace, interacting with air traffic
controllers and other pilots just as if a human
pilot were on board and in command" (NAS,
2014:2). NAS argues for an approach based on
increasingly autonomous (IA) systems, which
means that different systems are placed along an
axis based on the degree of maturity of their
autonomous properties, while distinguishing
advanced automation from advanced autonomy
according to whether systems are designed to
respond to situations beyond the normal. Parasuraman,
Sheridan and Wickens (2000) explicitly address
levels of automation relating to whether final
decision authority is given to a human operator or
the technology. At level ten all decisions are
made by the technology, while at level one
the human holds decision authority. This paper focuses
foremost on automation, i.e., a system or aircraft
that operates in conjunction with its environment
with the support of various degrees of automated
functions. This implies a specific focus on human-
automation interaction.
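The decision-authority endpoints of the Parasuraman, Sheridan and Wickens scale can be sketched minimally. Only levels one and ten are characterized in the text, so the wording used for the intermediate levels here is an illustrative assumption:

```python
def decision_authority(level: int) -> str:
    """Who holds final decision authority at a given level of automation (1-10)?

    Endpoints follow Parasuraman, Sheridan and Wickens (2000) as summarized
    in the text; the intermediate description is an illustrative assumption.
    """
    if not 1 <= level <= 10:
        raise ValueError("level must be between 1 and 10")
    if level == 1:
        return "human: the operator makes all decisions"
    if level == 10:
        return "technology: the system makes all decisions"
    return "shared: authority shifts toward the technology as the level rises"
```

A design question for human-automation interaction is then, in effect, where on this axis each aircraft function should sit.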
The paper is based on a research project that
focuses on safety and future autonomy in
Norwegian aviation (Evjemo, 2018). The project
aims to generate research-based knowledge on
organizational and human (in the loop) challenges
and opportunities related to increasing levels of
automation. Specifically, this paper aims to
identify safety experiences relating to aviation
automation. The paper has the following
delineation: 1. Identify safety experiences related
to human-automation interactions. 2. Propose
recommendations for further research based on
the identified safety experiences.
2. Methods
This paper presents findings from a systematic
literature search in the database Scopus. Search
was performed in several rounds with different
search combinations, in January 2018. A search
on for example "aviation and autonomy" resulted
in 159 hits. A search was also performed with
"aviation safety and organization", which resulted
in 1214 hits. A further search on "aviation and
automation" gave 1497 hits, while "aviation
safety and automation" gave 377 hits. By omitting
publications before 2000, 328 remained, which is
the paper's primary data source. All the articles
have been reviewed and coded based on their
summaries; in some cases full texts were reviewed
in detail. The
purpose of coding was to systematize the data
material to get an overview of relevant themes and
aspects relevant to the research questions.
Table 1. Overview of coding distribution

Domain: Aircraft/operator (106); Service provider (106); Regulator (10); Drones/other (93)
Cause/effect: Cause of (107); Effect of (174); Other (63)
Focus: Organizational factors (137); Technology/design (227); Regulative/framework (26); Other (26)
Research design: Ascertaining (4); Evaluating (110); Constructive (153); Other (22)
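A publication can receive more than one code per category, so the counts within a category can exceed the number of publications. The mechanics of such a multi-label tally can be sketched as follows; the publication keys and code assignments are a hypothetical miniature, not the actual dataset:

```python
from collections import Counter

# Hypothetical miniature of the coding workflow behind Table 1:
# each publication carries one or more codes within the "Focus" category.
focus_codes = {
    "pub_a": ["organizational", "technology/design", "regulative/framework"],
    "pub_b": ["technology/design"],
    "pub_c": ["technology/design"],
}

# Flatten all code lists and count occurrences per code.
focus_counts = Counter(code for codes in focus_codes.values() for code in codes)
```

Here `focus_counts["technology/design"]` is 3 even though the category totals come from only three publications, mirroring how a single study such as Chialastri and Pozzi (2008) can contribute to several codes at once.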
The domain category aircraft/operator involves
publications dealing with aircraft and pilots in the
cockpit, or aircraft operators. A service provider
primarily involves the air traffic service, while
regulator refers to the framework conditions for
the aviation industry. Among drones/other, we
primarily find different types of unmanned
aircraft, as well as publications that cannot be
categorized within aviation - for example, a
publication that mainly focuses on health services
but appears among the Scopus hits because
aviation is mentioned in the abstract. The
category cause/effect refers to the extent to which
the publication deals with causes or drivers for
automation/autonomy, or effects. An example of
driver for automation is Ahlstrom and Jaggard
(2010) and the need for automation of weather
information for air traffic controllers. The focus
category describes the publication's main theme,
for example, Gilbert and Bruno (2009) is coded
within technology/design because it is about the
introduction of the navigation system ADS-B.
The categorization research design describes the
type of research question and is based on
Kalleberg, Malnes and Engelstad (2009) where
ascertaining means a descriptive approach, which
means that the publication tries to document
reality through explaining the similarities and
differences of what one is studying. An evaluative
publication means that normative argumentation
is used to assess whether something is good or
bad, while a constructive publication explains
what to do to change or improve what one is
studying. An example of constructive coding is
Landry and Lagu (2009), who developed a
predictive model to ensure separation in the
airspace, arguing that such a model
does not exist today. A publication can be coded
more than once, such as the study of Chialastri
and Pozzi (2008), which looked at structural
aspects related to organizational, technological
and regulatory issues. Of a total of 328 hits in
Scopus, 52 publications were not coded or were
omitted due to the following conditions (number in
parentheses):
- Abstract not available (4)
- Incomplete abstract (4)
- Unclear title/book contribution (3)
- Author(s) not provided (26)
- Coded but theme not relevant (15)
The starting point for the literature search is a
set of specific keywords, which delimits the
possible hits. This means that other keywords
would have produced different results, including
a different selection of literature presented in the paper. The
coding of the data material also entails an element
of interpretation, while the literature presented is
selected for the purpose of illustrating the main
features of the research literature. The first part of
the results section illustrates how the relationship
between automation and safety is composite.
Studies are then explored where automation is
seen in the context of increased safety - after
which automation is linked to safety challenges.
Finally, the automation paradox is described
related to safety, including a discussion on the
need to strengthen the interactions involving
humans and automation.
3. Results and discussion: technology
mediated safety as a composite
phenomenon
The research literature related to the
connection(s) between automation and safety is
characterized by variation and the fact that this
involves complex issues. Dalcher (2007) shows
how an increased degree of automation, meant as a
safety barrier, actually contributed to an accident -
automated protection made it impossible for the
pilot in a fighter plane to regain control of the
aircraft. Dalcher (2007) points out how important
it is for developers of new technology to
understand situational contexts prior to depriving
pilots of control. Janakiraman, Matthews, and
Oza (2017) describe how technology can lead to
increased safety but also lower safety by referring
to accidents with loss of control and subsequent
stalling. During departure, an aircraft's speed
normally increases - however, if the speed is
reduced, the situation quickly becomes dangerous
if corrective measures are not implemented
immediately. During departure, the workload is
high for the pilots, which is related to maintaining
control of the aircraft, continuously monitoring
systems, and managing communication with the
air traffic service (ATS). Janakiraman et al.
(2017) outline a decision support tool to
proactively identify and manage risk situations
related to loss of speed during departure. This
illustrates a paradox with automated systems. On
the one hand, one sees some types of errors
associated with increasing degree of automation -
at the same time increasing degrees of automation
can in some cases also prevent similar or other
types of errors.
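The kind of decision support Janakiraman et al. (2017) describe can be hinted at with a deliberately simplified, rule-based sketch. Their actual tool is a far more sophisticated data-driven model; the fixed threshold and function below are illustrative assumptions only:

```python
def speed_loss_alert(airspeeds_kts, climbing=True, drop_threshold_kts=5.0):
    """Flag a potentially dangerous speed loss during departure.

    Simplified rule (not the actual tool of Janakiraman et al., 2017):
    during a climb the airspeed should not fall; alert when it drops by
    more than `drop_threshold_kts` from its running maximum.
    Returns the index of the first breaching sample, or None.
    """
    peak = float("-inf")
    for t, v in enumerate(airspeeds_kts):
        peak = max(peak, v)
        if climbing and peak - v > drop_threshold_kts:
            return t
    return None

print(speed_loss_alert([150, 155, 160, 158, 152]))  # -> 4 (dropped 8 kts from peak)
```

Even this toy rule illustrates the paradox discussed above: the alerting logic is itself a piece of automation added to catch failures that can be aggravated by automation.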
Oliver, Calvard and Potočnik (2017) point out
that automation has led to overall increased safety
in the aviation industry; at the same time,
automation has led to new forms of accidents,
mainly due to loss of control over the aircraft
even though there were initially no technical
faults with the aircraft. Similarly, Borst, Mulder and
Van Paassen (2010) point out that an increasing
degree of automation has resulted in less
workload in general for pilots and increased
benefits related to the technical execution of
flights - while also indicating that automation can
have a downside in the form of specific challenges
related to pilots' situational awareness including
system understanding (Borst et al., 2010). Worth
noting is the article by Millot and Boy (2012)
arguing that there are two ways to view the human
role in relation to technology - either as a resource
or as a burden by showing how humans are
involved in control and handling within the air
traffic service: positively, through the human
ability to handle unexpected events - or
negatively, through the inherent human propensity
to act incorrectly (Millot and Boy, 2012).
3.1 Automation leading to increased safety
The research literature describes several examples
where automation involves improving aviation
safety. Dongmo (2015) shows how automation
can prevent human error via advanced
automation assisting pilots or acting in parallel
with the pilots' control authority through the
possibility of overriding if necessary. Dongmo
(2015) describes a technical system for automatic
recovery when pilots experience loss of control
during flight. Furthermore, Khedkar and Kumar
(2011) emphasize an air traffic controller's
primary area of responsibility with regard to
optimizing the use of airspace while at the same
time preventing collision between aircraft.
Human performance in today's air traffic service
means that the system does not work optimally -
the point is, according to Khedkar and Kumar
(2011), to minimize human intervention through
increased automation. Schutte (2017) argues that
the next step in improving aviation safety is about
re-evaluating the distribution of functions
between pilot and automation. According to
Schutte (2017), today's distribution is not
necessarily optimal - to improve the performance
of the pilots, this is primarily about defining the
pilot's role - and also asking why one needs a pilot
in the first place. The argument is about defining
functions based on the needs of the role of pilot,
then using automation to support human
limitations related to the execution of the pilot
role.
Furthermore, the research literature also shows
that automation can provide decision support,
which can have a positive impact on aviation
safety. Atkins (2010) argues that automation
could have helped the pilots on US Airways flight
1549 make a quicker decision about possible landing
sites after the collision with birds immediately
after departure. However, Atkins (2010) points
out the pilots' skill that undoubtedly contributed
to a successful emergency landing on the Hudson
River. The article describes the use of an adaptive
and automated aircraft planner applied to the Hudson
incident, where the Airbus A320 aircraft lost all
engine power after collision with a flock of birds.
Atkins (2010) demonstrates how such an
automated technology would have enabled pilots
to return safely to LaGuardia by the technology
being able to identify various emergency landing
sites for different time intervals after the engines
stopped. The conclusion was that LaGuardia was
within reach if the pilots changed course within
16 seconds after the collision with the birds.
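Atkins' reachability argument - that each second of deliberation shrinks the set of feasible landing sites - can be conveyed with a toy model. Everything here (site distances, glide ratio, sink rate, the function itself) is an illustrative assumption, not Atkins' actual planner or the A320's real performance:

```python
def reachable_sites(sites, altitude_ft, glide_ratio, decision_delay_s, sink_rate_fps):
    """Back-of-the-envelope reachability check for emergency landing sites.

    Not Atkins' (2010) planner - only a toy model of its core idea:
    altitude lost while the crew deliberates shrinks the glide footprint,
    so each second of delay removes options. Sites are (name, distance_ft).
    """
    usable_alt_ft = altitude_ft - decision_delay_s * sink_rate_fps
    if usable_alt_ft <= 0:
        return []
    max_range_ft = usable_alt_ft * glide_ratio  # idealized engine-out glide
    return [name for name, dist_ft in sites if dist_ft <= max_range_ft]

# All numbers below are invented for illustration, not flight data.
sites = [("LaGuardia", 40_000), ("Hudson River", 8_000)]
early = reachable_sites(sites, altitude_ft=2_800, glide_ratio=17,
                        decision_delay_s=16, sink_rate_fps=20)
late = reachable_sites(sites, altitude_ft=2_800, glide_ratio=17,
                       decision_delay_s=60, sink_rate_fps=20)
```

With these invented numbers, a 16-second decision delay still leaves the airport reachable, while a 60-second delay leaves only the river - echoing the narrow time window reported by Atkins (2010).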
Diffenderfer, Tao and Payton (2013) explain how
to increase productivity at airports where there is
dependence between departing and arriving
flights without compromising safety. The authors
show how existing procedures often result in
inefficient coordination between arriving and
departing aircraft, which has a negative impact on
airport efficiency. By improving the current static
arrival and departure routes through automated
integration, as well as increasing the
communication between tower service and the
terminal area's radar-based approach control
(TRACON), this can according to Diffenderfer et
al. (2013) increase the flow of aircraft without
compromising on safety.
Automation can also mean simplified
communication between air traffic services and
aircraft. Gineste and Gonzalez (2010) show how
safety is one of the key drivers for automation in
the air traffic service when they describe the
transition from speech to data communication
between air traffic controllers and pilots, while at
the same time pointing out that in addition to
increased safety one will also save other
resources. Glasgow and Niehus (2012) illustrate
how general aviation's use of Iridium satellite-
based communication facilitates increased safety
for general aviation (GA) pilots by introducing
two new capabilities. Firstly, one can send
information from the ground to the aircraft, for
example about adverse weather conditions based on
the flight plan submitted. Secondly, one can
download position reports from the aircraft,
which in a conceived situation can improve the
response if search and rescue is needed. GA pilots
are also in focus in the study of Lancaster and
Casali (2008), arguing that the use of data link
between ground and aircraft can improve the
safety of GA aircraft. Lancaster and Casali (2008)
describe how GA pilots were tested in a simulator
where various combinations and forms of text
versus speech were tried out during various flight
scenarios - including evaluation of time
associated with receiving, understanding, and
executing commands. Lancaster and Casali
(2008) reported statistically significant findings
related to pilot performance, mental workload and
situational understanding between different
combinations of data. It was concluded that data
transfer to pilots in the form of text has
considerable safety potential if at the same time
one supports messages with an automated voice
component.
Experience transfer from aviation to other
industries can also be beneficial for safety.
Petersen, Görges, Ansermino and Dumont (2014)
describe modern medicine and how the use of
anesthetics during operations is not automated.
Safety is dependent on the healthcare
professional's alertness, and Petersen et al. (2014)
draw parallels to aviation, where human error can
also cause serious consequences, and where
automated control regimes have been introduced
to reduce risk and enhance human performance.
The authors show how a fully automated, closed
system for intravenous drug infusion can be
designed technically to control and monitor
multiple simultaneous infusions of drugs on
multiple patients. Also,
Farrugia, Montebello and Abela (2013) describe
how researchers in the road sector have developed
a robot control system inspired by aviation that
automatically intervenes proactively during
driving to avoid various hazards.
3.2 Automation and safety challenges
Oliver et al. (2017) analyse how Air France's
flight from Rio De Janeiro to Paris in 2009
crashed in the Atlantic without survivors. They
recognize that new technology in some cases
means that systems are planned and built in such
a way that operators do not fully understand the
system they are set to work in. It is
pointed out that operations "within" a system's
boundaries may sometimes counteract its purpose
in the sense that operators' ability to challenge and
question established truths may be reduced -
which may adversely affect the understanding of
the situation (situational awareness). Such an
erosion of expertise is only discovered when
operators are exposed to unknown and unexpected
situations.
Oliver et al. (2017) point out that although
glass cockpit technology has contributed to a
decrease in the number of aircraft accidents in
recent decades, they refer to Harris (2011) and
concerns related to the weakening of the pilots'
situational understanding - that the pilots
simply do not understand what the technology is
doing. The following automation aspects might
limit the pilots' experiences of different flight
situations. First, automation limits pilots' ability
to make dangerous maneuvers, which means that
pilots are isolated from the consequences of their
actions. Second, automation reduces pilots'
cognitive load by relieving them of tasks such as
monitoring multiple instruments simultaneously
and manually flying the aircraft. Third, one
might find that automation means that pilots'
expertise diminishes over time. Finally, there is a
risk that automation may cause ambiguous or
inappropriate signals, which in turn may enhance
an unwanted course of action.
AF447 was three and a half hours into its
flight when things began to escalate out of control
inside the cockpit. The Airbus A330 flew on
autopilot at 35,000 feet - because of the flight's
length three pilots were involved, the captain and
two first officers, of whom two stayed in the
cockpit at all times. However, the
captain was out of the cockpit when ice crystals
blocked the aircraft's speed sensors which
resulted in the aircraft's flight computer
automatically disconnecting the autopilot, which
it was designed to do when receiving inconsistent
data from the sensors located outside the aircraft.
When the autopilot disconnects, it is
entirely up to the pilots to maintain control of the
aircraft, which in the case of AF447 meant that
the flight officer with the least experience (pilot
flying) was the one who manually took over the
control. Manually flying a passenger plane at this
altitude is not something pilots usually do or train
on - so the officer's immediate attempt to
correct a slight bank led to repeated
overcompensation on his own sidestick, causing
the plane to roll back and forth
repeatedly. At the same time, possibly without
being aware of it, the flight officer pulled the
sidestick back so that the aircraft's nose pointed
upwards, the height increased while the speed was
reduced. As a result, the aircraft entered what is
called an aerodynamic stall about one minute after
the autopilot was disconnected. AF447 was now
virtually free to fall towards the sea surface.
Ebbatson, Harris, Huddlestone and Sears
(2010) identify similar challenges when studying
pilot performance, specifically manual flight
skills and how this is related to today's widespread
use of cockpit automation and autopilot
programming. According to Ebbatson et al.
(2010) pilots who fly aircraft with a high degree
of automation may find that their own manual
flying skills are reduced as a result of a lack of
opportunities to fly manually during everyday
operations, which is very important to master if
automation fails. De Boer and Dekker (2017)
have studied models related to automation
surprise based on the fact that events of this type
pose a significant threat to aviation safety. De
Boer and Dekker (2017) describe two different
perspectives within the research literature that
each have different approaches to human
cognitive processes - more specifically how the
interaction between man and automation is
understood. As for the implications for improving
safety, the authors argue for an approach to
human-automation interaction based on a
sensemaking perspective, which involves
increased focus on systemic factors including the
complexity of the operative context - rather than
just focusing on suboptimal, individual human
performance.
3.3 Discussion: Automation paradox and the
high degree of aviation safety
Let's return to AF447. It is worth noting that the
excerpt below from the aircraft's voice recorder
does not indicate that the pilots, including the
captain who was called back to the cockpit by the
pilot who did not fly (pilot not flying), understood
that the aircraft had now stalled. The excerpt is
extracted from about one and a half minutes into
the event.
Table 2. "Confusion in the cockpit" (excerpt from Oliver et al., 2017:9)

Flight officer 1: But we've got the engines, what's happening (…)? Do you understand what's happening or not?
Flight officer 2: (…) I don't have control of the airplane any more now. I don't have control of the airplane at all
Flight officer 1: Control's to the left (….) what is that?
Flight officer 2: I have the impression (we have) the speed
Captain: (Noise of cockpit door opening) Er, what are you (doing)?
Flight officer 1: What's happening? I don't know, I don't know what's happening
Flight officer 2: We're losing control of the aeroplane there
Flight officer 1: We lost all control of the aeroplane, we don't understand anything, we've tried everything
Although the aircraft's computer declared "stall"
75 times over the four and a half minutes it took
from when the autopilot disconnected until the
aircraft hit the ocean surface (there was also a lot of
aerodynamic noise due to turbulence around the
wings when the plane lost its height), none of the
pilots did anything to get the plane out of the stall
before the very end of the event.
In an Airbus A330, automation works so that a
stall warning is automatically turned off when the
speed is reduced to below a certain level - to avoid
false alarms. However, the "abnormal" angle of
the aircraft during the uncontrollable descent
caused the indicated airspeed to be lower than that
level, which meant that the aircraft's stall warning
cut out and the pilots were "misinformed" with
respect to the aircraft's real situation.
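The inhibition logic just described can be sketched in a few lines. The thresholds below are illustrative assumptions, not Airbus values:

```python
def stall_warning(indicated_airspeed_kts, angle_of_attack_deg,
                  stall_aoa_deg=10.0, validity_floor_kts=60.0):
    """Sketch of the stall-warning inhibition described above.

    Thresholds are illustrative, not Airbus values: below the validity
    floor the airspeed measurement is treated as invalid, so the warning
    is inhibited to avoid false alarms - even if the angle of attack
    actually indicates a deep stall.
    """
    if indicated_airspeed_kts < validity_floor_kts:
        return False  # warning inhibited: measurement deemed invalid
    return angle_of_attack_deg >= stall_aoa_deg

# Deep in the stall, indicated airspeed sat below the floor: silence.
print(stall_warning(50, 40))  # -> False
# Pushing the nose down raised the airspeed and re-triggered the warning.
print(stall_warning(80, 25))  # -> True
```

The sketch makes the perverse coupling explicit: the corrective input (nose down, speed up) re-armed the very alarm whose silence had seemed to confirm the input was wrong.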
Figure 1. Automation paradox and safety (Evjemo, 2018). (Figure not reproduced; its legible label reads "Normal operations (technology and human in control)".)
The paradox was that when the pilots at brief
moments actually pushed the flight controls
forward (which increased the aircraft's speed),
the stall warning reappeared, which contributed
to more confusion related to the pilots'
situational understanding.
Oliver et al. (2017) point out that under normal
circumstances, the aircraft's flight control system
would make a stall impossible. Thus, features of
the technology (automation) designed to help
pilots under normal conditions now prevented
pilots from understanding the situation. Figure 1
thus illustrates the automation paradox in that
increasing degrees of automation have primarily
led to increased aviation safety. The literature
review, however, has shown that the relationship
between automation and safety is composite in the
same way as the relationship between automation
and autonomy. Increased safety due to increasing
degrees of automation is often seen in the context
of reduction of human error, increased decision
support and simplified communication routines -
and processes. At the same time, one sees
challenges for safety, preferably related to
human-automation interaction, where the AF447
accident and automation that came into conflict
with what the pilots did is an example. The
Boeing 737 MAX accidents also tell us how
important it is for pilots to be informed and trained
on safety-critical systems - while also retaining the authority to override automation, which the MCAS (Maneuvering Characteristics Augmentation System) did not allow.
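The principle of preserving pilot override authority can be sketched as follows. This is a hypothetical illustration of the design principle, not the actual MCAS or any certified trim logic; all names and values are invented:

```python
# Hypothetical sketch (names and values invented for illustration): an
# automated pitch-trim function that always yields to an explicit pilot
# override, in contrast to the original MCAS behaviour discussed above.

def commanded_trim(auto_trim_cmd: float, pilot_trim_input: float,
                   pilot_override: bool) -> float:
    """Pilot input takes precedence whenever an override is asserted."""
    if pilot_override:
        # Human decision-making authority over the automation is preserved.
        return pilot_trim_input
    return auto_trim_cmd

# Automation commands nose-down trim, but the pilot's override wins.
assert commanded_trim(-2.5, 0.4, pilot_override=True) == 0.4
# Without an override, the automated command passes through.
assert commanded_trim(-2.5, 0.4, pilot_override=False) == -2.5
```

The design choice is simply that the human's command is the outer arbiter of the control loop, which is the decision-authority requirement argued for below.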
Moreover, the design of automation can also lead to impairment of cognitive skills, including decision-making when abnormal and surprising situations occur. In order to prevent loss of control as a consequence, one can argue for the importance of securing human decision-making authority over the technology (Evjemo, 2018), which can be challenging as the levels of automation (Parasuraman et al., 2000) increase through design. This concerns the challenge of understanding the often complex conditions in today's automated aircraft systems, and how increased degrees (levels) of automation, initially intended to strengthen safety, also have a potential downside in terms of reduced human performance when it is eventually needed.
3.4 Concluding remarks
Based on the initial literature review, the overall
safety experiences related to increased
automation and human-automation interaction
can be summarized as follows:
Increased automation can prevent human
errors - for example, systems for
automatic recovery when control is lost
during flight
Automated decision support, e.g., prediction of technical status, can increase situational understanding through advance warning of unwanted events
However, automation can cause the
boundaries of a safe system to be
challenged by reinforcing the
organization's limitations, hence the
automation paradox in relation to safety
A high degree of (or complex) automation can mean that operators do not fully understand the systems they are set to work in, which may result in failure to handle unforeseen situations and eventual loss of control
Based on the automation paradox (in relation to safety), and in order to improve the design of automation technology, we argue the need for more human-centered research on the following aspects to mitigate future loss of control. This is also relevant in transportation systems beyond aviation; an example is the road sector's focus on ever more advanced autopilots and the idea of self-driving cars.
More research to understand where the boundaries between human and automation lie, i.e., to understand the preconditions for human decision authority over automation - both human limitations and strengths when interacting with increasing degrees of automation technology
More research-based knowledge on the role of humans/operators in future automated transport systems - to identify what kind of competence is needed
More research is needed to understand
safety-critical challenges that arise when
automated systems counteract each other
Acknowledgement
This paper is based on research funded by
Braathens fund, NHO Aviation and LO Aviation
in Norway.
References
Atkins, E. M. (2010). Emergency landing automation
aids: An evaluation inspired by US Airways Flight
1549. AIAA Infotech at Aerospace 2010.
Ahlstrom, U. & Jaggard, E. (2010). Automatic
identification of risky weather objects in line of
flight (AIRWOLF). Transportation Research Part
C: Emerging Technologies 18(2): 187-192.
Billings, C. E. (1991). Human-Centered Aircraft
Automation: A Concept and Guidelines. NASA
Technical Memorandum 103885.
Borst, C., Mulder, M. & Van Paassen, M. M. (2010). A
review of Cognitive Systems Engineering in
aviation. IFAC Proceedings Volumes (IFAC-
PapersOnline), 11, PART 1.
Chialastri, A. (2012). Automation in aviation. In F. Kongoli (Ed.), Automation. InTech. DOI: 10.5772/49949.
Chialastri, A. & Pozzi, S. (2008). Resilience in the
aviation system. Lecture Notes in Computer Science
(including subseries Lecture Notes in Artificial
Intelligence and Lecture Notes in Bioinformatics),
5219 LNCS, 86-98.
Civil Aviation Authority (2016). Norwegian Air Safety Program (draft).
Dalcher, D. (2007). Why the pilot cannot be blamed: A
cautionary note about excessive reliance on
technology. International Journal of Risk
Assessment and Management 7(3): 350-366.
De Boer, R. & Dekker, S. (2017). Models of Automation
Surprise: Results of a Field Survey in Aviation.
Safety (3) 1-11.
Diffenderfer, P., Tao, Z. & Payton, G. (2013).
Automated integration of arrival/Departure
schedules. Proceedings of the 10th USA/Europe Air
Traffic Management Research and Development
Seminar, ATM 2013.
Dongmo, J.-E. T. (2015). Loss-of-control autonomous
flight recovery regime using feedback linearization
and high order sliding mode control with
exponential observers. AIAA Guidance, Navigation,
and Control Conference 2015, MGNC 2015.
Ebbatson, M., Harris, D., Huddlestone, J. & Sears R.
(2010). The relationship between manual handling
performance and recent flying experience in air
transport pilots. Ergonomics 53(2): 268-277.
Evjemo, T. E. (2018). Sikkerhet og autonomi i norsk luftfart - utfordringer og muligheter. Report 2018:01451. Trondheim: SINTEF Digital.
Farrugia, J. L., Montebello, M. & Abela, J. (2013).
Hazard avoidance control system for a robotic
vehicle. Proceedings of the 5th International
Conference on Agents and Artificial Intelligence,
ICAART, 2: 505-508.
Fiorino, F. (2008). New benefits, new risks. Aviation
Week and Space Technology (New York) 169(14):
49-51.
Gilbert, T. & Bruno, R. (2009). Surveillance and
broadcast services - An effective nationwide
solution. Proceedings of the 2009 Integrated
Communications, Navigation and Surveillance
Conference, ICNS 2009.
Gineste, M. & Gonzalez, H. (2010). On the usage of a
dedicated data link layer and half duplex terminals
in satellite systems for future air traffic
management. 28th AIAA International
Communications Satellite Systems Conference,
ICSSC 2010.
Glasgow, M. J. & Niehus, G. A. (2012). Improving
General Aviation safety using low-cost Iridium
devices. AIAA/IEEE Digital Avionics Systems
Conference Proceedings.
Harris, D. (2011). Human Performance on the Flight
Deck. Farnham: Ashgate Publishing.
IATA. (2015). Loss of Control In-Flight Accident
Analysis Report, 2010-2014.
Janakiraman, V. M., Matthews, B. & Oza, N. (2017).
Finding precursors to anomalous drop in airspeed
during a flight's takeoff. Proceedings of the ACM
SIGKDD International Conference on Knowledge
Discovery and Data Mining.
Kalleberg, R., Malnes, R. & Engelstad, F. (2009).
Samfunnsvitenskapenes oppgaver, arbeidsmåter og
grunnlagsproblemer. Oslo: Gyldendal Akademisk.
Khedkar, A. & Kumar, V. (2011). Self synchronization
based air traffic control and collision avoidance
system. AIAA/IEEE Digital Avionics Systems
Conference Proceedings.
Lancaster, J. A. & Casali, J. G. (2008). Investigating pilot
performance using mixed-modality simulated data
link. Human Factors 50(2): 183-193.
Landry, S. J. & Lagu, A. V. (2009). A model of
integrated operator-system separation assurance
and collision avoidance. Lecture Notes in Computer
Science (including subseries Lecture Notes in
Artificial Intelligence and Lecture Notes in
Bioinformatics), 5620 LNCS, 394-402.
Millot, P. & Boy, G. A. (2012). Human-machine
cooperation: A solution for life-critical Systems?
Work, 41, SUPPL.1, 4552-4559.
National Academy of Sciences. (2014). Autonomy
Research for Civil Aviation: Toward a New Era of
Flight. Committee on Autonomy Research for Civil
Aviation; Aeronautics and Space Engineering
Board; Division on Engineering and Physical
Sciences; National Research Council.
Oliver, N., Calvard, T., & Potočnik, K. (2017).
Cognition, technology and organizational limits:
Lessons from the Air France 447 disaster.
Organization Studies 28(4): 729-743.
Orlady, H.W. & Orlady, L. M. (1999). Human Factors
in Multi-Crew Flight Operations. Aldershot:
Ashgate Publishing Ltd.
Parasuraman, R., Sheridan, T.B., & Wickens, C.D.
(2000). A Model for Types and Levels of Human
Interaction with Automation. IEEE Transactions on
Systems, Man, and Cybernetics-Part A: Systems
and Humans, Vol. 30, No. 3.
Petersen C.L., Görges M., Ansermino J.M., & Dumont
G.A. (2014). A Scheme-Based Closed-Loop
Anesthesia System. ACM International Conference Proceeding Series, 40-49.
Schutte, P.C. (2017). How to make the most of your
human: design considerations for human-machine
interactions. Cognition, Technology and Work, 19
(2-3).
The Government (2017). National Transport Plan 2014-2023.