A Summary of Unmanned
Aircraft Accident/Incident Data:
Human Factors Implications
Kevin W. Williams
Civil Aerospace Medical Institute
Federal Aviation Administration
Oklahoma City, OK 73125
December 2004
Final Report
This document is available to the public through:
The Defense Technical Information Center
Ft. Belvoir, VA 22060
The National Technical Information Service
Springeld, Virginia 22161
Ofce of Aerospace Medicine
Washington, DC 20591
DOT/FAA/AM-04/24
NOTICE
This document is disseminated under the sponsorship of
the U.S. Department of Transportation in the interest of
information exchange. The United States Government
assumes no liability for the contents thereof.
Technical Report Documentation Page

1. Report No.: DOT/FAA/AM-04/24
2. Government Accession No.:
3. Recipient's Catalog No.:
4. Title and Subtitle: A Summary of Unmanned Aircraft Accident/Incident Data: Human Factors Implications
5. Report Date: December 2004
6. Performing Organization Code:
7. Author(s): Williams KW
8. Performing Organization Report No.:
9. Performing Organization Name and Address: FAA Civil Aerospace Medical Institute, P.O. Box 25082, Oklahoma City, OK 73125
10. Work Unit No. (TRAIS):
11. Contract or Grant No.:
12. Sponsoring Agency Name and Address: Office of Aerospace Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591
13. Type of Report and Period Covered:
14. Sponsoring Agency Code:
15. Supplemental Notes: Work was accomplished under approved task AM-HRR-521.
16. Abstract: A review and analysis of unmanned aircraft (UA) accident data was conducted to identify important human factors issues related to their use. UA accident data were collected from the U.S. Army, Navy, and Air Force. Classification of the accident data was a two-step process. In the first step, accidents were classified into the categories of human factors, maintenance, aircraft, and unknown. Accidents could be classified into more than one category. In the second step, those accidents classified as human factors-related were classified according to specific human factors issues of alerts/alarms, display design, procedural error, skill-based error, or other. Classification was based on the stated causal factors in the reports, the opinion of safety center personnel, and personal judgment of the author. The percentage of involvement of human factors issues varied across aircraft from 21% to 68%. For most of the aircraft systems, electromechanical failure was more of a causal factor than human error. One critical finding from an analysis of the data is that each of the fielded systems is very different, leading to different kinds of accidents and different human factors issues. A second finding is that many of the accidents that have occurred could have been anticipated through an analysis of the user interfaces employed and procedures implemented for their use. This paper summarizes the various human factors issues related to the accidents.
17. Key Words: Unmanned Aerial Vehicle, Accidents, Human Factors
18. Distribution Statement: Document is available to the public through the Defense Technical Information Center, Ft. Belvoir, VA 22060; and the National Technical Information Service, Springfield, VA 22161
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 17
22. Price:

Form DOT F 1700.7 (8-72) Reproduction of completed page authorized
A SUMMARY OF UNMANNED AIRCRAFT ACCIDENT/INCIDENT DATA:
HUMAN FACTORS IMPLICATIONS
INTRODUCTION
Available reports regarding unmanned aircraft (UA)
reliability have noted that the accident rate for UA is, in
general, much higher than that of manned aircraft (DoD,
2001; Schaefer, 2003; Tvaryanas, 2004). An understand-
ing of the causal factors associated with these accidents is
important if the goal is to improve the reliability of these
aircraft to a level comparable to manned aircraft.
Human factors are consistently cited as a major cause of manned aircraft accidents. Estimates of the percentage of accidents that implicate human error range from 70% to 80% (Wiegmann & Shappell, 2003). In addition, over the past 40 years, the percentage of accidents attributable to human error has increased relative to those attributable to equipment failures (Shappell & Wiegmann, 2000).
The review and analysis of UA accident data can assist researchers in the identification of important human factors issues related to their use. The most reliable source for UA accident data currently is the military. The military has a relatively long history of UA use and is diligent in accurately recording information pertaining to accidents/incidents. The purpose of this report is to review all currently available information on military UA accidents to determine to what extent human error has contributed to those accidents and to identify specific human factors involved in the accidents.
Nomenclature
Designations for unmanned aircraft are almost as var-
ied as the aircraft themselves. The most common term
for these aircraft is Unmanned Aerial Vehicle (UAV).
They have also been called Uninhabited Aerial Vehicles
(also UAVs), Remotely Operated Vehicles (ROVs), and
Remotely Piloted Vehicles (RPVs). In addition, the mili-
tary has some special categories of unmanned aircraft
that require additional nomenclature. These categories
include Tactical UAVs (TUAV), Combat UAVs (UCAV),
Unmanned Combat Armed Rotorcraft (UCAR), and
“drones.” Some agencies have problems with the use of the term “vehicle” for aircraft. So there also exist
designations like Remotely Operated Aircraft (ROA),
Robotic Aircraft (RA), Remotely Piloted Aircraft (RPA),
and Unmanned Aircraft (UA). The term “Unmanned
Aircraft” (UA) will be used in this paper to designate the
large population of remotely piloted, operated, and/or
monitored aircraft.
Military Accident Classication System
Military accidents are classied based on mon-
etary damage and/or severity of injury to personnel.
All military branches have similar accident classica-
tion schemes. The most severe accident classica-
tion is Class A. Table 1 shows the accident classes
for the Army.
Table 1. Army accident classes (Department of the Army, 1994a).

Class A: An accident in which the resulting total cost of property damage is $1,000,000 or more; an Army aircraft or missile is destroyed, missing, or abandoned; or an injury and/or occupational illness results in a fatality or permanent total disability.

Class B: An accident in which the resulting total cost of property damage is $200,000 or more but less than $1,000,000; an injury and/or occupational illness results in permanent partial disability; or when three or more personnel are hospitalized as inpatients as the result of a single occurrence.

Class C: An accident in which the resulting total cost of property damage is $20,000 or more but less than $200,000; a nonfatal injury that causes any loss of time from work beyond the day or shift on which it occurred; or a nonfatal occupational illness that causes loss of time from work or disability at any time.

Class D: An accident in which the resulting total cost of property damage is $2,000 or more but less than $20,000.
The Air Force and Navy definitions of Class A, B, and C accidents are very similar to the Army definitions. However, both the Navy and Air Force classify their mishaps/accidents into only the three categories of A, B, and C. They do not have a category D.
Data Sources
To collect UA accident data, personnel from the Safety
Centers of the Army, Navy, and Air Force were contacted.
Requests were made for all data related to UA accidents,
mishaps, and incidents from each of the Safety Centers.
Personnel from military research laboratories were also
contacted for information and reports summarizing ac-
cident data. In addition, an Internet search was conducted
to identify and download accident information.
Army Data
Two primary sources of accident information were
collected from the Army. The first source was a report
entitled “The Role of Human Causal Factors in U.S.
Army Unmanned Aerial Vehicle Accidents” (Manning,
Rash, LeDuc, Noback, & McKeon, 2004). The report
was produced by the U.S. Army Aeromedical Research
Laboratory and is a summary of 56 UA accidents that
occurred between January 1995 and February 2003. The
accident data were obtained from the U.S. Army Risk
Management Information System (RMIS), maintained
by the U.S. Army Safety Center (USASC), Fort Rucker,
Alabama. The accidents were summarized using two taxonomies: a modified version of the Human Factors Analysis and Classification System (HFACS; Shappell & Wiegmann, 2000) and the Army accident investigation and reporting taxonomy, DA PAM 385-40 (Department of the Army, 1994b).
The second source of information was a direct query of
the RMIS system. The query examined all UA accidents
contained in the RMIS database that occurred between
January 1980 and June 2004. A total of 74 accidents were identified, the earliest of which occurred on March 2, 1989, and the latest on April 30, 2004.
Navy Data
Information regarding UA accidents for the Navy
was collected from the Naval Safety Center. A summary
of UA mishaps occurring between 1986 and 2002 was
received from the Naval UA Pioneer training command
in Pensacola, Florida, via the Naval Safety Center (Kor-
deen Kor, personal communication). The summary lists
239 mishaps, including the mishap level, date, location,
and a brief description. The brief description, while not providing much detail, allowed the general classification of the mishap, including whether the mishap was or was not related to human factors.
Air Force Data
Air Force accident/mishap information was collected
from the Air Force Judge Advocate General’s Corps
Web site, http://usaf.aib.law.af.mil/. The Web site gives
the executive summaries of Air Force Class A mishaps,
organized by year. Lower-level mishaps were not avail-
able. A total of 15 Class A UA mishaps were retrieved
from the Web site, covering the dates from December
6, 1999, to December 11, 2003. In addition to these
executive summaries, a complete accident investigation
board report of the December 6, 1999, accident was
received electronically from Major Curtis McNeil of the
Judge Advocate General’s Corps ofce. Also, a summary
of Air Force accidents and human factors issues related
to UA was received electronically from Major Anthony
P. Tvaryanas (Tvaryanas, 2004).
Data Reliability Issues
Unfortunately, the data regarding UA accidents are not
usually as detailed as that surrounding manned aircraft.
One reason for this lack of detail is that most UA used
in the military are much less expensive than manned
aircraft and so do not warrant the same level of analysis.
In addition, the military does not release much of the detailed information regarding specific UA accidents to the general public.
There are also problems regarding the classification of UA in the military. The Army, for example, only recently has begun to classify UA as aircraft. Before that, they were classified only as vehicles. Therefore, accidents involving Army UA were treated in the same fashion as ground
Army UA were treated in the same fashion as ground
vehicles. A similar situation existed until recently for
the Navy. In effect, there are really no highly detailed
accident investigations performed for UA accidents, with
the exception of the Air Force. The Air Force, however,
will not release detailed reports to the general public and
puts restrictions on the writing of reports based on such
detailed data. Consequently, much of the reported ac-
cident data collected for this report consists of summaries
of several accidents or simple one-sentence statements
regarding individual accidents.
Classication Procedure
Classication of the accident data was a two-step
process. In the rst step, accidents were classied into
broad categories based on whether it was clear the ac-
cident was related to human factors or was a failure of
an aircraft component. For some aircraft systems, other
categories were included based on information specic to
that aircraft. The category “Aircraft” included problems
associated with the failure of a mechanical or electrical
component of the airframe. An “Unknown” category was
used if there were accidents with insufcient information
2
3
for categorization. A category of “Maintenance” was also
included if there was evidence that an action by mainte-
nance personnel contributed to the accident. An example
would be a failure by a maintenance technician to check
the oil level prior to a ight, leading to an engine failure
during the ight. This category was separated from hu-
man factors because it did not involve a member of the
ight crew and because UA maintenance is an important
topic by itself that should be addressed separately. Note
that accidents could be classied into more than one
category. For example, an accident could be classied
as both “Aircraftand “Human Factors” if a mechani-
cal failure was also accompanied by an inappropriate or
inadequate display indication to the crew.
In the second step, those accidents classied as related
to human factors were classied according to specic
human factors issues that are commonly addressed in
current research. These issues included alerts/alarms,
display design deciencies, procedural errors, and skill-
based errors. Other human factors issues were included
for a particular aircraft if evidence was available from
the reports indicating it was an important factor in the
accident. Classication was based on the stated causal fac-
tors in the reports, the expressed opinion of safety center
personnel, and the personal judgment of the author.
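To make the two-step, multi-label tallying concrete, here is a minimal Python sketch (the data structure and category names are illustrative assumptions, not the author's actual tooling). It also shows why per-category percentages can sum to more than 100%: each accident may carry several labels.

```python
from collections import Counter

# Step 1 categories; an accident may carry more than one label,
# which is why per-category percentages can sum to more than 100%.
STEP1 = {"Human Factors", "Maintenance", "Aircraft", "Unknown"}
# Step 2 issues, applied only to accidents labeled "Human Factors".
STEP2 = {"Alerts/Alarms", "Display Design", "Procedural Error",
         "Skill-Based Error", "Other"}

def tally(accidents):
    """Count step-1 categories and step-2 issues over a list of
    accidents, each a dict with 'categories' and 'issues' sets."""
    cats, issues = Counter(), Counter()
    for acc in accidents:
        cats.update(acc["categories"] & STEP1)
        if "Human Factors" in acc["categories"]:
            issues.update(acc["issues"] & STEP2)
    return cats, issues

# Two toy records: the first is dual-classified, as in the report.
sample = [
    {"categories": {"Aircraft", "Human Factors"},
     "issues": {"Display Design"}},
    {"categories": {"Maintenance"}, "issues": set()},
]
cats, issues = tally(sample)
for cat, k in cats.items():
    print(f"{cat}: {k}/{len(sample)} = {100 * k / len(sample):.0f}%")
```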
RESULTS
Because of the enormous differences in the human interfaces of the various UA systems, it did not make sense to combine all of the accident data into a single analysis. Instead, each system will be looked at individually to see how their design approaches have influenced the types of human errors that have occurred.
There are five primary U.S. military UA currently in service: the Army’s Hunter and Shadow, the Navy’s Pioneer, and the Air Force’s Predator and Global Hawk. Other systems are being developed and have undergone testing, such as the Mariner system for the Coast Guard and Navy, but sufficient accident data do not exist to warrant separate analyses of these airframes. A description of the five UA systems included in this analysis is provided in an effort to further understanding of the accident data associated with each.
U.S. Army
Human Causal Factors Report
Manning et al. (2004) looked at Army UA accident
data using both the HFACS taxonomy and the DA PAM
385-40 analysis. A total of 56 UA accidents were analyzed,
with 18 (32%) identified as involving human error. They
did not perform a separate analysis for the Hunter and Shadow UA; they noted that 17 accidents involved the Hunter and 10 involved the Shadow but did not state how many of those involved human error. In addition, accident data were included in the analysis even if the UA type was identified only as a “drone,” “trainer,” or simply “UAV.”
Regarding the HFACS analysis, the largest percentage
of accidents involving human error (61%) was attributed
to the category called Unsafe Acts. Unsafe Acts is broken
down further into the subcategories of skill-based errors,
decision errors, perceptual errors, and violations. Decision
errors accounted for the highest percentage of human error
accidents (33%), followed by skill-based errors (22%),
perceptual errors (17%), and violations (11%).
The DA PAM 385-40 analysis divides accidents into
the categories of individual failure, leader failure, train-
ing failure, support failure, and standards failure. For
denitions of these categories the reader is referred to
DA PAM 385-40. Sixty-one percent of the human-error
accidents were attributed to individual failures, followed
by standards failures (44%), leader failures (33%), train-
ing failures (22%), and nally support failures (6%). The
percentages do not add to 100% because many of the
accidents fell into more than one category.
One shortcoming of the summary by Manning et al. is that the analysis does not identify specific human factors issues associated with UA accidents. The use of the HFACS taxonomy helped in the identification of the importance of decision-making and the development of crew skills in the prevention of accidents but did not reveal specific design shortcomings of the various systems included in the data. For this reason, a second analysis was conducted of the accidents contained in the RMIS database. This analysis examined 74 UA accidents that occurred between March 2, 1989, and April 30, 2004.
These 74 accidents were separated into those related specifically to the Hunter (32 accidents), those involving the Shadow (24 accidents), and those concerning other types of UA (18 accidents). The Hunter and Shadow accidents were further analyzed to identify how many involved human factors issues and which issues were associated with a specific type of aircraft.
Hunter
The Hunter (see Figure 1) is a twin-engine, short-range (144 nm) tactical aircraft with a payload capacity of 200 pounds and an endurance of up to 12 hours (Manning et al., 2004). The aircraft weighs 1,600 pounds and has a 29-foot wingspan, a ceiling of 15,000 feet, a cruising speed of 100 kts, and a cost of $1.2M (Schaefer, 2003).
The Hunter takes off and lands using an External Pilot
(EP) standing next to the runway in visual contact with the
aircraft, operating a controller that is very similar to ones used
by radio-controlled aircraft hobbyists (see Figure 2).
As shown in Figure 1, takeoffs can be assisted using
a rocket bottle that releases from the aircraft shortly af-
ter takeoff. After takeoff and climb out, control of the
aircraft is transferred to an Internal Pilot (IP), operating
from a Ground Control Station (GCS). The IP controls
the Hunter in a more automated fashion by selecting an
altitude, heading, and airspeed for the aircraft using a set
of knobs located within the GCS. For landing, control
of the aircraft is transferred from the GCS back to an
EP. A hook located below the aircraft is used to snag the
aircraft on a set of arresting cables positioned across the
runway.
Data from the Hunter program indicated that 15 of the
32 accidents (47%) had one or more human factors issues
associated with them. Figure 3 shows the major causal
categories for Hunter accidents. Note that the percentages
add to more than 100% because some of the accidents
were classified into more than one category.
Breaking down the human factors issues further, Table 2 shows the number and percentage of the 15 human factors-related accidents associated with specific human factors issues. Again, percentages exceed 100% because some accidents were classified under more than one issue.
By far the largest human factors issue is the difficulty experienced by EPs during landings, with 47% of the human factors-related Hunter accidents occurring during this phase. An additional 20% of the accidents involved an error by the EP during takeoff. Control difficulties arise at least partly because, when the aircraft is approaching the EP, the control inputs to maneuver the aircraft left and right are opposite what they would be when the aircraft is moving away from the EP. This reversed-control problem is present for any UA operated by an external pilot via visual contact. Other research has also identified this problem as an important human factors issue related to UA (Gawron, 1998).
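As a toy illustration of the reversed-control geometry (not any actual control law), the following Python sketch projects a right-stick input into the external pilot's frame of view; the sign flip at inbound headings is the reversal the accident reports describe.

```python
import math

def apparent_lateral_motion(stick_right: float, heading_deg: float) -> float:
    """Lateral motion of the aircraft as seen by the external pilot for a
    right-stick input. heading_deg is the aircraft's course relative to
    the pilot's line of sight: 0 = flying away, 180 = flying toward.
    Positive = the aircraft appears to move to the pilot's right."""
    # A right-stick input moves the aircraft toward its own right; that
    # direction reverses in the pilot's frame as the aircraft turns back
    # toward the pilot.
    return stick_right * math.cos(math.radians(heading_deg))

print(apparent_lateral_motion(1.0, 0.0))    #  1.0: outbound, drifts right
print(apparent_lateral_motion(1.0, 180.0))  # -1.0: inbound, appears to drift left
```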
Besides EP control problems, other issues represented in the table include pilot-in-command issues, alerts and alarms, display design, and crew procedural error. A pilot-in-command issue is a situation where the authority of the controlling pilot is superseded by other personnel in the area, violating the principle that the pilot of the aircraft has the final decision-making authority during a flight. In contrast, alerts and alarms deal with situations where a non-normal flight condition (e.g., high engine temperature) is not conveyed effectively to the crew. Display design issues typically manifest when not all of the information required for safe flight is conveyed effectively to the crew. In the instance referred to in Table 2, a ground crew member had toggled off the autopilot feature, and no display was available to the flight crew regarding the status of the autopilot. Since the autopilot is typically engaged when the IP is controlling the UA, the pilot failed to notice the status of the autopilot, thus contributing to an accident.
Finally, the crew procedural errors referred to here
involved three occasions where the crew failed to prop-
erly follow established procedures. On one occasion an
improper start-up sequence led to data link interference
from the backup GCS. A second event occurred when
the crew failed to follow standard departure procedures
and the UA impacted a mountain. The third occasion
Figure 1. U.S. Army Hunter (RQ-5) unmanned
aircraft.
Figure 2. Controller for radio-controlled aircraft.
was when an EP failed to complete control box checks
prior to taking control of the UA and did not verify a
box switch that was in the wrong position.
Shadow
Compared with the Hunter, the Shadow 200 (Figure 4) is a smaller (9 ft in length) and lighter (330 lbs) short-range surveillance aircraft, capable of operating at altitudes of 14,000 ft and carrying a payload of up to 60 lbs (Manning et al., 2004). The Shadow 200 has an operational range of 68 nm, a cruising speed of 82 kts, and a cost of $325,000 per aircraft (Schaefer, 2003).
Unlike the Hunter, the Shadow does not use an external
pilot, depending instead on a launcher for takeoffs and
an automated landing system for recovery. The landing
system, called the Tactical Automated Landing System
(TALS), controls the aircraft during approach and land-
ing, usually without intervention from the GCS pilot.
A cable system, similar to the one used for the Hunter,
is used to stop the aircraft after landing. Aircraft control
during ight is accomplished by the GCS pilot through
a computer menu interface that allows selection of al-
titude, heading, and airspeed. It is interesting to note
that during landing, the GCS personnel have no visual
contact with the aircraft, nor do they have any sensor
input from onboard sensors. A command to stop the
aircraft engine is given by the GCS pilot, who must rely
on an external observer to communicate that the plane
has touched down.
The analysis of Shadow accidents shows a different
pattern from that seen with the Hunter. In contrast to
the Hunter, only 5 of the 24 Shadow accidents (21%)
were attributed to human factors issues. Figure 5 shows
the major causal factors for the Shadow accidents.
In addition to the four categories used for the Hunter accidents, a category was added for the Shadow to include failures of the tactical automated landing system. While TALS eliminates the landing accidents potentially attributable to an EP, it is not perfect, as the data show. Use of the launcher eliminated any
EP takeoff errors for these aircraft.
Figure 3. U.S. Army Hunter accident causal factors (number of accidents: Maintenance 4, Human Factors 15, Aircraft 16, Unknown 1).
Table 2. Breakdown of human factors issues for Hunter
accidents.
Issue Number Percent
Pilot-In-Command 1 7%
Alerts and Alarms 2 13%
Display Design 1 7%
External Pilot Landing Error 7 47%
External Pilot Takeoff Error 3 20%
Procedural Error 3 20%
Figure 5. U.S. Army Shadow accident causal factors (number of accidents: Maintenance 2 (8%), Human Factors 5 (21%), Aircraft 10 (42%), TALS 6 (25%), Unknown 4 (17%)).
Figure 4. U.S. Army Shadow (RQ-7) unmanned aircraft.
Breaking down the human factors-related accidents, Table 3 shows the number and percentage of the five accidents related to specific human factors issues. As can be seen from the table, the distribution of issues is evenly divided across pilot-in-command, alerts and alarms, display design, and procedural errors.
For both the Hunter and Shadow, at least one accident involved the transfer of control of the aircraft from one GCS to another during flight, an activity unique to UA. In the case of the Shadow, two aircraft were damaged during a single mission. The first was damaged due to a TALS failure. After the accident, the GCS crew issued a command to the damaged aircraft to kill its engine, but because of damage to the antenna the command was not received. That same GCS was then tasked with controlling a second Shadow that was on an approach. Unfortunately, after taking control of the second Shadow, the aircraft received the “engine kill” command that was still waiting for an acknowledgment from the GCS software, causing the second Shadow to also crash. This accident was classified as both a procedural error (because the crew failed to follow all checklist items prior to the transfer of control of the second aircraft) and a display design problem (because there was not a clear indication to the crew of the status of the “engine kill” command that had been issued).
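The failure mode is easy to reproduce in miniature. The Python sketch below is a hypothetical model of a GCS uplink that retries unacknowledged commands (the real Shadow GCS software is not public); it shows how a stale command can be delivered to a newly linked aircraft, and the kind of guard, flushing pending commands before a control transfer, that the checklist and display fixes point toward.

```python
class GCSUplink:
    """Toy model of a GCS uplink that retries commands until they are
    acknowledged. (Hypothetical logic; the actual Shadow GCS software
    is not public.)"""
    def __init__(self):
        self.pending = []      # commands still awaiting acknowledgment
        self.aircraft = None

    def send(self, command: str) -> None:
        self.pending.append(command)  # retried until an ack arrives

    def link(self, aircraft: str, flush_pending: bool = False) -> None:
        # The guard the mishap suggests: clear stale commands before
        # accepting control of a different aircraft.
        if flush_pending:
            self.pending.clear()
        self.aircraft = aircraft
        for cmd in self.pending:      # retries now reach the NEW aircraft
            print(f"{self.aircraft} receives stale command: {cmd}")

gcs = GCSUplink()
gcs.link("Shadow-1")
gcs.send("ENGINE KILL")   # Shadow-1's antenna is damaged; no ack ever comes
gcs.link("Shadow-2")      # the second aircraft inherits the kill command
```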
While not frequent, such accidents suggest the unique
nature of problems that can arise with UA that would not
be encountered with manned aircraft. As if to emphasize
the point, on a recent Coast Guard operational test of
an Altair UA, a problem was encountered during one of
the ights while control was being transferred from one
operator station to another (Randy Sundberg, personal
communication). The second operator station apparently
had a fuel control switch out of position, so that when
control was passed over, the engine died. Luckily, in this
instance, the aircraft was high enough that the engine
was restarted without incident.
U.S. Navy
Pioneer
The longest serving UA in the military is the Pioneer
(RQ-2), which has been used by the Navy and Marine
Corps since 1985 and has logged over 20,000 hours of
flight time (Schaefer, 2003). The Pioneer (see Figure 6) is a single-engine, propeller-driven aircraft. It is 14 ft long with a wingspan of 17 ft. It weighs 452 lbs and has a payload capacity of 72 lbs. It can fly for 5 hrs without refueling, has a ceiling of 15,000 ft, and a cruising speed
of 80 kts. Each aircraft costs $650,000. Notably, in Sep-
tember 2002, the Navy discontinued operation of the
Pioneer, leaving the Marine Corps as the only operator.
Like the Army’s Hunter UA, the Pioneer requires an EP for takeoff and landing. After takeoff, the aircraft can be controlled from a GCS in one of three modes. In the first mode, the air vehicle is operated autonomously, and the autopilot uses global positioning system (GPS) preprogrammed coordinates to fly it to each waypoint. In the second mode, the IP commands the autopilot by setting knobs (rotary position switches) to command airspeed, altitude, compass heading or roll angle, and the autopilot flies the UA. In the third mode, the IP flies the aircraft using a joystick. The Pioneer can be landed at a runway using arresting cables, but because it is a Navy/Marine-operated aircraft, it can also be landed on board a ship by flying into a net. There are plans for implementing an automated landing system for the Pioneer for ship-based landings.
A list of 239 Pioneer accidents was received from the
Navy Safety Center. The accidents cover the period from
1986 until 2002. Although not providing much detail,
the data did allow a general categorization of accidents
into principal causal categories. Figure 7 shows the major
causal factors for Pioneer accidents.
As can be seen from the figure, human factors-related issues were present in approximately 28% of the accidents. Also, a small number of mishaps (5) were attributed to enemy actions. Breaking down the human factors-related accidents further, Table 4 lists the number and percentage of the 68 accidents related to specific human factors issues.
As with the Army Hunter accidents, the largest percentage of human factors accidents (68%) was associated with the difficulty experienced by the EP while landing the aircraft. An additional 10% of the accidents were associated with takeoffs, although the primary means of taking off is through the use of a launcher (for ship-based aircraft). In addition to landing and takeoff errors, two other issues seen with the Pioneer were aircrew coordination and weather.
Figure 6. Pioneer RQ-2 unmanned aircraft.
Figure 7. U.S. Navy Pioneer UA accident causal factors (number of mishaps: Maintenance 5 (2%), Human Factors 68 (28%), Aircraft 156 (65%), Enemy 5 (2%), Unknown 5 (2%)).
Table 5. Specifications for the Air Force MQ-1 and MQ-9
(from Schaefer, 2003).
MQ-1 MQ-9
Gross Weight 2,250 lbs 10,000 lbs
Length 28.7 ft 36.2 ft
Wingspan 48.7 ft 64 ft
Ceiling 25,000 ft 45,000 ft
Radius 400 nm 400 nm
Endurance 24 + hrs 24 + hrs
Payload 450 lb 750 lb (internal)
3000 lb (external)
Cruise Speed 70 kts 220 kts
Aircraft cost (w/out sensors) $2.4 M $6 M
System Cost (4 AVs) $26.5 M $47 M
Table 4. Breakdown of human factors issues for
Pioneer accidents.
Issue Number Percent
Aircrew Coordination 9 13%
Landing Error 46 68%
Take-off Error 7 10%
Weather 6 9%
Aircrew coordination includes procedural and communication-type errors, while weather-related accidents deal with pilot decision-making. Unfortunately, details regarding these accidents were not sufficient to identify issues beyond this level.
Fire Scout
An additional accident for which information was avail-
able was the crash of a Navy-owned Vertical Take-off and
Landing Tactical Unmanned Aerial Vehicle (VTUAV),
called the Fire Scout (see Figure 8).
The Fire Scout air vehicle (RQ-8A) is based on the
Schweizer Aircraft Corporation Model 330 manned tur-
bine helicopter. The Fire Scout has a gross takeoff weight
of 2,550 lbs, cruises at 110 kts, and is intended to loiter
on-station at 110 nm for over 3 hrs (DoD, 2002).
The investigation of the accident, which occurred on
November 4, 2000, revealed that human error, associ-
ated with damage to onboard antennas during ground
handling, led to the accident. Because of the damage
to the antennas, an incorrect signal was emitted, caus-
ing the radar altimeter system to incorrectly track the
altitude. The antennas gave a false reading that indicated
that the Fire Scout was at an altitude of 2 ft above the
ground when, in fact, it was hovering at an altitude of
500 ft (Strikenet, 2001). After the “land” command was
given, the aircraft descended two ft to 498 ft AGL. The
guidance and control system interpreted the incorrect
altitude signal as an indication that the Fire Scout had
already landed and, performing as designed, shut down
the engine. Although the Fire Scout is not widely used,
the accident is included here because it is another example
of how unique approaches to automation and procedures
with UA can lead to unique mishaps.
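The Fire Scout's guidance logic is not public, but the failure mode described above can be sketched schematically: a single corrupted altitude source drove an irreversible action. The hypothetical Python below contrasts that single-sensor check with a cross-checked one (the thresholds and sensor names are invented for illustration).

```python
def landed_naive(radar_alt_ft: float) -> bool:
    # Single-sensor logic of the kind implicated in the mishap: a
    # corrupted radar-altimeter reading of ~2 ft looks like touchdown.
    return radar_alt_ft < 3.0

def landed_crosschecked(radar_alt_ft: float, baro_alt_ft: float,
                        descent_rate_fps: float, weight_on_skids: bool) -> bool:
    """Require independent agreement before an irreversible action
    such as engine shutdown (majority vote of four cues)."""
    cues = [radar_alt_ft < 3.0,
            baro_alt_ft < 50.0,           # barometric sanity bound
            abs(descent_rate_fps) < 1.0,  # not still descending
            weight_on_skids]
    return sum(cues) >= 3

# Mishap-like inputs: radar altimeter falsely reads 2 ft at 500 ft AGL.
print(landed_naive(2.0))                            # True  -> engine shutdown
print(landed_crosschecked(2.0, 500.0, 0.0, False))  # False -> keep flying
```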
Air Force
Predator
The Predator made its first flight in June 1994. There are two Predator types, currently designated as MQ-1 and MQ-9, also called Predator and Predator B. Figures 9 and 10 show photos of the MQ-1 and MQ-9, respectively. The specifications for both the MQ-1 and MQ-9 are presented in Table 5. The Predator aircraft is flown from within the GCS, similar to a manned aircraft, using a joystick and rudder pedals and a forward-looking camera that provides the pilot with a 30-degree field of view.
Figure 8. The U.S. Navy Fire Scout (RQ-8A).
Figure 9. Predator MQ-1.
Figure 10. Predator B MQ-9.
The camera is used for both takeoffs and landings. Figure 11
shows a picture of the Predator GCS.
The Predator accident causal factors are shown in Figure 12. As can be seen from the figure, human factors encompass a higher percentage (67%) than aircraft-related causes, unlike the other aircraft examined thus far.
Table 6 shows a breakdown of the human factors issues associated with Predator accidents. The majority of human factors-related problems were concerned with procedural errors on the part of the flight crew. One of these accidents involved yet another problem with a handoff of the aircraft from one GCS to another. During the handoff, the mishap crew did not accomplish all of the checklist steps in the proper order, resulting in turning off both the engine and the stability augmentation system of the aircraft. The aircraft immediately entered an uncommanded dive and crashed.
A second procedural error of note occurred when the pilot accidentally activated a program that erased the internal random access memory onboard the aircraft during a flight. That this was even possible during a flight is notable in itself and suggests the relatively ad hoc software development process occurring for these systems (Tvaryanas, 2004). Predator pilots also have noted problems caused by the software interface (Hoffman, 2004). One example in particular from Hoffman (2004) concerned the assignment of menu selections to function keys on the GCS keyboard. Hoffman noted that the sequence of function-key presses that controlled the lights on the Predator was almost identical to the sequence for cutting off the engine.
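A standard defense, and one way to read the report's later point about unprotected critical commands, is to require a distinct confirmation step for irreversible actions. The Python sketch below is a generic illustration (the command names are invented, and this is not the Predator's actual software).

```python
CRITICAL = {"ENGINE_CUTOFF", "ERASE_MEMORY"}  # invented command names

def dispatch(command: str, confirmed: bool = False) -> None:
    """Execute a GCS command; critical ones demand a distinct,
    deliberate confirmation step."""
    if command in CRITICAL and not confirmed:
        raise PermissionError(f"{command} requires explicit confirmation")
    print(f"executing {command}")

dispatch("LIGHTS_ON")                       # benign command goes through
try:
    dispatch("ENGINE_CUTOFF")               # slip from a similar key sequence
except PermissionError as err:
    print(err)                              # blocked instead of executed
dispatch("ENGINE_CUTOFF", confirmed=True)   # deliberate, confirmed action
```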
The report by Tvaryanas (2004) on mishap epidemiol-
ogy states that Human System Interface (HSI) issues are
discussed in 89% of the Predator accidents and are cited
Figure 12. Air Force Predator accident causal factors (number of accidents: Maintenance 2 (17%), Human Factors 8 (67%), Aircraft 5 (42%)).
Table 6. Breakdown of human factors issues for
Predator accidents.
Issue Number Percent
Alerts & Alarms 1 13%
Display Design 2 25%
Landing Error 1 13%
Procedural Error 6 75%
Figure 11. Predator ground control station layout.
as a causal factor in 44% of those accidents. Tvaryanas
cites four HSI issues involved with Predator mishaps:
1) design of the head-up display (HUD); 2) design of
the head-down display (HDD); 3) alerts and alarms
(Tvaryanas calls them “warnings and cautions”); and 4)
functioning of the autopilot.
Specic problems cited for the HUD design are that
the eld of view (30 degrees) is too narrow, the attitude
indicator is inadequate, the engine RPM indicator needs
improvement, the symbology becomes obscured during
low-link conditions, some of the symbology lacks sufcient
contrast against the external view, and other symbology
is inadequate. There is currently an effort to replace the
Predator HUD with a new design based on ghter aircraft
HUD designs. However, even the new HUD design does
not address all of the problems listed above.
Four specic problems were cited for head-down dis-
play design. These were: 1) too many levels of pages to
maneuver through to access information; 2) the unintui-
tive manner of information display; 3) critical commands
were unprotected or unemphasized; and 4) operational
ranges of values were inconsistent within the display. The
lack of protection for critical commands was a key feature
in the mishap cited above where the internal memory of
the aircraft was erased during a ight.
The following deficiencies of alerts and alarms were noted: 1) alerts do not command attention; 2) audio warnings were insufficient or absent; 3) information provided was inadequate or poorly prioritized; 4) information was invalid; and 5) data that need to be compared are not always co-located on the same display page. Tvaryanas also states that the alerts and alarms on the Predator violate multiple human factors design principles.
Finally, four problems were cited by Tvaryanas regarding the functioning of the autopilot. The first is that there is no indication on the HUD of the status of the autopilot. The second problem is that the flight controls cannot control the aircraft while the autopilot is engaged (i.e., no override capability). In addition, the pilot must navigate through four separate menus on the HDD to deactivate the autopilot. This requires approximately 7 sec on average to accomplish. The third problem cited is that the autopilot functionality does not fully consider the capabilities of the aircraft, which results in the commanding of extreme maneuvers (unusual attitudes) and the possible overstressing of the aircraft by the autopilot. The final problem is that the autopilot functionality does not conform to Air Force standards, using pitch to adjust airspeed instead of power. This could result in the pilot not being fully aware of what changes were being made by the autopilot during maneuvering.
Global Hawk
The Global Hawk (see Figure 13), made by Northrop Grumman, is the largest and newest of the five military systems discussed. Global Hawk first flew in February 1998, and it became the first UA to cross the Pacific Ocean in April 2001 when it flew from the United States to Australia (Schaefer, 2003).
The specications for the Global Hawk are listed in
Table 7. The Global Hawk is the most automated of all
the systems discussed. All portions of the ight, includ-
ing landing and takeoff, are pre-programmed before the
ight, and the basic task of the crew during the ight is
simply to monitor the status of the aircraft and control
the payload. While this makes ying the Global Hawk
very simple, the mission planning process is unwieldy
and requires a great deal of time to accomplish.
The following description of the mission-planning
process for the Global Hawk is taken from an accident
investigation board report dated February 2000:
Mission planning is a long and involved process where
everything that the pilot and crew of a manned aircraft
do in conjunction with a mission must be programmed
Figure 13. Air Force Global Hawk (RQ-4).
Table 7. Global Hawk specifications
(from Schaefer, 2003).
RQ-4A
Weight 26,750 lbs
Length 44.4 ft
Wingspan 116.2 ft
Ceiling 65,000 ft
Radius 5,400 nm
Endurance 32 hrs
Payload 1,950 lbs
Cruise Speed 345 kts
Aircraft Cost $20 M
System Cost $57 M
into the air vehicle. This process begins up to 270 days
prior to flight in an exercise to negotiate agreement on the mission, targets, and flight routes. Mission planning involves people from many different organizations. Mission planners become actively involved 90 days prior to the flight. Once the target sets are finalized, it takes three to five weeks to write and validate a mission plan. Mission plans are generated using a mission planning system that consists of the standard core application and a Global-Hawk-specific module. Validation is scheduled to take 10 days, starting 18 days prior to the planned flight and includes any iterations necessary to finalize the plan.
Only three accident reports were available for the Global Hawk. Of these three reports, one did not provide sufficient information for classification, a second faulted a failure in a fuel nozzle, which led to an engine failure, and the third was a human factors issue centering on the complicated mission-planning process. In that accident, the mishap aircraft suffered an in-flight problem with temperature regulation of the avionics compartment and landed at a preprogrammed alternate airport for servicing. After landing, the aircraft was commanded to begin taxiing. Unknown to the crew, a taxi speed of 155 knots had been entered into the mission plan at that particular waypoint as a result of a software bug in the automated mission planning software in use at the time. The aircraft accelerated to the point that it was unable to negotiate a turn and ran off the runway, collapsing the nose gear and causing extensive damage to the aircraft.
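A mishap of this kind is the textbook case for range-checking mission-plan parameters before upload. The Python sketch below is a hedged illustration (the limits and parameter names are invented; the actual Global Hawk planning constraints are not public) of how a validator would have flagged the 155-knot taxi waypoint.

```python
# Illustrative limits only; the actual Global Hawk mission-planning
# constraints are not public.
LIMITS = {"taxi_speed_kts": (0, 20), "cruise_speed_kts": (0, 400)}

def validate_waypoint(waypoint: dict) -> list:
    """Return a description of every out-of-range waypoint parameter."""
    errors = []
    for param, (lo, hi) in LIMITS.items():
        if param in waypoint and not lo <= waypoint[param] <= hi:
            errors.append(f"{param}={waypoint[param]} outside [{lo}, {hi}]")
    return errors

# A waypoint like the one in the mishap: a 155-knot ground-taxi speed.
print(validate_waypoint({"taxi_speed_kts": 155}))
# -> ['taxi_speed_kts=155 outside [0, 20]']
```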
An analysis of human factors problems with the Global Hawk by Tvaryanas centers primarily on the high levels of automation involved with the system. The report suggests that system operators do not closely monitor the automated mission-planning software, resulting in lowered levels of situation awareness and a lowered ability to deal with system faults when they occur. In particular, status reports were difficult to interpret because they were coded in hexadecimal and provided no trend data to the operators.
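To see why raw hexadecimal status words impose that interpretation burden, consider the small Python sketch below. The bit layout is entirely hypothetical (the real Global Hawk encoding is not public); the point is that decoding to named faults, and retaining a history for trend display, is a small software step with a large situation-awareness payoff.

```python
# Hypothetical 16-bit status word layout (illustrative only; the real
# Global Hawk encoding is not public).
FAULT_BITS = {0: "avionics overtemp", 1: "fuel low", 2: "datalink degraded"}

def decode_status(word_hex: str) -> list:
    """Turn an opaque hex status word into named faults an operator
    can read at a glance."""
    word = int(word_hex, 16)
    return [name for bit, name in FAULT_BITS.items() if word & (1 << bit)]

history = []                      # retaining past values enables a trend display
for report in ("0x0000", "0x0001", "0x0005"):
    faults = decode_status(report)
    history.append(faults)
    print(report, "->", faults or ["nominal"])
```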
CONCLUSIONS
Figure 14 summarizes the data across each UA system examined in this report. One conclusion apparent from these data is that, for most of the systems examined, electrical and mechanical reliability plays as much of a role in the accidents as human error, or more. Mishaps attributed at least partially to aircraft failures range from 33% (Global Hawk) to 67% (Shadow) in the data reported here. A recent report by the Office of the Secretary of Defense (Schaefer, 2003) reviewed several factors affecting the electromechanical reliability of UA. The most critical factor cited is that cost savings are more important for UA than for manned aircraft. Unfortunately, cost-saving measures have a tendency to impact component reliability, system redundancy, and the inclusion of new component technologies. For example, cost-saving techniques such as the use of wooden propellers and less attention to watertight sealing leave some UA more vulnerable to precipitation than manned aircraft. In addition, the relatively smaller size of many UA also has an adverse effect on their response to both precipitation and icing. Schaefer points out that “a one-tenth inch accumulation [of ice] on a Pioneer’s wings is equal to one inch on a Boeing 747” (p. 33).
An improvement in electromechanical reliability will
probably come only through an increase in the cost of the
aircraft. However, a reduction of human errors leading
to accidents might not necessarily entail increased costs
if suggested changes can be incorporated early in the
design process. In the systems analyzed, human factors
issues were present in 21% (Shadow) to 67% (Predator)
of the accidents. These numbers suggest there is room
for improvement if specific human factors issues can be identified and addressed.
In that regard, it is important to note that many of the human factors issues identified are very much dependent on the particular systems being flown, the type of automation incorporated, and the user interface employed.
Figure 14. Summary accident percentages across aircraft systems (Hunter, Shadow, Pioneer, Predator, and Global Hawk, by category: Maintenance, Human Factors, Aircraft, Other).
For example, both the Pioneer and Hunter systems have problems associated with the difficulty external pilots have in controlling the aircraft. For both of these systems, the majority of accidents due to human error can be attributed to this problem. However, the other three systems discussed do not use an EP and either use an IP (Predator) or perform landings using an automated system (Shadow and Global Hawk).
Notably, however, the use of automation to overcome human frailties does not completely solve the problem, as the automation itself can fail (as with the Shadow’s TALS) and the automation can introduce other problems for the crew, such as the complicated mission planning process required for the Global Hawk. The effectiveness of automation depends on how it is incorporated within the interface (Parasuraman & Riley, 1997). There are current research efforts to understand the effect that automation has on UA pilot/operator workload, particularly if that automation is not completely reliable (Dixon & Wickens, 2004; Ruff, Calhoun, Draper, Fontejon, & Guilfoos, 2004).
The user interfaces of these systems are, for the most part, not based on previously established aviation display concepts. Part of the cause for this is that the developers of these system interfaces are not primarily aircraft manufacturers. Another reason is that these aircraft are not “flown” in the traditional sense of the word. Only one of the aircraft reviewed (Predator) has a pilot/operator interface that could be considered similar to a manned aircraft. For the other UA, control of the aircraft by the GCS pilot/operator is accomplished indirectly through the use of menu selections, dedicated knobs, or preprogrammed routes. These aircraft are not flown but “commanded.” This is a paradigm shift that must be understood if appropriate decisions are to be made regarding pilot/operator qualifications, display requirements, and critical human factors issues to be addressed.
Besides the EP accidents, most of the other human
factors-related accidents were unique in the sense that a
problem that occurred for one type of aircraft would never
be seen for another because the user interfaces for the
aircraft are totally different. On the other hand, a common
theme across many of the mishaps reported involved a
problem with the command interface to the system. The
issues reported in this paper regarding alerts and alarms,
display design, and procedural problems are mostly as-
sociated with providing information to the pilot/operator
about the commanded status of the aircraft.
If the aircraft is commanded to begin taxiing, there
should be information available regarding the intended
taxi speed. If the aircraft is being handed off from one
station to another, the receiving station personnel should
be aware of what commands will be transmitted to the
aircraft after control is established. Interface develop-
ment needs to be focused around the task of the pilot/
operator. For most of these aircraft, that task is one of
issuing commands and verifying that those commands
are accepted and followed. Understanding this task and
creating the interface to support it should help to improve
the usability of the interface and reduce the number of
accidents for these aircraft. This is especially important as these aircraft begin to transition to the National Airspace System, conducting civilian operations among civilian manned aircraft.
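To make that command-and-verify task concrete, here is a minimal, hypothetical Python sketch of the state an interface would track per uplinked command (the states and names are illustrative assumptions, not drawn from any fielded GCS).

```python
from enum import Enum, auto

class CmdState(Enum):
    SENT = auto()        # uplinked, no acknowledgment yet
    ACCEPTED = auto()    # acknowledged, not yet reflected in telemetry
    IN_EFFECT = auto()   # telemetry confirms the aircraft is complying

def command_state(acked: bool, telemetry_matches: bool) -> CmdState:
    """Classify one uplinked command's progress through the loop so the
    display can show the operator exactly where it stands."""
    if not acked:
        return CmdState.SENT
    if not telemetry_matches:
        return CmdState.ACCEPTED
    return CmdState.IN_EFFECT

# An altitude command that was acknowledged but has not yet taken
# effect; a display built on this state keeps it visibly "open."
print(command_state(acked=True, telemetry_matches=False))
```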
REFERENCES
Department of the Army (1994a). Accident reporting
and records. Washington, DC. Army Regulation
385-40.
Department of the Army (1994b). Army accident in-
vestigation and reporting. Washington, DC. Army
Pamphlet 385-40.
Dixon, S.R. and Wickens, C.D. (2004). Automation
reliability in unmanned aerial vehicle flight control.
Proceedings of the Human Performance, Situation
Awareness and Automation Conference, March 22-
25, Daytona Beach, FL, 395-9.
DoD (2001). Unmanned aerial vehicles roadmap, 2000-
2025. Office of the Secretary of Defense, Depart-
ment of Defense, Washington, DC, April 2001.
DoD (2002). RQ-8A Firescout vertical takeoff and
landing tactical unmanned aerial vehicle system
(VTUAV). From the Director of Operational Test
and Evaluation FY2001 Annual Report, Depart-
ment of Defense. Downloaded on 6/16/2004 from
URL http://globalsecurity.org/military/library/
budget/fy2001/dot-e/index.html.
Gawron, V.J. (1998). Human factors issues in the devel-
opment, evaluation, and operation of uninhabited
aerial vehicles. AUVSI ’98: Proceedings of the Asso-
ciation for Unmanned Vehicle Systems International,
Huntsville, AL, 431-8.
Hoffman, J. (2004). The U.S. Air Force at the crossroads:
Dealing with unmanned aerial vehicles human fac-
tors. Presented at the Human Factors of Uninhabited
Aerial Vehicles First Annual Workshop, May 24-25,
2004, Chandler, AZ.
Manning, S.D., Rash, C.E., LeDuc, P.A., Noback, R.K.,
and McKeon, J. (2004). The role of human causal
factors in U.S. Army unmanned aerial vehicle acci-
dents. U.S. Army Aeromedical Research Laboratory
Report # 2004-11.
Parasuraman, R. and Riley, V. (1997). Humans and
automation: Use, misuse, disuse, abuse. Human
Factors, 39(2), 230-53.
Ruff, H.A., Calhoun, G.L., Draper, M.H., Fontejon, J.V.,
and Guilfoos, B.J. (2004). Exploring automation
issues in supervisory control of multiple UAVs.
Proceedings of the Human Performance, Situation
Awareness and Automation Conference, March 22-
25, Daytona Beach, FL, 408-412.
Schaefer, R. (2003). Unmanned aerial vehicle reliability
study. Office of the Secretary of Defense, Washing-
ton, DC, February 2003.
Shappell, S.A. and Wiegmann, D.A. (2000). Human
factors analysis and classification system - HFACS. Department of Transportation, Federal Aviation Administration: Washington, DC. DOT/FAA/AM-00/7.
Strikenet (2001). VTUAV P1 accident investigation board
report. Downloaded on 4/15/2004 from URL
http://www.strikenet.js.mil/pao/vtuav_crash_invest_news_release.html.
Tvaryanas, A.P. (2004). USAF UAV mishap epidemiol-
ogy, 1997-2003. Presented at the Human Factors of
Uninhabited Aerial Vehicles First Annual Workshop,
Phoenix, AZ, May 24-25, 2004.
Wiegmann, D.A. and Shappell, S.A. (2003). A human
error approach to aviation accident analysis: The
Human Factors Analysis and Classification System.
Burlington, VT: Ashgate.
... When talking about RPA (Remotely piloted aircraft) and UAV (Unmanned Aerial Vehicle), there are two misconceptions to dispel: first they are easier to mann and second they are safer compared to conventional aircraft. Data from accidents reports and the challenges for training standards development tell the opposite: the lack of a unified view on training and the accidents are higher compared to conventional aviation suggest a lot of hidden variables in this domain (Williams, 2004). ...
... The interest in this regard arises pragmatically from the generally high rate of accidents for RPA (Williams, 2004) and their relation with human error, the lack of care and sufficient comprehension of human factors and cognitive ergonomics. This scenario has posed the increasing urge of building more knowledge. ...
... According to US Department of Defense statistics , the incident/accident rates for unmanned aerial vehicles (UAVs) may be up to 30 times greater than those for manned aircraft (Williams, 2004). ...
Thesis
This study focused on stress among operators of Unmanned Aerial Vehicles (UAVs) for operations in which pilots frequently encounter emergency situations. While the literature on stress in conventional aviation is extensive, the field of UAV missions remains relatively unexplored. This research addressed three primary research questions: unravelling the concept of stress within the UAV aviation context, defining the scope of stress assessment to determine the most suitable approach for the UAV setting and conceptualizing the underlying factors contributing to the stress experience. The thesis project was structured by combining two complementary studies. A descriptive review was conducted using the PRISMA method to address the concept of stress and its assessment in UAV operations. It followed an empirical study involving semi-structured interviews and a questionnaire with the purpose of depicting the stress experience from the operators’ perspective. The results of the descriptive review revealed two main clusters of findings: factors and mechanisms underlying stress in UAV setting and approaches to stress assessment. The first bundle included aspects about the unique nature of stress responses in this setting, the relationship with the concepts of workload and startle, the cognitive impact of stress and primary coping strategies. Trust in automation emerged as a key factor for stress in this domain: different levels of automation reliability lead to different intensities of stress effects. The second bundle provided a discussion on various methodologies for physiological and psychological stress assessment. A combination of objective and subjective (i.e., self-report) methods are suggested to meet the piloting setting limitations and ensure the measures’ validity. The empirical research, which involved semi-structured interviews (N=4) and a questionnaire (N=10), allowed for an in-depth exploration of the stress experience from the perspective of UAV pilots. The outcomes of this study were many across the themes of individual characteristics, stress beliefs, perceived safety, and coping strategies. A valuable finding regarded the operators' confidence in a series of skills. On one side participants showed high confidence in problem-solving and quick decision-making. On the other, it was observed low reliance on organizational and time management skills which are great contributors to stress.
... One of the challenges in marine use of aerial vehicle (manned or not) is landing on a platform that is randomly moving in three axes of sea motion (pitch, roll, yaw) affected by sea level waves. This problem makes it nearly impossible for human pilots to land safely [4]. In the many uses of UAV (Unmanned Aerial Vehicle) a pilot uses real-time telemetry to take-off, fly and land the craft with continuous communication between ground station and the UAV on-board computer. ...
... The main motivation for dealing with autonomous landing is the difficulty in performing a successful landing even with a pilot controlling the UAV. As it seems by statistics showed in [4], most of the accidents related to RPAS (Remotely Piloted Aircraft Systems) occur when the pilot tries to land the UAV. Extensive research has been done on the subject to explore the various situations, technologies and methods to engage this problem. ...
... The landing pad designed as a plate with five markers -one in the center and four others on each corner: Every ArUco marker has an ID as described above, which can be determined when the marker gets detected, and by that we can easily center the drone location above the landing pad even if only one or two markers are in view. This landing pad has markers with ID values of: [18,28,17,25,4] selected randomly, but once selected they are very important to the implementation since the training data linked to the classifiers used as will be discussed later. ...
Article
Quadcopters are four rotor Vertical Take-Off and Landing (VTOL) Unmanned Aerial Vehicle (UAV) with agile manoeuvring ability, small form factor and light weight – which makes it possible to carry on small platforms. Quadcopters are also used in marine environment for similar reasons – especially the ability to carry on small boats, instead of using helicopters on larger boats. Pilots of both manned and unmanned aerial vehicles over sea, face the challenge of landing their aircraft on a boat that is continuously rocking in all three axes (pitch, roll, yaw) and with movement of the boat, usually at steady course and speed. In this paper, we present a new approach for autonomous landing a quadcopter on a moving marine platform. Our approach is based on computer-vision algorithms using markers identification as input for the decision by Stochastic Gradient Descent (SGD) classifier with Neural Network decision making module. We use OpenCV with its built-in ArUco module to analyze the camera images and recognize platform/markers, then we use Sci-Kit Learn implementation of SGD classifier to predict landing optimum angle and compare results to manually decide by simple calculations. Our research includes real-time experiments using Parrot Bebop2 quadcopter and the Parrot Sphinx Simulator.
... The Pioneer RQ-2 UAV (Williams, 2004) was developed by the AAI Corporation. The RQ-2 is an autonomous fixed-wing UAV with a length and wingspan of 4.3m and 5.1m, respectively, and a gross weight of 205kg. ...
... Most accidents involving UAVs are caused by human errors related to operating and maintenance failures [10]. Therefore, automating as many tasks as possible is crucial to increase system reliability. ...
Conference Paper
Unmanned Aerial Vehicles (UAVs) are continuously being explored as an important means of decreasing human intervention and improving system reliability. The most challenging tasks when operating UAVs are take-off and landing. Most small UAVs can be launched by hand, but the capacity to perform autonomous landing is essential. The autonomous-landing environment is usually challenging, since a moving platform is involved and the system must be able to cope with Global Positioning System (GPS)-denied environments. The proposed system is based on a ground-based stereoscopic vision system with temporal filtering, using an Extended Kalman Filter (EKF) to track the position of a rotary-wing UAV during landing. The estimated position is used to detect and guide the rotary-wing UAV during landing. The initial results indicate that this setup can track with low error, demonstrating its suitability for exploration and further improvement.
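The abstract specifies an EKF for temporal filtering; with a linear constant-velocity motion model and a direct position measurement from the stereo rig, the EKF predict/update steps reduce to the standard Kalman filter form. The sketch below illustrates that filtering step; the update rate, noise covariances, and matrix choices are assumptions, not the authors' tuning.

```python
import numpy as np

DT = 0.05  # assumed update period (s); not stated in the abstract

# Constant-velocity state [x, y, z, vx, vy, vz]; with this linear model
# the EKF predict/update reduces to the standard Kalman filter equations.
F = np.eye(6)
F[:3, 3:] = DT * np.eye(3)                     # position integrates velocity
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # the stereo rig measures position only
Q = 0.01 * np.eye(6)                           # assumed process noise
R = 0.05 * np.eye(3)                           # assumed measurement noise (m^2)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Per frame: x, P = predict(x, P); then, when a stereo fix arrives,
# x, P = update(x, P, z) with z the measured [x, y, z] position.
```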
... 2024. No. 3, pp. 24-32. ...
Article
The need to train high-tech specialists is driven by the digitalization of modern society. For Russia to lead in this technological process, it must create new infrastructure systems that meet the challenges of this new world, which requires high-tech specialists capable of building such systems, as well as entrepreneurs and specialists ready to respond to changes in the market and in technology. The article outlines the prospects for the development of modern technologies in the context of a new technological structure. The purpose of the study was a theoretical examination of the personal potential and professionally important qualities (PIQs) of specialists in global high-tech markets. Based on a review of the scientific psychological literature, using the Aeronet sphere as an example, the PIQs of specialists working in the market areas of the National Technology Initiative are described. As a result of the analysis, the concept of personal potential by D.A. Leontyev and the professionally important qualities framework of Yu.P. Povarenkov were chosen to describe the specifics of the professional activities of high-tech specialists, using the Aeronet sphere as an example. It was found that the structure of personal potential corresponds to three key challenges associated with the work of high-tech specialists: uncertainty, goal achievement, and the challenge of threat or pressure. The professionally important qualities of Aeronet specialists are specified. The results of the study can be used in career-guidance practice with schoolchildren interested in the technologies of the National Technology Initiative, and with their parents. They can also be useful for developing programs for the professional psychological selection and training of high-tech specialists.
... Unmanned Aerial Vehicles (UAVs) are increasingly used for maritime surveillance operations [1], [2]. Most accidents involving UAVs are caused by human errors related to operating and maintenance failures [3]; therefore, automating as many tasks as possible is crucial to increasing system reliability. The most challenging operations during UAV deployment from ships are take-off and landing, and it is essential to fully automate them to increase operational reliability [4], [5]. ...
Conference Paper
Unmanned Aerial Vehicles (UAVs) have significantly widened the range of possible applications due to their unmanned capabilities. UAVs are commonly utilized in Intelligence, Surveillance, and Reconnaissance (ISR) operations, enabling them to gather important real-time data. To ensure operational reliability, performing the landing operation autonomously is crucial. However, autonomous landing on a ship is challenging: it must be executed in an outdoor environment, over a moving platform, where good coordination and precise estimation of the UAV's position relative to the ship are crucial. We present a ship-based binocular vision system with temporal filtering based on an Extended Kalman Filter (EKF) to track the position of a UAV during landing. The developed system can detect noise and failures in one or both cameras, enabling accurate detection and guidance in these scenarios. The preliminary results show that this simple setup can perform tracking with low errors, demonstrating its potential for real-world applications.
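The binocular system described here recovers the UAV's position by triangulating it between the two cameras. A minimal sketch of that geometry follows, assuming a calibrated, rectified stereo pair; the focal length, baseline, and principal-point values are illustrative, not the paper's calibration.

```python
import numpy as np

def triangulate(u_left, u_right, v, f=800.0, baseline=0.5, cx=640.0, cy=360.0):
    """Recover a 3D point in the left-camera frame from a matched pixel pair
    in a rectified stereo rig. f is in pixels, baseline in meters."""
    disparity = u_left - u_right        # horizontal pixel offset between views
    if disparity <= 0:
        raise ValueError("point at infinity or mismatched features")
    Z = f * baseline / disparity        # depth, from similar triangles
    X = (u_left - cx) * Z / f
    Y = (v - cy) * Z / f
    return np.array([X, Y, Z])
```

Each per-frame triangulated position would then serve as the measurement fed to the EKF, letting the temporal filter smooth out the camera noise and dropouts the abstract mentions.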
Article
The problem of creating psychological and pedagogical conditions for training specialists for global high-tech markets is one of the pressing issues in modern education: students' personal characteristics must be taken into account when teaching in current socio-economic conditions. The article outlines the prospects for the development of modern technologies in the context of a new technological structure. Based on a review of the scientific psychological literature, the personal characteristics of specialists working in the Aeronet market areas of the National Technology Initiative are described, using unmanned aerial vehicle operators as an example, and the need for professional psychological selection and training of high-tech specialists is outlined. As an educational technology that takes the personal characteristics of students into account in training high-tech specialists, we proposed the Conceive-Design-Implement-Operate (CDIO) concept, the basic principle of an innovative educational environment for training a new generation of engineers; the goal of CDIO is an engineer who can change the world for the better. A comparative analysis of the CDIO standards and the traditional university education system was carried out, and directions are indicated for taking the personal characteristics of students studying in the specialty "Management in Technical Systems" into account during their university studies.
Article
Fixed-wing mini-UAVs (Unmanned Aerial Vehicles) face difficulties due to the need for runways during take-off and landing. While fixed-wing UAVs can use catapults for take-off, various systems are required for landing. In this study, therefore, a parachute system was designed and produced for the safe landing of fixed-wing mini-UAVs. The parachute used ultra-lightweight ripstop nylon fabric and suspension lines, while a carbon fiber tube was chosen for the launching system for its light weight and strength. The parachute deployment system was triggered by a servo motor with low power consumption and high torque. During flight tests, the parachute was activated at a height of 47 meters. Deployment was completed in 1.42 seconds, and the descent under the parachute lasted 11 seconds; the vertical descent speed during landing was measured at 4.27 m/s. The parachute landing system was manufactured at 71% lower cost than existing parachute landing systems in the literature and on the market. Additionally, the ultra-light ripstop parachute weighed 56 grams, making it 12% lighter than similar systems. Given these cost and weight advantages, parachute landing systems are expected to be increasingly used for fixed-wing UAVs in the future.
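As a quick consistency check on the reported figures (a rough calculation that ignores the 1.42 s deployment phase and any horizontal drift), dividing the activation altitude by the descent time reproduces the measured vertical speed:

```python
altitude_m = 47.0   # height at parachute activation
descent_s = 11.0    # duration of the descent under canopy
print(altitude_m / descent_s)  # 4.2727... m/s, matching the reported 4.27 m/s
```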
Article
Full-text available
An evaluation was conducted on a generic UAV operator-interface simulation testbed to explore the effects of levels of automation (LOAs) and automation reliability on the number of simulated UAVs that a single operator could supervise. The LOAs were Management-by-Consent (operator consent required before action) and Management-by-Exception (action proceeds automatically unless the operator declines). Results indicated that the tasks were manageable, but performance decreased as the number of supervised UAVs grew and as automation reliability fell. Performance with the two LOAs varied little and showed no consistent trend across measures. Analyses indicated that participants typically did not utilize the automation. A follow-on study employed shorter LOA time limits. Participants' workload and confidence ratings were less favorable under the shorter limits, and they still exercised the automation rarely, although more frequently than before. Further research is needed to explore the complex relationship between LOAs, time limits, perceived workload, vigilance effects, and confidence.
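Operationally, the two LOAs differ only in what happens when the operator gives no input before the time limit expires. The sketch below captures that distinction; the function and names are invented for illustration and are not the testbed's implementation.

```python
from enum import Enum

class LOA(Enum):
    MANAGEMENT_BY_CONSENT = "consent"      # act only if the operator approves
    MANAGEMENT_BY_EXCEPTION = "exception"  # act unless the operator declines

def resolve(loa, operator_response):
    """operator_response is True (approve), False (decline), or None
    (no input before the time limit expired)."""
    if operator_response is not None:
        return operator_response           # explicit input always wins
    # The default on timeout is the only difference between the two LOAs.
    return loa is LOA.MANAGEMENT_BY_EXCEPTION

# Under consent, silence blocks the automated action;
# under exception, silence lets it proceed.
assert resolve(LOA.MANAGEMENT_BY_CONSENT, None) is False
assert resolve(LOA.MANAGEMENT_BY_EXCEPTION, None) is True
```

Shorter time limits make the timeout default bite more often, which is consistent with the follow-on study's finding that operators rated the shorter limits less favorably.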
Article
Full-text available
The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. Specifically, HFACS was applied to commercial aviation accident records maintained by the National Transportation Safety Board (NTSB). Using accidents that occurred between January 1990 and December 1996, it was demonstrated that HFACS reliably accommodated all human causal factors associated with the commercial accidents examined. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena.
Article
For the period FY95-FY03, each Unmanned Aerial Vehicle (UAV) accident was reviewed and classified by a series of characteristics using two approaches. The first was a variant of a methodology referred to as the Human Factors Analysis and Classification System (HFACS). HFACS captures data for four levels of human-related failure: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences. The second approach was based on the accident methodology defined in Department of the Army Pamphlet 385-40, "Army accident investigation and reporting." Human causal factors identified during this analysis are broken down into five types of failure: individual failure, leader failure, training failure, support failure, and standards failure. Where the assigned cause included human error, the accident data, including narrative and findings, were analyzed to identify specific human causal factors (e.g., high/low workload, fatigue, poor crew coordination). No single human causal factor was responsible for all accidents. However, both methods of analysis identified individual unsafe acts or failures as the most common human-related causal-factor category, present in approximately 61 percent of the 18 human-error-related accidents.
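Both analysis approaches reduce to tagging each accident with one or more causal-factor categories and tallying how often each category appears across the accident set. A minimal sketch of that workflow follows; the category names come from the abstract, while the accident records and counts are hypothetical.

```python
from collections import Counter

# The four HFACS levels named in the abstract.
HFACS_LEVELS = ("unsafe acts", "preconditions for unsafe acts",
                "unsafe supervision", "organizational influences")

# Hypothetical records; real ones would come from safety-center reports.
accidents = [
    {"id": 1, "hfacs": ["unsafe acts", "preconditions for unsafe acts"]},
    {"id": 2, "hfacs": ["organizational influences"]},
    {"id": 3, "hfacs": ["unsafe acts"]},
]

# An accident may carry more than one category, so count each category
# at most once per accident and report its share of the accident set.
tally = Counter(level for a in accidents for level in set(a["hfacs"]))
for level in HFACS_LEVELS:
    share = 100.0 * tally[level] / len(accidents)
    print(f"{level}: {share:.0f}% of accidents")
```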
Article
This paper addresses theoretical, empirical, and analytical studies pertaining to human use, misuse, disuse, and abuse of automation technology. Use refers to the voluntary activation or disengagement of automation by human operators. Trust, mental workload, and risk can influence automation use, but interactions between factors and large individual differences make prediction of automation use difficult. Misuse refers to overreliance on automation, which can result in failures of monitoring or decision biases. Factors affecting the monitoring of automation include workload, automation reliability and consistency, and the salience of automation state indicators. Disuse, the neglect or underutilization of automation, is commonly caused by alarms that activate falsely; this often occurs because the base rate of the condition to be detected is not considered in setting the trade-off between false alarms and omissions. Automation abuse, the automation of functions by designers and their implementation by managers without due regard for the consequences for human performance, tends to define the operator's role as a by-product of the automation. Automation abuse can also promote misuse and disuse of automation by human operators. Understanding the factors associated with each of these aspects of human use of automation can lead to improved system design, effective training methods, and judicious policies and procedures involving automation use.
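The disuse point is a base-rate argument: when the condition to be detected is rare, even a sensitive alarm with a modest false-alarm rate produces mostly false alarms. A quick worked check via Bayes' rule, using illustrative numbers (not from the paper):

```python
# Illustrative numbers: the monitored condition occurs on 1 in 1000 trials;
# the alarm fires on 99% of true events (hit rate) and on 5% of
# non-events (false-alarm rate).
base_rate = 0.001
hit_rate = 0.99
fa_rate = 0.05

# Bayes' rule: probability the condition is real, given that the alarm fired.
p_alarm = hit_rate * base_rate + fa_rate * (1 - base_rate)
p_true_given_alarm = hit_rate * base_rate / p_alarm
print(f"{p_true_given_alarm:.1%}")  # ~1.9%: almost all alarms are false,
# which is why operators learn to ignore ("disuse") the alarm.
```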
Dixon, S.R., & Wickens, C.D. (2004). Automation reliability in unmanned aerial vehicle flight control. Proceedings of the Human Performance, Situation Awareness and Automation Conference, March 22-25, Daytona Beach, FL, 395-9.
Department of the Army (1994a). Accident reporting and records. Army Regulation 385-40. Washington, DC.
Hoffman, J. (2004). The U.S. Air Force at the crossroads: Dealing with unmanned aerial vehicles human factors. Presented at the Human Factors of Uninhabited Aerial Vehicles First Annual Workshop, May 24-25, 2004, Chandler, AZ.
Tvaryanas, A.P. (2004). USAF UAV mishap epidemiology, 1997-2003. Presented at the Human Factors of Uninhabited Aerial Vehicles First Annual Workshop, May 24-25, 2004, Phoenix, AZ.
DoD (2001). Unmanned aerial vehicles roadmap, 2000-2025. Office of the Secretary of Defense, Department of Defense, Washington, DC, April 2001.