Towards a Tool for Assessing UAS Compliance with the JARUS SORA
Guidelines
Kristian Husum Terkildsen1 and Kjeld Jensen2
Abstract— The Unmanned Aerial Systems (UAS) legislation
in the European Union is expected to become unified by 2020.
This legislation divides UAS operations into three categories,
one of which is the Specific category, in which operations of
higher risk will be placed. Permission for UAS operations in the
Specific category is granted based on a Specific Operations Risk
Assessment (SORA). The risk assessment categorizes operations
into Specific Assurance and Integrity Levels (SAILs), which
determine the requirements for the operation, such as
operational procedures, training of the remote crew, and UAS
specification and maintenance. However, a conceptual gap
exists between the defined technology requirements for UAS and
the available UAS platforms. In this work, a prototype
assessment tool based on the SORA-defined technology requirements
has been developed. The tool is a web-based questionnaire,
which pulls the questions from a data structure and stores
the answers in a database for analysis. Upon completion of
the questionnaire, the user receives an automatically generated
evaluation report. The evaluation report points out the areas
of inadequacy and provides suggestions for mitigating these.
The tool in its current state has been used to evaluate four
commonly used UAS. The four UAS have been tested for SAIL
II through IV, and the results show that none of the selected
UAS fully comply, although one is close to SAIL II compliance.
Future work is to refine the tool, including the list of questions
and the UAS evaluation report generator, based on feedback from
UAS domain experts, and to expand it to cover fixed-wing UAS.
I. INTRODUCTION
Unmanned Aerial Systems (UAS) applications are cur-
rently evolving towards operations with a higher risk profile,
such as flight Beyond Visual Line Of Sight (BVLOS) for
emergency response, long distance inspections and cargo
transportation [1], [2] etc. The UAS regulation is evolving
as well: in June 2018, the European Parliament agreed to
unify the UAS regulations across all EU member states [3],
and this EU regulation is expected to be implemented by
2020. As a result of this, UAS operations are split into three
categories: Open, Specific and Certified. Permission for UAS
operations in the Specific category is granted based on a Spe-
cific Operations Risk Assessment (SORA). Countries such
as Denmark and Germany are already issuing permissions
based on the SORA process. The SORA 2.0 [4] was
published in 2019 and puts the UAS under a new level of
scrutiny: where the previous SORA 1.0 [5] required generic
technical documentation for subsequent evaluation by the
authorities, the SORA 2.0 states specific requirements to the
UAS, which the UAS technology, materials and production
1SDU UAS Center, University of Southern Denmark, Odense, Denmark
khte@mmmi.sdu.dk
2SDU UAS Center, University of Southern Denmark, Odense, Denmark
kjen@mmmi.sdu.dk
methods must fulfil. The SORA 2.0 evaluates the air- and
ground risks for the given UAS operation, which leads to
a categorization into one of six Specific Assurance and
Integrity Levels (SAIL). The SAIL, in turn, results in a
series of Operational Safety Objectives (OSOs) to be met
by the UAS operation. The SORA process requires the
capability of the UAS operator to adequately define the entire
operation with a focus on safety and the associated risk,
which is not a trivial task for most commercial and public
operators. To this end, an operator can simplify the SORA
process if the UAS operation is covered by a Standard
scenario description, which is a concept of operation with
an associated risk assessment that has been approved by the
Civil Aviation Authority (CAA). As an example, the Danish
CAA has published two standard scenario descriptions for
national and municipal emergencies to simplify the use of
UAS in emergency scenarios [6]. Both the SORA and the
Standard scenario descriptions, however, define requirements
to the UAS technology which pose a challenge to many UAS
operators.
This work aims to reduce the gap between the SORA
requirements and the UAS. The SORA 2.0 OSOs containing
technological requirements have been identified, and a list
of questions has been formulated to concretize the require-
ments. The result is a tool designed as a questionnaire, which
automatically generates an evaluation report, stating whether
the UAS complies with the technical requirements of a given
SAIL, and specifies the non-complying areas and how to
mitigate these.
II. BACKGROUND: UAS SPECIFIC OPERATIONS RISK
ASSESSMENT (SORA)
This section lists some examples of current approaches
to approval of UAS operations in the Specific category, in
addition to looking into similar work by other authors.
In Denmark and Switzerland, permission for UAS opera-
tions in the Specific category is granted based on a SORA.
For example, in Switzerland a blood sample delivery concept
project currently being tested in the cities Lugano and Zurich
has been approved by the Swiss Federal Office of Civil
Aviation based on a SORA [7].
In Germany, a simplified version of the SORA, named
SORA-GER, is used. Several companies have published online
SORA-GER assessment tools on their web pages, which take
operators through the process of evaluating the air and
ground risks of the operation [8], [9], [10]. Note that
German UAS operations are regulated at the state level,
meaning that approvals may vary across the country.
2019 International Conference on Unmanned Aircraft Systems (ICUAS)
Atlanta, GA, USA, June 11-14, 2019
978-1-7281-0332-7/19/$31.00 ©2019 IEEE
In the USA, the legislative framework for Open category
UAS operations is similar to that of most European countries.
However, applying for permission to operate in the Specific
category is done by formally applying for a waiver for the
part of the operation which violates the law. In this waiver,
the applicant describes what safety measures they intend to
carry out to mitigate the risk caused by the operation [11].
While the SORA is a less in-depth approach to risk
assessment than high-fidelity risk models, the
resulting risk evaluation captures a sufficient level of detail
for the intended use, according to [12].
Automatic tools for the assessment of technical requirements
and assurances are not a new topic. AdvoCATE is an assurance
case automation tool created by NASA to construct and
assess safety cases for their UAS, based on the goal structuring
notation [13]. Others have made similar tools based on
assurance and certification frameworks. Some examples can
be found in [14] and [15].
III. METHODS AND MATERIALS
The developed SORA UAS compliance assessment tool
is based on the OSO requirements defined by the SAIL
provided by JARUS [4]. These OSOs pose general require-
ments, which have to be elaborated upon in order to assess
UAS compliance. This section is divided into two parts.
In the first part, the relevant OSOs are identified and the
SAIL requirements determined. In the second part, the OSOs
and their requirements are elaborated, leading to in-depth,
comprehensible questions.
A. Identifying the Relevant OSOs
The OSOs are found in the SORA 2.0 Annex E and
describe integrity and assurance level requirements for the
UAS operator, manufacturer and maintenance [4]. In this
work, only the OSOs concerning the UAS technology are
evaluated. Integrity and assurance are the two components
of the SORA concept robustness. The level of integrity is
defined by the safety gain provided by each mitigation, while
the level of assurance is defined as the proof that the claimed
safety gain can be achieved [4]. Integrity and assurance
each have three designated levels: Low, Medium and High.
The general guidance for assurance levels is defined below.
However, for some OSOs the criteria for certain assurance
levels vary.
•Low: the operator declares that the required level of
integrity has been achieved.
•Medium: the operator provides supporting evidence
that the required level of integrity has been achieved,
typically by testing.
•High: validation of the achieved integrity has been
accepted by a competent third party.
In table I the level of robustness is defined based on the
level of integrity and assurance.
The required level of robustness is dependent on the SAIL,
which can be seen in table II, in which the identified OSOs
and their corresponding SAIL requirements are listed.
                  Low assurance    Medium assurance    High assurance
Low integrity     Low robustness   Low robustness      Low robustness
Medium integrity  Low robustness   Medium robustness   Medium robustness
High integrity    Low robustness   Medium robustness   High robustness
TABLE I: Definition of level of robustness based on the level
of integrity and assurance. The level of robustness is never
higher than the lowest assurance or integrity level [4].
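The rule in Table I reduces to taking the minimum of the two levels. A minimal sketch in Python illustrates this (the names and encoding are ours, purely for illustration; the tool itself is written in PHP):

```python
# Illustrative sketch of the Table I rule: the level of robustness is the
# minimum of the integrity and assurance levels. Names are ours, not SORA's.
LEVELS = {"low": 0, "medium": 1, "high": 2}

def robustness(integrity: str, assurance: str) -> str:
    """Return the level of robustness for a given integrity/assurance pair."""
    value = min(LEVELS[integrity], LEVELS[assurance])
    return {v: k for k, v in LEVELS.items()}[value]

# e.g. medium integrity with high assurance yields medium robustness
assert robustness("medium", "high") == "medium"
assert robustness("high", "low") == "low"
```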
OSO #            OSO description                                                  SAIL robustness requirement (II, III and IV)
                                                                                  Optional  Low      Medium   High
OSO #02          UAS Manufactured by Competent and/or Proven Entity               II        III      IV       -
OSO #03          UAS Maintained by Competent and/or Proven Entity                 -         II       III, IV  -
OSO #04          UAS Developed to Authority Recognized Design Standards           II, III   IV       -        -
OSO #05          UAS is Designed Considering System Safety and Reliability        II        III      IV       -
OSO #06          C3 Link Performance is Appropriate for the Operation             -         II, III  IV       -
OSO #10 and #12  Safe Design                                                      -         II       III, IV  -
OSO #18          Automatic Protection of the Flight Envelope from Human Errors    II        III      IV       -
OSO #19          Safe Recovery from Human Error                                   II        III      IV       -
OSO #20          A Human Factors Evaluation has been Performed and the HMI
                 Found Appropriate for the Mission                                -         II, III  IV       -
OSO #24          UAS Designed and Qualified for Adverse Environmental Conditions  II        -        III      IV
TABLE II: The listed OSOs concern the UAS technology
and are shown alongside their SAIL-defined requirements
for level of robustness. If the requirement is Optional, the
OSO is not needed for the particular SAIL.
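Table II can equivalently be expressed as a lookup structure. The following Python sketch transcribes the table; the structure and names are our illustration, not part of the tool:

```python
# Table II as a lookup: for each OSO, the required level of robustness per
# SAIL (II-IV). "optional" means the OSO is not needed for that SAIL.
# Data transcribed from Table II; the structure and names are illustrative.
SAIL_REQS = {
    "OSO02":    {"II": "optional", "III": "low",      "IV": "medium"},
    "OSO03":    {"II": "low",      "III": "medium",   "IV": "medium"},
    "OSO04":    {"II": "optional", "III": "optional", "IV": "low"},
    "OSO05":    {"II": "optional", "III": "low",      "IV": "medium"},
    "OSO06":    {"II": "low",      "III": "low",      "IV": "medium"},
    "OSO10_12": {"II": "low",      "III": "medium",   "IV": "medium"},
    "OSO18":    {"II": "optional", "III": "low",      "IV": "medium"},
    "OSO19":    {"II": "optional", "III": "low",      "IV": "medium"},
    "OSO20":    {"II": "low",      "III": "low",      "IV": "medium"},
    "OSO24":    {"II": "optional", "III": "medium",   "IV": "high"},
}

def required_osos(sail: str) -> dict:
    """The OSOs that apply at the given SAIL, with their required robustness."""
    return {oso: reqs[sail] for oso, reqs in SAIL_REQS.items()
            if reqs[sail] != "optional"}
```

For example, `required_osos("II")` returns only OSO #03, #06, #10 and #12, and #20, all at the low level, matching the Optional column of Table II.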
B. OSO Elaboration
The selected OSO requirements for Low, Medium and
High levels of robustness have been reformulated as com-
prehensible questions. The selected OSOs will be described
in the following.
OSO #02 - UAS Manufactured by Competent and/or
Proven Entity: This OSO concerns the UAS manufacturer
and manufacturing processes, including focus points such as
durability of materials, specifications, configuration control,
traceability, inspections and manufacturing processes. To
evaluate this OSO, one has to investigate which industry stan-
dards, ISO certificates and relevant conformity declarations
the manufacturer has obtained and follows. Typically, this
can be answered by looking up the declaration of conformity
for the specific UAS, either online or in the UAS manual.
OSO #03 - UAS Maintained by Competent and/or Proven
Entity: Describes the maintenance procedures, team and
logging. The operator must be qualified to perform maintenance
on the UAS. For higher levels of robustness, a detailed
maintenance program must be formulated and adhered to.
OSO #04 - UAS Developed to Authority Recognized
Design Standards: According to the OSO description in the
SORA, National Aviation Authorities (NAAs) can define
standards and/or means of compliance which they consider
adequate. None have been defined at the time of writing.
However, a relevant standard is the ASTM F2910-14 [16].
OSO #05 - UAS is Designed Considering System Safety
and Reliability: Investigates whether "The equipment, sys-
tems and installations have been designed to minimize haz-
ards to the UAS in the event of a probable malfunction
or failure" [4]. A probable malfunction means that it is
estimated to happen one or more times during the lifetime
of a given subsystem or component. In the case of UAS,
several potential malfunctions exist. While quality materials
and proper manufacturing can increase the durability of
subsystems and components, the risk cannot be eliminated.
A solution to this is to add redundancy to mission critical
subsystems and components. For instance, a hexarotor or
octorotor can typically sustain normal flight or perform a
safe landing if lift from one of the rotors is lost. Another
example is independent battery packs providing power to the
UAV in case one battery pack fails. Redundancy on mission
critical subsystems and components is key to reduction of
UAS hazards [17]. A generic UAS may have redundancy on
the following subsystems and components:
•Batteries, power transmission components and cables.
•Rotors, Electronic Speed Controller, controller data ca-
bles.
•Flight controllers, flight controller software, Iner-
tial Measurement Units (IMUs), Global Naviga-
tion Satellite System (GNSS) receivers and other mis-
sion critical sensors.
•Radio Frequency (RF) communication links, antennas,
frequencies used.
In addition to redundancy, there is also a series of hazard
mitigation systems which can be in place, such as:
•Battery Management System (BMS) [18], which mon-
itors the State of Charge (SoC), State of Health (SoH)
and protects against short-circuits, extreme SoCs, etc.
•Diversity antennas for sustaining the C2/C3 radio link
during flight; the three Cs stand for Command, Control
and Communication.
•A UAV flight mode which does not rely on GNSS
navigation.
•Fail-safe actions such as Return To Home (RTH) [19],
autonomous landing and flight termination.
•An independent Fail-safe System (independent power
source, sensor, computational unit, actuators, radio link
etc.) providing redundant positioning and state monitor-
ing as well as activation of flight termination [20].
OSO #06 - C3 Link Performance is Appropriate for the
Operation: Describes the C3 links. The Command and Control link is
between the remote pilot and the UAV, while Communication
pertains to the local Air Traffic Service (ATS). Most UAS
do not offer the Communication as an integrated part of
their system, as this is relevant only during a BVLOS
flight in controlled airspace, where other means of voice
communication to the ATS are not accepted. This work deals
with C2 links only.
The operator must have a national license for the radio
frequencies and spectrums used. In case of unlicensed fre-
quencies or spectrums the operator must conform to any
requirements imposed by the national RF regulation.
The previously mentioned SAIL II standard scenario de-
scribes a case in which the UAV may momentarily be
operated BVLOS, behind trees or buildings [6]. This puts
a requirement on the UAS C2 link ability to sustain an
acceptable link performance during flight Beyond Radio Line
Of Sight (BRLOS). This needs to be tested in a realistic
environment by the operator.
OSO #10 and #12 - Safe Design: These OSOs have
been grouped and concern the risk of fatality due to a
crash when operating over populous areas or gatherings of
people. Therefore, these OSOs are closely related to OSO
#05, as the UAS ability to handle failures correlates with
the fatality risk. A way of mitigating the risks posed by
lack of redundancy is the aforementioned fail-safe system.
A design and installation appraisal has to be in place,
describing and concluding on UAS independence, separation
and redundancy. Depending on the SAIL, analyses and tests
have to be conducted as well.
OSO #18 - Automatic Protection of the Flight Envelope
from Human Errors: Describes the aerial performance of
the UAS and the UAV's flight envelope. The flight
envelope of an aircraft is the set of configurations in which it
maintains stable flight. For a typical multirotor, exiting the
flight envelope typically means rolling over. For a fixed-wing,
a stall could cause a flight envelope exit. The listed human
errors are control input errors. Therefore, the UAS must have
built-in limitations on all control inputs, such that the remote
pilot cannot:
•Cause a rollover by pitching or rolling the control sticks.
•Cause instability by applying too much throttle.
•Make the UAV exit its flight envelope by rapidly de-
creasing the throttle.
•Make the UAV exit its flight envelope by performing
an extreme combination of throttle, roll, pitch and yaw
commands.
•Disarm the UAV mid-air (exception: flight termination
command).
In addition, turning off or losing connection to the Human
Machine Interface (HMI) should not have any adverse effect
on the UAV.
OSO #19 - Safe Recovery from Human Error: Covers
procedures, checklists, crew training and UAS design. The
UAS design requirements posed by this OSO concern the
development of the implemented detection and recovery
systems. It is the opinion of the authors that the OSO should
also include specific system requirements such as:
•UAS self-test detecting sensor calibration errors, miss-
ing or inadequately charged batteries, degraded radio
link performance, etc.
•Detection of temperatures outside UAS specifications,
in case of higher risk operations.
•Detection of winds exceeding UAS specifications, in
case of higher risk operations.
•Propeller attachment designed such that propellers can-
not be attached in a way in which they do not lock in
place.
•Battery compartment designed such that batteries cannot
be fitted in a way in which they do not lock in place.
OSO #20 - A Human Factors Evaluation has been Per-
formed and the HMI Found Appropriate for the Mission:
The HMI should not cause fatigue or confusion for the remote
pilot. In addition, it must display UAS information regarding
the following topics:
•Position, orientation, altitude, speed.
•GNSS signal quality, C2 signal quality, video signal
quality.
•Battery state of charge or battery remaining.
Also, the HMI should warn the remote pilot of:
•Poor or lost GNSS position fix.
•Poor or lost C2 signal.
•Poor or lost video signal (if applicable).
•Low or critically low battery.
OSO #24 - UAS Designed and Qualified for Adverse
Environmental Conditions: Many UAS today are not built for adverse
weather conditions. According to manufacturer specifica-
tions, many are unable to operate in any type of precipitation
and are limited to winds up to about 10 m/s. If the operation
is defined only to take place under these conditions, the OSO
is considered to be adhered to.
IV. SORA ASSESSMENT TOOL
A. Overview
The assessment tool takes the shape of a web-based
questionnaire. The user begins by entering general info and
selecting a SAIL. In this work, only SAIL II, III and IV have
been implemented. The user is then presented with a set of
questions derived from the integrity and assurance demands
posed by the SAIL. Figure 1 shows the welcome page and
figure 2 and 3 show examples of the question layout. The
user has the answering options ’Yes’, ’Don’t know’ and ’No’
for each question. When answering, the user is prompted
to specify where they obtained the information leading to
the answer. Examples are 'UAS manual', 'other web page'
or 'UAS inspection'.
B. Design and Implementation
The assessment tool is web-based and has been written
primarily in PHP 7.2.15 [21], with HTML5 [22] and Bootstrap
for the front-end design. A question generator script displays
questions, and the OSOs and their elaborated questions are
retrieved from a database in which they are stored. Upon
Fig. 1: On the welcome page, the user is asked to fill out
name, e-mail, UAS and the desired SAIL to reach.
Fig. 2: Each question can be answered with three choices.
Fig. 3: Each answer has a set of knowledge source choices
for the user to choose from.
completion of the questionnaire, the evaluation report is
automatically generated by a PHP function and e-mailed to
the user. At the same time, the user information, along with
their answers, is stored in a database for subsequent analysis.
The purpose is to acquire sufficient data enabling a thorough
analysis of the most commonly used UAS. The personal user
info, name and e-mail address, is encrypted, and the user is
asked to consent to the data storage in order to comply
with the EU General Data Protection Regulation (GDPR)
[23]. A simple system diagram is shown in figure 4.
Fig. 4: This figure shows a simplified overview of the
assessment tool.
As it is anticipated that more questions will emerge when
evaluating the assessments performed using the tool, it is
essential to design the software with modifiability in mind.
The OSO data and their corresponding questions are for this
reason stored in a JSON [24], [25] database. Each OSO makes
up an object, as shown in listing 1.
"OSO10": {
    "title": "UAS Safe Design",
    "next_step": [
        "OSO20",
        "OSO18",
        "OSO18"
    ],
    "sail_reqs": [
        "low",
        "medium",
        "medium"
    ],
    "questions_low": [
        "Will any kind of probable failure, such as...",
        "Has a design and installation appraisal been made...",
        "Is the kinetic energy of the UAS in free fall..."
    ],
    "questions_medium": [
        "Is the software and hardware developed to adequate...",
        "Has the identification of single failures of...",
        "Are pre-flight checks performed in order to mitigate..."
    ],
    "questions_high": [
        "Calculation of the probability of operation outside of...",
        "Are software and hardware elements whose...",
        "Has a competent third party verified safety analyses..."
    ]
}
Listing 1: The structure of an OSO object in the JSON
database: the OSO title, a next step array that indicates
which OSO to move to after this one, based on the
SAIL from II to IV. As shown, OSO #18 and #19 are
skipped if SAIL II is chosen, as they are optional for this
SAIL. Then follows the SAIL requirement array, which
indicates the required levels of robustness for SAIL II
to IV. Finally, the OSO object contains three arrays
with questions, one for each level of robustness. If, for
example, a medium level of robustness is required, both
the low and medium questions are posted. The depicted
OSO’s questions have been abbreviated.
With the chosen data structure, modifying existing OSOs
is simple, and insertion of a new OSO only requires adding
a new object and updating the next step array in the previous
OSO to include the new one.
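The traversal described in the caption of listing 1 can be sketched as follows in Python (the tool itself is written in PHP; this is a paraphrase under the listing's structure, and the function and variable names are ours):

```python
# Python paraphrase of the questionnaire traversal over the JSON structure in
# Listing 1: for each OSO, the questions up to the required level of
# robustness are posted, and the next_step entry for the chosen SAIL selects
# the following OSO. Names are illustrative, not the tool's.
SAILS = ("II", "III", "IV")
LEVELS = ("low", "medium", "high")

def collect_questions(db: dict, first_oso: str, sail: str) -> list:
    """Gather the questions to ask for a given SAIL, following next_step."""
    idx = SAILS.index(sail)
    questions, oso_id = [], first_oso
    while oso_id in db:
        oso = db[oso_id]
        required = oso["sail_reqs"][idx]
        # a medium requirement posts both the low and the medium questions
        for level in LEVELS[: LEVELS.index(required) + 1]:
            questions.extend(oso["questions_" + level])
        nxt = oso["next_step"][idx]
        if nxt == oso_id:  # guard against a self-referencing entry
            break
        oso_id = nxt
    return questions
```

The loop ends once a next_step entry names an identifier not present in the database, which is how a terminal OSO could be encoded under this sketch.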
C. Example result
Upon completing the questionnaire, the user is presented
with an evaluation of the UAS compliance to the selected
SAIL. An example of an evaluation report can be seen in
figure 5.
Fig. 5: This figure shows an example evaluation for a SAIL
II. Notice how the UAS does not comply with this SAIL,
which is explained in the evaluation, along with explanatory
text for the areas of issue.
[Table III grid: rows OSO #02, #03, #04, #05, #06, #10 and
#12, #18, #19, #20 and #24; columns a-d under each of SAIL
II, III and IV, with results indicated by cell colour.]
TABLE III: The compressed evaluation results for the four
UAS, which are: a) DJI Phantom 4 Pro, b) DJI Inspire 2,
c) DJI Matrice 200, d) DJI Matrice 600 Pro. Green signifies
passed, red signifies failed and purple signifies that testing
is required in order to answer.
The SORA tool will be made publicly available at the time
of publication of this paper.
V. RESULTS
The developed tool has been used to assess four different
DJI UAS models. This manufacturer was chosen because DJI
had the largest global market share, 74% in 2017, according
to a market sector report by Skylogic Research [26]. The
chosen UAS are the Phantom 4 Pro, the Inspire 2, the Matrice
200 and the Matrice 600 Pro. The chosen UAS are known to
be in operation by emergency services [27]. This evaluation
serves the dual purpose of evaluating both the tool and the four
UAS. Please note that the tool is in an immature state and
that the evaluation of the selected UAS was done by a single
person. Therefore, the results should not be seen as anything
more than preliminary. Table III shows an overview of the
evaluation results.
The developed tool is available online1. The software and
devised questions have been published on GitLab2, under a
BSD 3-Clause license, for others to build upon.
VI. DISCUSSION
From evaluating the UAS based on the developed tool,
it has become apparent that the most significant issue is
to figure out which standards are required and whether
those standards have been adhered to while developing,
for example, the redundancy, fault detection and HMI. To
ease the process for UAS manufacturers to adhere to these
requirements, it would be beneficial if JARUS added specific
standard requirements to their OSOs. Some of the questions
formulated in the tool need improvement, as some contain
multiple questions in one or allow numerous interpretations. A
more thorough evaluation will reveal whether some subjects
have been left untouched by the questions. Also, further
testing is needed in order to ensure that the automatically
generated evaluation report always correctly weighs, for
instance, the combination of redundancies when generating
the result.
1 http://uaswork.org/sdu/sora/index.php
2 https://gitlab.com/uaswork/sora/sora_uas_compliance_assessment_tool
As seen from table III, none of the evaluated UAS comply
with any of the SAILs. Once again, these results are to be
taken with a grain of salt, as the tool is at an early stage.
The Matrice 600 Pro will comply with SAIL II once a design
and installation appraisal becomes available, describing the
UAS independence, separation and redundancy, in addition
to any relevant particular risk associated with the operation,
such as precipitation or electromagnetic interference. The
three quadrotors (a, b and c) fail the OSO #10 and #12
requirements, as their lack of rotor redundancy makes it
possible for a single failure to cause a crash, which can lead
to a fatality if a person is hit. None of the UAS are equipped
with fail-safe systems with emergency parachutes, which
could mitigate this issue. OSO #06 and OSO #20 require
testing of the C2 link performance and HMI respectively,
which at the time of writing has not been performed. The
red boxes for OSO #10 and #12 at SAIL III, in addition to
most of the ones in the SAIL IV column, are due to lacking
information on whether the UAS and its subsystems have
been designed according to relevant standards.
While elaborating the OSOs, some improvement sugges-
tions have arisen: OSO #03, which describes maintenance,
specifies that the maintenance should include the manufac-
turer’s guidelines for the model when applicable. However,
as UAS models vary in design, the authors recommend
making it a prerequisite that a model-specific maintenance
manual is published by the manufacturer. OSO #10 and #12
put forth requirements for the handling of probable failures.
However, by only treating probable failures, a primary focus
is put on the hardware. Another risk faced by UAS is the
introduction of errors from software upgrades. Therefore, it
is recommended to add requirements for the testing of new
versions of software for UAS components or subsystems,
before being introduced to UAS operations with an elevated
risk.
VII. CONCLUSION AND FUTURE WORK
In this work, the SORA has been analyzed. From this
analysis, a prototype assessment tool based on the SORA
defined technology requirements has been developed. The
tool is a web-based questionnaire that, based on the SORA
OSO requirements, presents the user with questions that can be
answered by 'Yes', 'No' or 'Don't know'. Upon completion
of the questionnaire, the user receives an automatically
generated evaluation report.
A preliminary evaluation of the tool has been performed
based on four UAS. The prototype tool indicates that none of
the selected UAS are SAIL II, III or IV compliant. However,
the authors must again stress that these results
are preliminary and based on an unfinished tool. For SAIL
II, fitting the UAS with emergency parachutes, performing
a design and installation appraisal and testing the C2 link
and HMI will result in compliance for one of the tested
UAS. However, achieving SAIL III and IV requires the
manufacturer to document adherence to relevant standards,
which is not the case at present.
Future work is to improve the quality of the tool by
inviting UAS domain experts to use and evaluate the tool.
The questions currently implemented have been developed
based on multirotors. In order to include fixed-wing UAS,
questions about subjects such as redundancy will have to
be modified, and new fixed-wing specific questions should
be added. The tool could potentially be expanded to cover
not only the technological aspects, but also the operational
aspects of the SORA.
ACKNOWLEDGEMENTS
This work is part of the projects Free the Drones (FreeD)
and Healthcare Cargo and Person Transportation Drones
(HealthDrone), co-funded by the Innovation Fund Denmark.
The authors would like to thank Jussi Hermansen, UAS
Specialist, and Erling Hansen, Manager, Test & Airworthiness,
for their contributions to testing the tool, Martin
Peter Christiansen, Postdoc, for his advice and proofreading,
and Frederik Mazur Andersen, Student Programmer, for
assistance with web development.
REFERENCES
[1] J. Xia, K. Wang, and S. Wang, “Drone scheduling to monitor vessels
in emission control areas,” Transportation Research Part B: Method-
ological, vol. 119, pp. 174–196, 2019.
[2] T. K. Amukele, J. Street, K. Carroll, H. Miller, and S. X. Zhang,
“Drone transport of microbes in blood and sputum laboratory spec-
imens,” Journal of Clinical Microbiology, vol. 54, pp. 2622–2625,
2016.
[3] European Parliament. (2018, Jun.) Drones: new rules for safer skies
across Europe. [Online]. Available: http://www.europarl.europa.eu/news/en/headlines/economy/20180601STO04820/drones-new-rules-for-safer-skies-across-europe
[4] JARUS guidelines on Specific Operations Risk Assessment (SORA)
V2.0, JARUS Std. 2.0, 2019.
[5] JARUS guidelines on Specific Operations Risk Assessment (SORA),
JARUS Std. 1.0, 2017.
[6] Danish Transport, Construction and Housing Authority.
(2019, Apr.) Droner i redningsberedskabet. [Online]. Avail-
able: https://www.trafikstyrelsen.dk/DA/Luftfart/Flyveoperationer/
Luftfartserhverv/Droner-i-redningsberedskabet.aspx
[7] Daimler. (2017, Sep.) Vans & drones in Zurich: Mercedes-Benz
Vans, Matternet and siroop start pilot project for on-demand delivery
of e-commerce goods. [Online]. Available: https://media.daimler.com/marsMediaSite/en/instance/ko/Vans--Drones-in-Zurich-Mercedes-Benz-Vans-Matternet-and-siroop-start-pilot-project-for-on-demand-delivery-of-e-commerce-goods.xhtml?oid=29659139
[8] Bundesverband Copter Piloten. SORA-GER - Rechner. [Online]. Available:
https://bvcp.de/sora-rechner/
[9] Copteruni. SORA-Tool. [Online]. Available: http://copteruni.de/sora
[10] Flynex. SORA GER. [Online]. Available: https://sora.flynex.de/
[11] Federal Aviation Administration. Part 107 waivers. [Online]. Available:
https://www.faa.gov/uas/commercial_operators/part_107_waivers/
[12] A. La Cour-Harbo, “The value of step-by-step risk assessment for
unmanned aircraft,” in International Conference on Unmanned Aircraft
Systems (ICUAS), Dallas, TX, Jun. 2018, pp. 149–157.
[13] E. Denney and G. Pai, “Tool support for assurance case development,”
Automated Software Engineering, vol. 25, no. 3, pp. 435–499, 2018.
[14] A. Ruiz, X. Larrucea, and H. Espinoza, “A tool suite for assurance
cases and evidences: Avionics experiences,” in Computer Safety,
Reliability, and Security, vol. 543, Sep. 2015, pp. 63–71.
[15] J. Górski, A. Jarzębowicz, J. Miler, M. Witkowicz, J. Czyżnikiewicz,
and P. Jar, “Supporting assurance by evidence-based argument ser-
vices,” in Computer Safety, Reliability, and Security, vol. 7613, Berlin,
Heidelberg, 2012, pp. 417–426.
[16] F2910-14 Standard Specification for Design and Construction of a
Small Unmanned Aircraft System (sUAS), ASTM Std., 2014.
[17] K. P. Arnold, “The uav ground control station: Types, components,
safety, redundancy, and future applications,” International Journal of
Unmanned Systems Engineering, vol. 4, no. 1, pp. 37–50, 2016.
[18] K. Chang, P. Rammos, S. A. Wilkerson, M. Bundy, and A. Gadsden,
“Lipo battery energy studies for improved flight performance of
unmanned aerial systems,” in Unmanned Systems Technology XVIII,
vol. 9837, Baltimore, Maryland, United States, May 2016.
[19] J. L. Drury, L. Riek, and N. Rackliffe, “A decomposition of uav-related
situation awareness,” in Proceedings of the 1st ACM SIGCHI/SIGART
conference on Human-robot interaction, 2006, pp. 88–94.
[20] V. K. Tofterup and K. Jensen, “A methodology for evaluating com-
mercial off the shelf parachutes designed for suas failsafe systems,”
in International Conference on Unmanned Aircraft Systems (ICUAS),
Atlanta, GA, Jun. 2019.
[21] A. V. Royappa, “The php web application server,” Journal of Com-
puting Sciences in Colleges, vol. 15, no. 3, pp. 201–211, 2000.
[22] S. J. Vaughan-Nichols, “Will html 5 restandardize the web?” Com-
puter, vol. 43, no. 4, pp. 13–15, 2010.
[23] EU 2016/679 (General Data Protection Regulation), European Union
Std., 2016.
[24] D. Crockford, “The application/json media type for javascript object
notation (json),” The Internet Society, Reston, Virginia, United States,
Tech. Rep., Jul. 2006.
[25] T. Bray, “The javascript object notation (json) data interchange for-
mat,” Internet Engineering Task Force, Fremont, California, United
States, Tech. Rep., Dec. 2017.
[26] Skylogic Research, “2017 drone market sector report prospectus,”
Tech. Rep., Sep. 2017.
[27] EENA, DJI, “The use of remotely piloted aircraft systems (rpas) by
the emergency services,” Tech. Rep., Nov. 2016.