Multimodal Data Collection System for UAV-based
Precision Agriculture Applications
Emmanuel K. Raptis, Georgios D. Karatzinis, Marios Krestenitis, Athanasios Ch. Kapoutsis, Konstantinos
Ioannidis, Stefanos Vrochidis, Ioannis Kompatsiaris and Elias B. Kosmatopoulos
Abstract—Unmanned Aerial Vehicles (UAVs) are an emerging technology with the potential to be gradually adopted across various sectors, enabling a wide range of applications. In agricultural tasks, UAV-based solutions are supplanting labor- and time-intensive traditional crop management practices. In this direction, this work proposes an automated framework for efficient data collection in crops, employing two autonomous path planning operational modes. The first mode produces an optimal and collision-free route for scanning the area under examination. The data collected from this oversight perspective are used for orthomosaic creation, and vegetation indices are subsequently extracted to assess the health levels of crops. The second operational mode serves as an inspection extension for further on-site collection of enriched information, performing fixed-radius circles around central points of interest. A real-world weed detection application is carried out to verify the information acquired with both operational modes. Weed detection performance is evaluated using a well-known Convolutional Neural Network (CNN), named Feature Pyramid Network (FPN), providing sufficient results in terms of Intersection over Union (IoU).
Index Terms—Path planning, Unmanned Aerial Vehicles (UAVs), Data gathering, Plant Inspection, Weed Detection, Precision Agriculture
I. INTRODUCTION
Remote sensing holds a key role in Precision Agriculture (PA) and Smart Farming (SF), serving as the core tool for monitoring cultivated fields [1]. The UAVs' inherent attributes of agility [2], adaptability [3], non-destructive data acquisition [4], their Internet of Things (IoT)-enabling nature [5], as well as the ultra-high spatial resolution of the acquired images [6], make them the preferred solution over their competitors. Satellite imagery provides data for large areas but at low spatial resolution, and manned aircraft can cover large areas quickly but are an expensive alternative. Ground systems
This research has been financed by the European Regional Development
Fund of the European Union and Greek national funds through the Opera-
tional Program Competitiveness, Entrepreneurship and Innovation, under the
call RESEARCH - CREATE - INNOVATE (T1EDK-00636) and from the
European Commission under the European Union’s Horizon 2020 research
and innovation programme under grant agreement no 101021851 (NESTOR).
(Corresponding author: Emmanuel K. Raptis)
Emmanuel K. Raptis, Georgios D. Karatzinis and Elias B. Kosmatopoulos
are with Department of Electrical and Computer Engineering, Democritus
University of Thrace, Xanthi, Greece and Information Technologies Institute,
The Centre for Research & Technology, Hellas, Thessaloniki, Greece (erap-
tis@ee.duth.gr, gkaratzi@iti.gr and kosmatop@iti.gr)
Marios Krestenitis, Athanasios Ch. Kapoutsis, Konstantinos Ioannidis, Ste-
fanos Vrochidis and Ioannis Kompatsiaris are with Information Technologies
Institute, The Centre for Research & Technology, Hellas, Thessaloniki, Greece
(mikrestenitis@iti.gr, athakapo@iti.gr, kioannid@iti.gr, stefanos@iti.gr and
ikom@iti.gr)
Fig. 1: Holistic Path Planning Scheme for Precision Agricul-
ture Applications.
are best suited for close-up inspections, but operate in a destructive manner and cannot cover large areas. Thus, UAVs can be viewed as occupying the sweet spot between spatial resolution and data acquisition simplicity, creating an ever-increasing demand for UAV-related applications in the PA domain. Different sensors can be mounted on UAVs, resulting in various types of sensory data that mainly include [7]: a) RGB imaging (Red, Green, and Blue wavebands); b) multispectral imaging (several wavebands in the visible and near-infrared range); c) hyperspectral imaging (hundreds to thousands of wavebands in the visible and near-infrared regions); d) thermal infrared imaging; and e) Light Detection and Ranging (LiDAR) sensory data.
UAV technologies have been applied in a wide range of PA
applications such as weed detection [8], semantic segmentation
[9], [10], classification procedures [11], vegetation health
monitoring [12], diseases detection [13], plant stress detection
[14], crops spraying [15] and yield estimation [16]. Extensive
taxonomy studies have been presented in the literature for
UAV-based applications in PA regarding the type of UAVs,
the equipped sensors, health indexes, and evaluation tasks
[14], [17], [18]. Weed mapping is one of the most popular
applications in PA aiming to identify non-desirable plants that
cause losses to crop yields during their growth, as well as
at harvesting. Therefore, accurate weed mapping leads to reduced herbicide usage and also prevents the evolution of herbicide-resistant weeds.
More often than not, different types of UAV missions are
needed to cover specific PA needs [19]. The most common
type of UAV mission includes a uniform mapping over an
area of interest. Although several research works provide this
kind of capability, they do not offer the capability of handling
obstacles and no-fly-zones [20], [21]. Additionally, several
close-up inspections are usually needed to verify and examine
detections and intuitions with higher granularity. Usually,
these inspections are needed at different stages of growth to
constantly assess problematic areas and are executed by either
non-UAV (employing specialized farmers and agronomists to
be on the field) or manually controlled UAV missions.
Although there are UAV-based solutions that provide efficient approaches to drainage pipe mapping [22], irrigation management [23] and yield estimation [16], they do not foresee path planning development for both mapping and intensive targeted data collection actions. Individual practicality features should also be included towards unlocking the utilization of existing UAVs and enhancing adoptability by farmers and agronomists. It is of great importance to minimize human intervention during flight missions while, at the same time, maximizing human cooperation capabilities, serving as an integral tool for a wide range of PA applications.
To tackle the aforementioned challenges, a holistic two-
mode path planning solution is proposed that is capable of
providing an automated way for data acquisition in agriculture
applications. At a glance, the contributions of the developed
system, which is outlined in Figure 1, are:
• A low-cost, yet automated, UAV-based solution for data acquisition in PA applications, with two operational modes: a) a path planning algorithm responsible for producing the optimal route for complete coverage of a given area of interest while, at the same time, respecting no-fly zones; subsequently, the image data are processed by a ground control station and health indices can be extracted to assess the biomass levels of crops; b) a circular-like inspection around points of interest in the field, performed at lower altitudes to collect high-quality images providing enriched information.
• A custom Android application that enables the intercommunication between the UAV and the ground control station, allowing direct incorporation of existing, commercial UAVs.
• A real-world weed detection application performed on a wheat and a cotton field. The data were acquired using the proposed PA path planning approach in both operational modes. The quality of the gathered data is verified through evaluation with a well-known Convolutional Neural Network (CNN), more specifically the Feature Pyramid Network (FPN), providing sufficient results in terms of Intersection over Union (IoU).
The material of this paper is organized as follows: Sections II and III present the UAV-based data collection system, describing the path planning operational modes and the developed user-friendly interface, respectively. In Section IV, the automated UAV-based data collection framework is verified, and a real-world weed detection problem is presented. Finally, Section V summarizes the main innovations and functionalities of the proposed system.
II. UAV-BASED DATA COLLECTION SYSTEM
To cover a continuous area, it is essential to deploy a method
capable of providing safe and efficient paths, while ensuring
high-percentage coverage of the agricultural field. To deal with
robot-related challenges faced in real-world scenarios, one
should follow an end-to-end Coverage Path Planning (CPP)
technique. CPP is the determination of a collision-free path
that the robot must follow to pass through all the points
of an area or Region Of Interest (ROI). A comprehensive review of CPP methods, including their advances in many field applications, is presented in [24].
A. STC-based Coverage Path Planning PA Module
In this work, as the CPP method, we utilized the Spanning Tree Coverage (STC) algorithm [25], [26]. This is an off-line CPP algorithm, i.e., all related information about the environment is known in advance. The main steps of the aforementioned methodology are outlined in Figure 2.
For ease of understanding, it is assumed that the agricultural field is constrained within a rectangle in Cartesian coordinates (x, y). As a first step, the proposed algorithm discretizes the area of interest into a set of equal cells of size D based on the user-defined information1; the finite number of cells represents the level of spatial resolution and the sensing capabilities of the robot used. The discretized cells are then grouped into large square-shaped cells of size 2D, each of which is either entirely blocked (black) or entirely unblocked (white), as shown in Figure 2(a). It must be noted that the algorithm's only requirement is that stationary obstacle areas within the grid cannot be smaller than a 2D cell. As a next step, as shown in Figure 2(b), every unoccupied 2D cell is translated into a node and edges are introduced following the Von Neumann neighborhood, resulting in an undirected graph. For the resulting graph, a minimum spanning tree is constructed by applying any Minimum Spanning Tree (MST) methodology (e.g. Kruskal or Prim [27]), as illustrated in Figure 2(c)-(d). A path is then created by circumnavigating this tree (Figure 2(e)) in a clockwise direction, starting from a cell of size D and traversing every D cell which lies on the MST, thus producing an optimal covering path, as shown in Figure 2(f). This navigation route (Figure 2(f)) has the advantage of avoiding unnecessary movements of the robot that do not contribute to the process of scanning the area. A mission starts and ends in the same place, with each cell being covered only once while avoiding any obstacle/no-fly-zone possibly defined within the operational area. In this way, the created paths are very suitable for energy-efficient applications, taking into account energy constraints such as the battery consumption of the UAV.
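The grid-to-graph and MST steps described above can be sketched in a few lines of Python. This is a minimal illustration under assumed data structures (a boolean occupancy grid of 2D cells, unit edge weights, Prim's algorithm), not the authors' implementation:

```python
import heapq

def grid_to_graph(occupancy):
    """Map every free 2D cell to a node; connect Von Neumann neighbours."""
    free = {(r, c) for r, row in enumerate(occupancy)
            for c, blocked in enumerate(row) if not blocked}
    graph = {}
    for (r, c) in free:
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        graph[(r, c)] = [n for n in nbrs if n in free]
    return graph

def prim_mst(graph):
    """Return the MST edges (unit weights) of a connected undirected graph."""
    start = next(iter(graph))
    visited, tree = {start}, []
    frontier = [(1, start, n) for n in graph[start]]
    heapq.heapify(frontier)
    while frontier:
        w, u, v = heapq.heappop(frontier)
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v))
        for n in graph[v]:
            if n not in visited:
                heapq.heappush(frontier, (1, v, n))
    return tree

# 3x3 field of 2D cells with one blocked (no-fly) cell in the centre.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
mst = prim_mst(grid_to_graph(grid))
print(len(mst))  # a spanning tree over the 8 free cells has 7 edges
```

Circumnavigating the returned tree edges clockwise then yields the covering path of Figure 2(e)-(f).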
Having illustrated all the main features that establish a safe
and efficient path, we proceed to present an “adaptive” navi-
gation algorithm including path planning switching features
1obstacles (x, y, z), image overlap (%) and flight altitude (h)
(a) Grid representation of the field (b) Cell to Node conversion
(c) Minimum Spanning Tree (d) Minimum Spanning Tree as nav-
igation guide
(e) Circumnavigate MST (f) Final coverage path
Fig. 2: Main steps of the Spanning Tree Coverage algorithm.
for monitoring agricultural resources tailored to field char-
acteristics. Algorithm 1 outlines in pseudocode the proposed
navigation algorithm.
In line 1, the user-selected area of the field, hereafter referred to as Polygon, is transformed from the spatial reference system WGS84 to a North-East-Down (NED) frame, where the coordinates can be defined as a Cartesian plane shape (x, y). This transformation is needed because Polygon is initially defined by the user as a set of latitude/longitude pairs, while the Overlap between two corresponding images in adjacent flight path lines is translated into meters by introducing a Scanning distance parameter (line 2), as sketched in Figure 3. Therefore, the NED projection is employed to represent both input parameters in a common metric system (meters).
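The two conversions of lines 1-2 can be sketched as follows; the flat-earth equirectangular approximation and the camera field-of-view value are illustrative assumptions, not the exact formulas used by the system:

```python
import math

EARTH_RADIUS = 6378137.0  # WGS84 equatorial radius in meters

def wgs84_to_ned(lat, lon, lat0, lon0):
    """Flat-earth approximation of the WGS84 -> local NED projection,
    valid for the small extents of an agricultural field."""
    north = math.radians(lat - lat0) * EARTH_RADIUS
    east = math.radians(lon - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    return north, east

def scanning_distance(altitude, overlap, hfov_deg=73.0):
    """Spacing (m) between adjacent flight lines preserving the side overlap;
    the horizontal field of view (hfov_deg) is an assumed camera parameter."""
    footprint = 2.0 * altitude * math.tan(math.radians(hfov_deg / 2.0))
    return footprint * (1.0 - overlap)

# Two nearby points of a hypothetical field polygon.
n, e = wgs84_to_ned(41.1410, 24.8880, 41.1400, 24.8870)
d = scanning_distance(altitude=30.0, overlap=0.8)
print(round(n, 1), round(e, 1), round(d, 1))  # ~111.3 m north, ~83.9 m east, ~8.9 m spacing
```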
Having transformed the input coordinates Polygon to a Cartesian plane shape, in lines 3-5 the entire field is grouped into large 2D cells and, in lines 6-8, each cell is represented as a node to which edges are assigned. These nodes and edges (V, E) form the graph G, which defines the allowed movements of the UAV within the field. In line 9, a graph minimization methodology is applied to the previously formed graph G
Algorithm 1 STC-based Coverage Path Planning method
Require: Polygon, Obstacles, Overlap, Altitude
Ensure: A sequence of waypoints
1: WGS84 to NED ▷ project to NED
2: {Overlap, Altitude} −→ Scanning distance ▷ [26]
3: for row, col to x, y do
4:   D −→ 2D cells ∈ R² ▷ No. of cells based on Overlap
5: end for
6: for each node ∈ grid do
7:   G = (V, E) ▷ add edge between nodes
8: end for
9: Apply MST algorithm to G ▷ MST ∈ G
10: Traverse the MST through D cells; halt when the starting cell is encountered again.
11: if flight direction is True then ▷ graph rotation
12:   rotate G by ϑ
13:   go back to line 9
14: end if
15: NED to WGS84 ▷ project to WGS84
16: return Coverage path
Fig. 3: Overlap between two sequential images in adjacent
parallel flight path lines for one UAV in two different time-
steps upon the extracted coverage path.
and the circumnavigation phase takes place. Hence, to provide an efficient path tailored to the field, G must have the most favorable, in terms of coverage, representation within the grid. To this end, in lines 11-14 a rotation technique is introduced. The key idea is the rotation of each node within the grid by an angle ϑ in a counter-clockwise direction around its center point. By modifying the flight direction, more favorable configurations of G may appear inside the grid, where the number of nodes is greater. Such configurations are more likely to provide paths that achieve a higher percentage of coverage for the given field, as illustrated in Figure 4.
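The rotation step amounts to a standard 2D rotation of each node around a fixed center; a minimal sketch (function name and test values are illustrative):

```python
import math

def rotate_points(points, theta_deg, center=(0.0, 0.0)):
    """Rotate (x, y) points counter-clockwise by theta_deg around a center."""
    th = math.radians(theta_deg)
    cx, cy = center
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        out.append((cx + dx * math.cos(th) - dy * math.sin(th),
                    cy + dx * math.sin(th) + dy * math.cos(th)))
    return out

# Rotating (1, 0) by 90 degrees CCW around the origin gives (0, 1).
pts = rotate_points([(1.0, 0.0)], 90.0)
print([(round(x, 6), round(y, 6)) for x, y in pts])  # [(0.0, 1.0)]
```

In practice, candidate angles ϑ would be scored by the number of free 2D nodes the rotated grid yields, keeping the most favorable configuration.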
Fig. 4: Flight path direction regulated by the angle ϑ.
Once a path within the grid has been determined, in line 15 the extracted points are projected back to the WGS84 system to generate a sequence of waypoints to be used in real-world scenarios. Finally, in line 16, the algorithm returns a simple closed path (Lat1, Lon1), (Lat2, Lon2), . . . , (Latm, Lonm), where m denotes the number of waypoints.
B. Targeted High-Detail Data Acquisition Module
Based on the resulting CPP flight, a number of aerial images are collected and processed. After the post-processing analysis, it is usual to identify specific spots that require a more detailed “on-site” inspection. This task can also be automated by using UAVs with the appropriate path planning mechanism.
From the motion planning perspective, planning the route for accessing a finite set of points reduces to the problem of finding the path with the minimum realization cost. For the visual “on-site” inspection, we applied the Travelling Salesman Problem (TSP) algorithm [28], a renowned method for finding the shortest path within an undirected graph. More specifically, this algorithm treats the problem as an unoriented stationary graph, with the finite points being the vertices of the graph and the edges being the paths that the UAV can follow during its mission, as illustrated in Figure 5(a). The problem is to start and end at a specific vertex after having visited all the other vertices of the graph exactly once. After calculating the optimal path, the points (vertices of the graph) are “transformed” into centers of circular shapes, and a set of m additional points on the perimeter of each circle is generated. These additional points are created in 3D space given a desired altitude h, an angle ϕ ∈ [0°, 360°] and a constant radius r. This way, along with the path formed by the TSP, the UAV repeatedly performs fixed-radius circles around all central points of interest, as depicted in Figure 5(b). Apparently, this operation provides enriched “on-site”/local information, as images are acquired from lower altitude levels (e.g., 10 meters) and different angles. Moreover, during the execution of the mission, the user can also use the physical remote control to modify the speed and altitude of the aircraft.
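Given the visiting order produced by the TSP, generating the m perimeter waypoints of one inspection circle can be sketched as follows (a local Cartesian frame and illustrative parameter values are assumed):

```python
import math

def circular_waypoints(center_xy, radius, altitude, m=12):
    """m waypoints on a circle of the given radius around a point of
    interest, all at a fixed inspection altitude (3D: x, y, z)."""
    cx, cy = center_xy
    return [(cx + radius * math.cos(2 * math.pi * k / m),
             cy + radius * math.sin(2 * math.pi * k / m),
             altitude) for k in range(m)]

# 8 waypoints on a 5 m circle, flown at a 10 m inspection altitude.
wps = circular_waypoints((0.0, 0.0), radius=5.0, altitude=10.0, m=8)
print(len(wps), wps[0])  # 8 (5.0, 0.0, 10.0)
```

Pointing the camera at the circle center from each waypoint then yields views of the same plant from m different angles.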
(a) Travelling salesman problem (b) Circular flight
Fig. 5: Visual “on-site” inspection around a point of interest
One must note that, for both path planning modes, a number of parameters (namely Overlap, Altitude, Flight Direction, Speed) should be tuned accordingly in order to acquire high-resolution images.
III. AN ANDROID-BASED MIDDLEWARE APPLICATION FOR UAVS
To provide a data collection solution able to perform coverage and inspection missions in real-life use, a UAV-based middleware application is needed to enable message interactions and real-time communication. In this direction, a custom and user-friendly interface based on the UAV's handling capabilities has been developed using the DJI API, offering a simplified form of on-site commands, as illustrated in Figure 6. Specifically, the developed Android application is used as a data transceiver between the automatic navigation algorithms and the flight control system, allowing the UAV to follow the waypoints to cover or inspect an area and collect data. Note that the aforementioned Android application is configured to provide i) autonomous missions and ii) dynamic display of flight data, and is capable of integrating different types of commercial UAVs. In addition, it also acts as an on-the-field pilot, allowing the users to monitor the missions live and enabling them to take control and handle the UAV instantly, if needed.
Fig. 6: Custom-developed android application
Based on this application, during the mission the UAV automatically stores images or video footage of the covered area on an embedded micro-SD card. Note that the sampling frequency of the captured images and the UAV's speed should be adjusted to meet the appropriate requirements of overlap and quality (noise, blurring effects). This custom Android UAV-based application is open-source and publicly available2 to the research community.
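The coupling between sampling frequency, speed and forward overlap mentioned above can be made concrete with a back-of-the-envelope bound; the vertical field of view is an assumed camera parameter, not a system specification:

```python
import math

def max_speed_for_overlap(altitude, forward_overlap, capture_period, vfov_deg=53.0):
    """Upper bound on UAV ground speed (m/s) so that consecutive frames keep
    the requested forward overlap, given the capture period (s) between
    shots; vfov_deg is an assumed vertical field of view."""
    footprint = 2.0 * altitude * math.tan(math.radians(vfov_deg / 2.0))
    return footprint * (1.0 - forward_overlap) / capture_period

# At 30 m altitude, 80% forward overlap and one shot every 2 s.
v = max_speed_for_overlap(altitude=30.0, forward_overlap=0.8, capture_period=2.0)
print(round(v, 2))  # just under 3 m/s under these assumptions
```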
IV. DEMONSTRATIVE EXPERIMENTAL RESULTS
This section presents two real experiments in which the proposed automated UAV-based data collection framework was used to acquire qualitative and quantitative data. The objectives of the experiments were to thoroughly test the proposed autonomous navigation system and collect aerial UAV footage for two different types of precision agriculture applications: i) crop-field monitoring, geo-mapping, and assessment, and ii) weed detection based on image processing and computer vision techniques.
A. Hardware components
For both experiments, a commercial UAV (DJI Phantom
4 Pro) equipped with a 1-inch 20-megapixel RGB sensor
was used for the data acquisition. The flight path planning
for crop monitoring and weed detection was done by the
developed algorithms described in subsections II-A and II-B, respectively.
2https://github.com/CoFly-Project/waypointmission
To manage and control the automated
missions, an Android smartphone device (Xiaomi Mi Max 2) was used to run the UAV control application described in Section III. Last but not least, a portable 4G router (TP-Link M7350) was used to load the maps needed to adjust the UAV missions based on the field characteristics. However, it should be noted that the transfer and reception of data across devices was made internet-free via a local network, enabling real-time communication with minimum latency even in locations where the internet is inaccessible.
B. STC-based Coverage Path Planning PA Module: A wheat
field case
This subsection constitutes an experimental demonstration of the path planning algorithm described in II-A, with the aim of providing the crops' health condition from an oversight perspective. More specifically, the area under examination is a wheat field. First, the mission planner designs the shortest possible path, avoiding unnecessary movements of the UAV (II-A); the waypoints are received by the UAV via radio signal and the autonomous data collection mission takes place (III). The low-cost device is equipped with an RGB camera capturing images at a pre-defined sampling frequency, overlap and altitude. The off-line image processing includes orthophoto generation, aiming to enhance the overall visualization and extract useful information for crop monitoring and management. Vegetation Indices (VIs) based on the reflective properties of vegetation in the visible light spectral range were considered, as described in Table I. The overall assessment of plant health for the field under investigation, Figure 7(a), is depicted in Figure 7(b)-(e). For better visualization and immediate monitoring perception, a color scale is applied, with low values presented in red and high values in green. The results show that the selected vegetation indices can capture the crop's state in terms of plant health, enabling effective monitoring for decision-making objectives.
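The indices of Table I are simple per-pixel band ratios and can be computed directly from the orthomosaic; a numpy sketch (the small eps guarding against zero denominators is our addition):

```python
import numpy as np

def vegetation_indices(rgb, eps=1e-6):
    """Per-pixel RGB-based vegetation indices of Table I for an (H, W, 3)
    image; eps avoids division by zero on dark pixels."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    return {
        "VARI": (g - r) / (g + r - b + eps),
        "GLI": (2 * g - r - b) / (2 * g + r + b + eps),
        "NGRDI": (g - r) / (g + r + eps),
        "NGBDI": (g - b) / (g + b + eps),
    }

# A single pure-green pixel scores high (close to 1) on every index.
vis = vegetation_indices(np.array([[[0, 255, 0]]]))
print({k: round(float(v), 3) for k, v in vis.items()})
```

Mapping the resulting arrays through a red-to-green color scale produces visualizations like Figure 7(b)-(e).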
C. Targeted High-Detail Data Acquisition Module: A weed
detection application
In this subsection, an experimental evaluation is conducted of a vision-based weed detection method using a deep learning model that effectively detects weeds in UAV-captured images. In this case, we utilized the agronomist-annotated CoFly-WeedDB dataset [29], which contains a set of 201 RGB images of size 1280 × 720 pixels, depicting different types of weed species among crop plants. Note that the RGB images were acquired by a Phantom 4 Pro UAV while it was performing an inspection mission as described in II-B. More specifically, in this dataset, three types of weeds were identified: 1) Johnson grass; 2) Field bindweed; 3) Purslane. Figure 8 presents indicative examples of the annotated dataset for each weed class. All three weed classes have been unified into a single Weed class, while the remaining area is considered as Background. The objective is to validate the information content of the deployed dataset and its ability to lead to robust weed detection models.
(a) Orthomosaic
(b) VARI (c) GLI
(d) NGRDI (e) NGBDI
Fig. 7: The produced orthomosaic for a wheat crop and the
corresponding vegetation indices.
Fig. 8: Annotated images for each type of identified weed.
The designed validation approach focused on the semantic segmentation task. Specifically, a robust deep Convolutional Neural Network (CNN) for semantic segmentation, with a novel backbone architecture, was employed as a baseline: the Feature Pyramid Network (FPN), a typical model for semantic segmentation that has reported promising results in various cases. FPN is based on the well-known encoder-decoder scheme. In particular, the input image is first processed by the encoding block, which aims at progressively reducing the spatial dimensions while simultaneously increasing the depth dimension of the extracted tensor. This process leads to a compact encoded image representation that nevertheless encloses high-level information. In the decoding stage, the inverted process takes place, where the compact image representation is gradually upsampled while
TABLE I: Vegetation indices description

RGB-based Vegetation Index | Visual Atmospheric Resistance Index | Green Leaf Index | Normalized Green Red Difference Index | Normalized Green Blue Difference Index
Abbreviation | VARI | GLI | NGRDI | NGBDI
Formula | (G − R)/(G + R − B) | (2∗G − R − B)/(2∗G + R + B) | (G − R)/(G + R) | (G − B)/(G + B)
the channel dimension is reduced, to produce a segmented outcome that meets the initial input dimensions. The main aspect of FPN is the employment of skip connections that add the extracted feature maps from the individual encoding levels to the corresponding layers of the decoder and employ a 1 × 1 convolution to further refine the extracted outcome. Regarding the backbone architecture employed in the encoding stage, the EfficientNet family of networks was utilized, which has achieved state-of-the-art results in image classification tasks.
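The decoder step described above, upsampling a coarse feature map and adding the encoder skip connection, can be illustrated shape-wise in numpy (the 1 × 1 channel-matching convolution is omitted for brevity; shapes are illustrative):

```python
import numpy as np

def upsample2x(x):
    """Nearest-neighbour 2x spatial upsampling of a (C, H, W) feature map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

# Decoder step: upsample the deep, low-resolution map and add the
# spatially-matching encoder skip connection.
coarse = np.ones((8, 4, 4))  # deep features: 8 channels, 4x4 spatial
skip = np.ones((8, 8, 8))    # encoder level at twice the resolution
fused = upsample2x(coarse) + skip
print(fused.shape)  # (8, 8, 8)
```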
The employed dataset was randomly split into a training and a testing set, following an 80%-20% rule. This process was repeated 3 times to derive different split subsets containing divergent data distributions and thus allow a more detailed and concrete evaluation. Regarding the training process, the FPN model was trained for 500 epochs with a batch size of 16 and the Adam optimization algorithm. Since the designed dataset is imbalanced, a focal loss function was utilized to tackle this specific issue. For the training stage, several augmentation techniques were employed, aiming to increase the amount of processed data and thus increase the model's efficiency. In detail, the following augmentation methods are deployed on every training image with a probability of 50%: horizontal/vertical flip, random rotation, random rescale, Gaussian blurring, and random change of the image brightness. These techniques were selected taking into consideration that the processed data are UAV-acquired imagery, and our aim was to simulate different capturing scenarios by the UAV camera. Finally, a patch of 256 × 256 pixels is randomly cropped from every image, serving as another augmentation process, and forwarded as input to the model. All experiments were conducted on a GeForce RTX 3090 GPU.
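A minimal numpy version of the binary focal loss used to counter the class imbalance; the α and γ values below are the common defaults from the focal loss literature, not necessarily those used in the experiments:

```python
import numpy as np

def binary_focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Focal loss for binary masks: the (1 - pt)^gamma factor down-weights
    easy, well-classified pixels so the rare Weed class dominates training.
    p: predicted probabilities, y: 0/1 ground-truth labels."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    pt = np.where(y == 1, p, 1 - p)       # probability of the true class
    a = np.where(y == 1, alpha, 1 - alpha)
    return float(np.mean(-a * (1 - pt) ** gamma * np.log(pt)))

# A confident correct pixel contributes far less than a confident mistake.
easy = binary_focal_loss(np.array([0.95]), np.array([1]))
hard = binary_focal_loss(np.array([0.05]), np.array([1]))
print(easy < hard)  # True
```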
Table II presents the accuracy of the pre-described FPN model in terms of Intersection-over-Union (IoU). Specifically, it uses EfficientNetB1 as the encoder backbone, pretrained on the well-known ImageNet dataset. Results are reported for each of the three data splits, along with the corresponding mean value (mIoU).
TABLE II: Evaluation performance of FPN in terms of IoU for three different split subsets

Model | Split | Background | Weed  | mIoU
FPN   | 1     | 95.45      | 44.58 | 70.01
      | 2     | 94.93      | 38.81 | 66.87
      | 3     | 96.29      | 42.58 | 69.44
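The per-class IoU reported in Table II is the intersection of the predicted and ground-truth masks divided by their union; a minimal sketch:

```python
import numpy as np

def iou(pred, target):
    """Intersection-over-Union for a pair of binary segmentation masks."""
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return float(inter) / float(union) if union else 1.0

# Toy 2x2 masks: 1 overlapping pixel, 3 pixels in the union.
pred = np.array([[1, 1], [0, 0]], dtype=bool)
gt = np.array([[1, 0], [1, 0]], dtype=bool)
score = iou(pred, gt)
print(round(score, 4))  # 1/3 -> 0.3333
```

The mIoU column is then the average of the Background and Weed scores.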
The results imply that the quality of the collected data is adequate to create robust weed detection methods based on deep learning models, empowering UAVs also as a tool for automatic visual inspection in precision farming.
V. CONCLUSIONS
In this work, a UAV-based data collection system for precision agriculture applications has been proposed. Two path
planning operational modes have been presented in order to
provide compact and effective solutions for a wide range of
PA-related tasks from the data collection perspective. The
main path planning/navigation algorithm produces an optimal
and collision-free route for data gathering in monitoring-
related applications. However, for more detail-demanding ap-
plications which require intensive crop inspection, such as
weed detection, a second operational path planning mode
is presented. More precisely, circular-like inspection takes
place around points of interest to collect enriched local field
information. All the UAV path planning capabilities function internet-free on mobile devices, including user-friendly on-site commands and flight data tracking that ease the utilization of UAVs. The functionality of the proposed
system was tested and validated in two different types of
precision agriculture applications. In both experiments, the
proposed framework preserved all the desired features while
at the same time the preliminary evaluation results showed
that the collected dataset from the visual inspection mission
can lead to robust weed detection models. These features
are of paramount importance in agribusiness, as they can
be utilized to design sustainable legume-supported cropping
systems for monitoring, assessment, weed detection, and pest
control management. As future work, we plan to extend our system to incorporate several UAVs in the same mission, using a combination of offline path-planning [30] and online adaptation [31], to reduce the execution time by several orders of magnitude.
ACKNOWLEDGEMENTS
This research has been financed by the European Regional
Development Fund of the European Union and Greek national
funds through the Operational Program Competitiveness, En-
trepreneurship and Innovation, under the call RESEARCH -
CREATE - INNOVATE (T1EDK-00636) and from the Euro-
pean Commission under the European Union’s Horizon 2020
research and innovation programme under grant agreement no
101021851 (NESTOR). Also, we gratefully acknowledge the
support of NVIDIA Corporation with the donation of GPUs
used for this research.
REFERENCES
[1] R. P. Sishodia, R. L. Ray, and S. K. Singh, “Applications of remote
sensing in precision agriculture: A review,” Remote Sensing, vol. 12,
no. 19, p. 3136, 2020.
[2] P. Foehn, E. Kaufmann, A. Romero, R. Penicka, S. Sun, L. Bauersfeld,
T. Laengle, G. Cioffi, Y. Song, A. Loquercio et al., “Agilicious: Open-
source and open-hardware agile quadrotor for vision-based flight,”
Science Robotics, vol. 7, no. 67, p. eabl6259, 2022.
[3] A. Loquercio, A. Saviolo, and D. Scaramuzza, “Autotune: Controller
tuning for high-speed flight,” IEEE Robotics and Automation Letters,
vol. 7, no. 2, pp. 4432–4439, 2022.
[4] A. Koutsoudis, G. Ioannakis, P. Pistofidis, F. Arnaoutoglou, N. Kazakis,
G. Pavlidis, C. Chamzas, and N. Tsirliganis, “Multispectral aerial
imagery-based 3d digitisation, segmentation and annotation of large
scale urban areas of significant cultural value,” Journal of Cultural
Heritage, vol. 49, pp. 1–9, 2021.
[5] P. Grippa, A. Renzaglia, A. Rochebois, M. Schranz, and O. Simonin,
“Inspection of ship hulls with multiple uavs: Exploiting prior informa-
tion for online path planning,” in IEEE/RSJ International Conference on
Intelligent Robots and Systems, 2022.
[6] B. Fu, M. Liu, H. He, F. Lan, X. He, L. Liu, L. Huang, D. Fan, M. Zhao,
and Z. Jia, “Comparison of optimized object-based rf-dt algorithm and
segnet algorithm for classifying karst wetland vegetation communities
using ultra-high spatial resolution uav data,” International Journal of
Applied Earth Observation and Geoinformation, vol. 104, p. 102553,
2021.
[7] C. Xie and C. Yang, “A review on plant high-throughput phenotyping
traits using uav-based sensors,” Computers and Electronics in Agricul-
ture, vol. 178, p. 105731, 2020.
[8] A. dos Santos Ferreira, D. M. Freitas, G. G. da Silva, H. Pistori,
and M. T. Folhes, “Weed detection in soybean crops using convnets,”
Computers and Electronics in Agriculture, vol. 143, pp. 314–324, 2017.
[9] T. Barros, P. Conde, G. Gonçalves, C. Premebida, M. Monteiro, C. S. S.
Ferreira, and U. J. Nunes, “Multispectral vineyard segmentation: A deep
learning comparison study,” Computers and Electronics in Agriculture,
vol. 195, p. 106782, 2022.
[10] G. D. Karatzinis, S. D. Apostolidis, A. C. Kapoutsis, L. Pana-
giotopoulou, Y. S. Boutalis, and E. B. Kosmatopoulos, “Towards an
integrated low-cost agricultural monitoring system with unmanned air-
craft system,” in 2020 International conference on unmanned aircraft
systems (ICUAS). IEEE, 2020, pp. 1131–1138.
[11] L. Pádua, A. Matese, S. F. Di Gennaro, R. Morais, E. Peres, and
J. J. Sousa, “Vineyard classification using obia on uav-based rgb and
multispectral data: A case study in different wine regions,” Computers
and Electronics in Agriculture, vol. 196, p. 106905, 2022.
[12] M. P. Christiansen, M. S. Laursen, R. N. Jørgensen, S. Skovsen, and
R. Gislum, “Designing and testing a uav mapping system for agricultural
field surveying,” Sensors, vol. 17, no. 12, p. 2703, 2017.
[13] M. Kerkech, A. Hafiane, and R. Canals, “Deep leaning approach with
colorimetric spaces and vegetation indices for vine diseases detection
in uav images,” Computers and electronics in agriculture, vol. 155, pp.
237–243, 2018.
[14] J. G. A. Barbedo, “A review on the use of unmanned aerial vehicles and
imaging sensors for monitoring and assessing plant stresses,” Drones,
vol. 3, no. 2, p. 40, 2019.
[15] A. Tellaeche, X. P. BurgosArtizzu, G. Pajares, A. Ribeiro, and
C. Fernández-Quintanilla, “A new vision-based approach to differential
spraying in precision agriculture,” computers and electronics in agricul-
ture, vol. 60, no. 2, pp. 144–155, 2008.
[16] I. Wahab, O. Hall, and M. Jirström, “Remote sensing of yields: Applica-
tion of uav imagery-derived ndvi for estimating maize vigor and yields
in complex farming systems in sub-saharan africa,” Drones, vol. 2, no. 3,
p. 28, 2018.
[17] D. C. Tsouros, S. Bibi, and P. G. Sarigiannidis, “A review on uav-based
applications for precision agriculture,” Information, vol. 10, no. 11, p.
349, 2019.
[18] C. Ju and H. I. Son, “Multiple uav systems for agricultural applications:
Control, implementation, and evaluation,” Electronics, vol. 7, no. 9, p.
162, 2018.
[19] W. H. Maes and K. Steppe, “Perspectives for remote sensing with
unmanned aerial vehicles in precision agriculture,” Trends in plant
science, vol. 24, no. 2, pp. 152–164, 2019.
[20] C. Di Franco and G. Buttazzo, “Coverage path planning for uavs
photogrammetry with energy and resolution constraints,” Journal of
Intelligent & Robotic Systems, vol. 83, no. 3, pp. 445–462, 2016.
[21] I. Maza and A. Ollero, “Multiple uav cooperative searching operation
using polygon area decomposition and efficient coverage algorithms,”
in Distributed Autonomous Robotic Systems 6. Springer, 2007, pp.
221–230.
[22] B. Allred, N. Eash, R. Freeland, L. Martinez, and D. Wishart, “Effective
and efficient agricultural drainage pipe mapping with uas thermal
infrared imagery: A case study,” Agricultural Water Management, vol.
197, pp. 132–137, 2018.
[23] L. Quebrajo, M. Perez-Ruiz, L. Pérez-Urrestarazu, G. Martínez, and
G. Egea, “Linking thermal imaging and soil remote sensing to enhance
irrigation management of sugar beet,” Biosystems Engineering, vol. 165,
pp. 77–87, 2018.
[24] E. Galceran and M. Carreras, “A survey on coverage path planning for
robotics,” Robotics and Autonomous systems, vol. 61, no. 12, pp. 1258–
1276, 2013.
[25] Y. Gabriely and E. Rimon, “Spanning-tree based coverage of contin-
uous areas by a mobile robot,” Annals of mathematics and artificial
intelligence, vol. 31, no. 1, pp. 77–98, 2001.
[26] S. D. Apostolidis, P. C. Kapoutsis, A. C. Kapoutsis, and E. B. Kos-
matopoulos, “Cooperative multi-uav coverage mission planning platform
for remote sensing applications,” Autonomous Robots, vol. 46, no. 2, pp.
373–400, 2022.
[27] J. Jarvis and D. Whited, “Computational experience with minimum
spanning tree algorithms,” Operations Research Letters, vol. 2, no. 1,
pp. 36–41, 1983.
[28] D. L. Applegate, R. E. Bixby, V. Chvatal, and W. J. Cook, The traveling
salesman problem: a computational study. Princeton university press,
2006.
[29] M. Krestenitis, E. K. Raptis, A. C. Kapoutsis, K. Ioannidis, E. B.
Kosmatopoulos, S. Vrochidis, and I. Kompatsiaris, “Cofly-weeddb: A
uav image dataset for weed detection and species identification,” Data
in Brief, p. 108575, 2022.
[30] A. C. Kapoutsis, S. A. Chatzichristofis, and E. B. Kosmatopoulos, “Darp:
divide areas algorithm for optimal multi-robot coverage path planning,”
Journal of Intelligent & Robotic Systems, vol. 86, no. 3, pp. 663–680,
2017.
[31] D. I. Koutras, A. C. Kapoutsis, and E. B. Kosmatopoulos, “Autonomous
and cooperative design of the monitor positions for a team of uavs to
maximize the quantity and quality of detected objects,” IEEE Robotics
and Automation Letters, vol. 5, no. 3, pp. 4986–4993, 2020.