Review
On the Use of Unmanned Aerial Systems for Environmental Monitoring
Salvatore Manfreda1,*, Matthew McCabe2, Pauline Miller3, Richard Lucas4, Victor Pajuelo Madrigal5, Giorgos Mallinis6, Eyal Ben Dor7, David Helman8, Lyndon Estes9, Giuseppe Ciraolo10, Jana Müllerová11, Flavia Tauro12, M. Isabel De Lima13, Joao L.M.P. De Lima13, Felix Frances14, Kelly Caylor15, Marko Kohv16, Antonino Maltese10, Matthew Perks17, Guiomar Ruiz-Pérez18, Zhongbo Su19, Giulia Vico18, Brigitta Toth20
8
1. Dipartimento delle Culture Europee e del Mediterraneo: Architettura, Ambiente, Patrimoni Culturali
9
(DiCEM), Università degli Studi della Basilicata, Matera, Italia. E-mail: salvatore.manfreda@unibas.it
10
2. Water Desalination and Reuse Center, King Abdullah University of Science and Technology, E-mail:
11
matthew.mccabe@kaust.edu.sa
12
3. The James Hutton Institute, Aberdeen, Scotland UK, E-mail: pauline.miller@hutton.ac.uk
13
4. Department of Geography and Earth Sciences, Aberystwyth University, Aberystwyth, Ceredigion, UK. E-
14
mail: richard.lucas@aber.ac.uk
15
5. Svarmi ehf., Háskóli Íslands, Iceland, E-mail: victor@svarmi.com
16
6. Department of Forestry and Management of the Environment and Natural Resources, Democritus
17
University of Thrace, Greece. E-mail: gmallin@fmenr.duth.gr
18
7. Department of Geography and Human Environment, Tel Aviv University (TAU), Tel Aviv, Israel. E-mail:
19
bendor@post.tau.ac.il
20
8. Department of Geography and the Environment, Bar-Ilan University, Ramat Gan, Israel. E-mail:
21
david.helman@biu.ac.il
22
9. Graduate School of Geography, Clark University, Worcester, MA USA Email: lestes@clarku.edu
23
10. Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, University of Palermo, Italia.
24
E-mail: giuseppe.ciraolo@unipa.it; malteseantonino@gmail.com
25
11. Department GIS and Remote Sensing, Institute of Botany, The Czech Acad. Sciences, Průhonice, Czech
26
Republic, E-mail: jana.mullerova@ibot.cas.cz
27
12. Centro per l’Innovazione Tecnologica e lo Sviluppo del Territorio (CINTEST), Università degli Studi della
28
Tuscia, Viterbo, Italia. E-mail: flavia.tauro@unitus.it
29
13. Department of Civil Engineering (MARE), University of Coimbra, Coimbra, Portugal. E-mails:
30
iplima@uc.pt; plima@dec.uc.pt
31
14. Universidad Politecnica de Valencia, Spain. E-mail: ffrances@hma.upv.es
32
15. Department of Geography, University of California, Santa Barbara, USA. E-mail: caylor@ucsb.edu
33
16. Department of geology, University of Tartu, E-mail: marko.kohv@gmail.com
34
17. School of Geography, Politics and Sociology, Newcastle University, UK. E-mail:
35
matthew.perks@newcastle.ac.uk
36
18. Department of Crop Production Ecology, Swedish University of Agricultural Sciences (SLU), Uppsala,
37
Sweden. E-mail: guiomar.ruiz.perez@slu.se; giulia.vico@slu.se
38
19. Department of Water Resources in Faculty of Geo-Information and Earth Observation, University of
39
Twente, Netherlands. E-mail: z.su@utwente.nl
40
20. Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian
41
Academy of Sciences, Budapest, Hungary; Dept. of Crop Production and Soil Science, University of
42
Pannonia, Keszthely, Hungary. E-mail: toth.brigitta@agrar.mta.hu
Abstract: Environmental monitoring plays a central role in diagnosing climate and management impacts on natural and agricultural systems, enhancing the understanding of hydrological processes, optimizing the allocation and distribution of water resources, and assessing, forecasting and even preventing natural disasters. Nowadays, most monitoring and data collection systems are based upon a combination of ground-based measurements, manned airborne sensors and satellite observations. These data are utilized in describing both small- and large-scale processes, but have spatiotemporal constraints inherent to each respective collection system. Bridging the unique spatial and temporal divides that limit current monitoring platforms is key to improving our understanding of environmental systems. In this context, Unmanned Aerial Systems (UAS) have considerable potential to radically improve environmental monitoring. UAS-mounted sensors offer an extraordinary opportunity to bridge the existing gap between field observations and traditional air- and space-borne remote sensing, by providing not only high spatial detail over relatively large areas in a cost-effective way but, just as importantly, an entirely new capacity for enhanced temporal retrieval. As well as showcasing recent advances in the field, there is also a need to identify and understand the potential limitations of UAS technology. For these platforms to reach their monitoring potential, a wide spectrum of unresolved issues and application-specific challenges requires focused community attention. Indeed, to leverage the full potential of UAS-based approaches, sensing technologies, measurement protocols, post-processing techniques, retrieval algorithms and evaluation techniques need to be harmonized. The aim of this paper is to provide a comprehensive overview of existing research on, and applications of, UAS in environmental monitoring, in order to guide users and researchers towards future research directions, applications, developments and challenges.

Keywords: UAS; remote sensing; environmental monitoring; precision agriculture; vegetation indices; soil moisture; river monitoring.
1. Introduction
Despite the recent and rapid increase in the number and range of Earth observing satellites (e.g., Drusch et al., 2012; Hand, 2015), current high spatial resolution satellite sensors are generally too coarse in temporal resolution for many quantitative remote sensing applications, and are thus of limited use in detecting and monitoring the dynamics of environmental processes. Recent advances in Earth observation are opening new opportunities for environmental monitoring at finer scales. For instance, CubeSat platforms represent a promising satellite technology operating predominantly in the visible to near-infrared portion of the electromagnetic spectrum, but with very high temporal resolution (e.g., McCabe et al., 2017a, 2017b). Nevertheless, most of these satellites are operated by commercial organizations, so that, if short revisit times are required (i.e., for high frequency monitoring), the cost of image acquisition can become a limiting factor. While manned airborne platforms can in principle provide both high spatial resolution and rapid revisit times, in practice their use is routinely limited by operational complexity and cost: it is feasible only over medium-sized areas, and is currently offered by several commercial operators. Recent advances in Unmanned Aerial Systems (UAS) technology present an alternative monitoring platform that provides a low-cost opportunity to capture the spatial, spectral, and temporal requirements across a range of applications (Berni et al., 2008). UAS offer high versatility and flexibility compared to airborne systems or satellites, and the potential to be rapidly and repeatedly deployed to acquire high spatial and temporal resolution data (Pajares, 2015).
While UAS cannot compete with satellite imagery in terms of spatial coverage, they provide unprecedented spatial and temporal resolutions unmatched by satellite alternatives, and do so at a fraction of the satellite acquisition cost. For example, a newly tasked high resolution natural colour image (50 cm/pixel) from a satellite (e.g., GeoEye-1) can cost up to 3,000 USD, whereas a UAS with a natural colour camera can be acquired for less than 1,000 USD, delivering datasets of high spatial resolution (several cm/pixel) and a temporal resolution limited only by the number of flights (and power supply). The costs of acquiring UAS imagery comprise the initial investment, the processing software and the cost of fieldwork. After the initial investment, however, datasets can be delivered more often and at a higher resolution than by any other earth observing system.
Beyond providing the high spatial and temporal resolutions needed for many applications, UAS-mounted sensors have several additional advantages that are key across a range of applications. First, they provide rapid access to environmental data, offering the near real-time capabilities required in many applications. The most mature of these is the capacity to share orthomosaic and elevation data, using both commercial and open-source alternatives (Schuetz, 2016). Second, UAS also satisfy safety and accessibility requirements for the inspection of inaccessible sites or for hazard monitoring (Watts et al., 2012). Third, a great advantage of UAS is their capacity to collect data under cloud conditions that would otherwise obscure satellite retrievals. Analysis of meteorological data has shown that, even with daily revisits of Earth observation satellites, the probability of operating a monitoring service based on optical satellite imagery in rainy regions is about 20%, while the probability of obtaining a usable image with UAS is between 45% and 70% (Wal et al., 2013). Finally, operations with UAS are not limited to specific hours (as with sun-synchronous satellite sensors), and thus UAS can be used for round-the-clock environmental monitoring.
These capabilities, together with the increasing variety and affordability of both UAS and sensor technologies, have stimulated an explosion of interest from researchers across numerous domains (Anderson and Gaston, 2013; Whitehead and Hugenholtz, 2014; Whitehead et al., 2014; Adão et al., 2017). Among others, Singh and Frazier (2018) provided a detailed meta-analysis of published articles, highlighting the diversity of processing procedures used in UAS applications and clearly identifying the critical need for harmonization among the many possible strategies for deriving UAS-based products.
The dynamic nature and spatial variability of environmental processes, which often occur at very fine scales, generate a need for high spatial and temporal resolution data. Successful and efficient monitoring requires timely data, and their high flexibility makes UAS imagery ideal for the task. Specific timing and frequent acquisition of data at very fine scales enable targeted monitoring of rapid (intra-annual) changes in environmental features, including plant phenology and growth, extreme events, and hydrological processes. For these reasons, environmental studies were among the first civil applications of the technology in the 1990s. Thanks to the significant cost reduction of both vehicles and sensors, and recent developments in data processing software, UAS applications have expanded rapidly in the last decade, stimulating a number of additional and complementary topics spanning full automation of single or multiple vehicles, tracking and flight control systems, hardware and software innovations, tracking of moving targets, and image correction and mapping performance assessment. This growing interest in UAS applications is reflected in the number of UAS-based research papers published in the last 27 years, particularly those using UAS technology for environmental monitoring (based on a search of the ISI Web of Knowledge using the keywords "UAS" or "UAV", and "environment"), with an especially prominent increase during the last five years (Figure 1).
In addition to the increasing availability of UAS, recent advances in sensor technologies and analytical capabilities are rapidly expanding the number of potential UAS applications. Increasing miniaturization allows multispectral, hyperspectral and thermal imaging, as well as Synthetic Aperture Radar (SAR) and LiDAR sensing, to be conducted from UAS (e.g., Anderson and Gaston, 2013). As examples of recent UAS-based environmental monitoring applications, work has focused on: a) land cover mapping (e.g., Bryson et al., 2010; Akar, 2017); b) vegetation state, phenology and health (e.g., Bueren et al., 2015; Ludovisi et al., 2017); c) precision farming/agriculture (e.g., Zhu et al., 2009; Urbahs, 2013; Jeunnette and Hart, 2016); d) monitoring crop growth and invasive species infestation (e.g., Samseemoung et al., 2012; Alvarez-Taboada et al., 2017); e) atmospheric observations (e.g., Witte et al., 2017); f) disaster mapping (e.g., Stone et al., 2017); g) soil erosion (e.g., Frankenberger et al., 2008; d'Oleire-Oltmanns et al., 2012); h) mapping soil surface characteristics (e.g., Quiquerez et al., 2014; Aldana-Jague et al., 2016); and i) change detection (e.g., Niethammer et al., 2012).
The aim of this paper is to depict the state-of-the-art in the field of UAS applications for environmental monitoring, with a particular focus on hydrological variables such as vegetation conditions, soil properties and moisture, overland flow and streamflow. This review provides a common knowledge framework useful to guide and address the future activities of the international research network promoted by the recently funded HARMONIOUS COST Action. The Action is funded by the European Cooperation in Science and Technology (COST) programme, which supports networking activities to improve current knowledge and disseminate research outcomes. The aim of the HARMONIOUS COST Action is to channel the competencies, knowledge, and technologies of a wide international network involving more than 90 scientists from different parts of the world. This challenge will be met by sharing and further developing the experience, data, tools and technology of the numerous institutions involved in the Action. Through a common strategy and continuous interaction, the HARMONIOUS Action will enhance current capabilities of environmental analysis and support the definition of optimized and standardized procedures for UAS-based applications.
We divide our review into three sections that focus on different aspects of UAS-based environmental monitoring: 1) data collection and processing; 2) monitoring natural and agricultural ecosystems; and 3) monitoring river systems. We finish by summarizing the issues, roadblocks and challenges in advancing the application of UAS in environmental monitoring.
Figure 1. Number of articles extracted from the ISI Web of Knowledge database published from 1990 to 2017 (last accessed 15/01/2018).
2. Data Collection, Processing and Limitations
While offering an unprecedented platform to advance spatiotemporal insights across the earth and environmental sciences, UAS are not without their own operational, processing and retrieval problems (Gay et al., 2009). These range from image blur due to the forward motion of the platform (Sieberth et al., 2016), through resolution impacts due to variable flying height, and orthorectification issues and geometric distortion associated with inadequate image overlap (Colomina and Molina, 2014), to the spectral effects induced by variable illumination during flight. These and other factors can all affect the quality of any orthorectified image and, subsequently, of the derived products, as described in a recent review by Whitehead and Hugenholtz (2014). As such, it is essential to consider best practice in the context of: a) mission and flight planning; b) pre-flight camera/sensor configuration; c) in-flight data collection; d) ground control/radiometric calibration and correction; e) geometric and atmospheric corrections; f) orthorectification and image mosaicking; and g) extraction of relevant products/metrics for the remote sensing application. Items a) and b) are pre-flight tasks, c) and d) are conducted in the field at the time of survey, and e)-g) are post-survey tasks. Together, these aspects can be considered the fundamentals of data acquisition and post-processing, which deliver the necessary starting point for subsequent application-specific analysis. However, despite the existence of well-established workflows in photogrammetry, manned aircraft, and satellite-based remote sensing to address such fundamental aspects, UAS introduce various additional complexities, which to date have not been thoroughly addressed. Consequently, best practice workflows for producing high quality remote sensing products from UAS are still lacking, and further studies that focus on validating UAS-collected measurements with robust processing methods are important for improving the final quality of the processed data (Rieke et al., 2011; Mesas-Carrascosa et al., 2014; Ai et al., 2015).
2.1. Pre-flight planning
Flight or mission planning is the first essential step for UAS data acquisition and has a profound impact on the data acquired and the processing workflow. Similar to other remote sensing approaches, a host of parameters must be considered before the actual flight, such as platform specifications, the extent of the study site (area-of-interest), ground sampling distance, payload characteristics, topography of the study site, goals of the study, meteorological forecasts and local flight regulations. UAS have additional aspects that require further consideration, including the skill level of the pilot, platform characteristics and the actual environmental flight conditions, all of which affect the data characteristics and subsequent phases of processing. A simple calculation of ground sampling distance, image footprint and flight-line spacing, as sketched below, is typically the starting point of such planning.
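As an illustration of the basic photogrammetric bookkeeping involved, the following minimal Python sketch derives the ground sampling distance (GSD), camera trigger spacing and flight-line spacing from nominal camera parameters and overlap targets. The numbers used (a 4000 x 3000 pixel camera with 3.7 μm pixel pitch and 8.8 mm focal length, and 80%/70% overlaps) are illustrative assumptions, not values prescribed by this paper.

```python
def flight_plan(height_agl, focal_len_mm=8.8, pixel_pitch_um=3.7,
                img_width_px=4000, img_height_px=3000,
                forward_overlap=0.8, side_overlap=0.7):
    """Basic photogrammetric flight-planning quantities (all in metres)."""
    # GSD = pixel pitch * flying height / focal length
    gsd = (pixel_pitch_um * 1e-6) * height_agl / (focal_len_mm * 1e-3)
    footprint_along = gsd * img_height_px    # along-track image footprint
    footprint_across = gsd * img_width_px    # across-track image footprint
    trigger_spacing = footprint_along * (1.0 - forward_overlap)
    line_spacing = footprint_across * (1.0 - side_overlap)
    return gsd, trigger_spacing, line_spacing

gsd, ds, dl = flight_plan(height_agl=100.0)
print(f"GSD = {gsd*100:.1f} cm, photo every {ds:.1f} m, lines {dl:.1f} m apart")
```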
Due to the proliferation of low-cost, off-the-shelf digital cameras, photogrammetry has been the primary implementation of UAS. James and Robson (2014) highlighted how unresolved elements of the camera model (lens distortion) can propagate as errors in UAS-derived DEMs, and how this can be addressed by incorporating oblique images. Other studies have highlighted the importance of flight line configurations (Peppa et al., 2014), as well as of minimising image blur (Sieberth et al., 2016). There is a need to consolidate this evidence into best practice guidance for optimizing the quality of UAS structure-from-motion (SfM) measurements, whilst maintaining ease of use and accessibility.
Accurate absolute orientation (georeferencing) is an important element of UAS surveys, and is fundamental for any multi-temporal monitoring or comparison with other datasets. This task, often referred to as registration, is conventionally dependent on establishing ground control points (GCPs), which are fixed by a higher order control method (usually a Global Navigation Satellite System, GNSS, survey). A number of studies have examined the effect of GCP networks (number and distribution) in UAS surveys, showing that significant errors are expected in SfM-based products where GCPs are not adopted (Eltner and Schneider, 2015; Peppa et al., 2016). Systematic DEM error can nevertheless be significantly reduced by including properly defined GCPs (James et al., 2017a) or by incorporating oblique images in the absence of GCPs (James et al., 2014).
Best practice can also be drawn from manned aerial photogrammetry. Direct georeferencing is standard practice in aerial photogrammetry, where the position and orientation of the platform are precisely determined using on-board survey-grade differential GNSS and inertial measurement unit (IMU) data combined through an inertial navigation system (INS) (Toth and Jóźków, 2016). This allows the camera station (exposure) position and orientation to be derived directly, thus eliminating or minimizing the need for ground control points. Therefore, as discussed by Colomina and Molina (2014), there is an increasing drive towards achieving cm-level direct georeferencing for UAS using alternative GNSS/IMU configurations, precise point positioning (PPP) and dual frequency GNSS.
2.2. Sensors
The wide availability of UAS equipped with visible (VIS) commercial cameras (see Table 1) has been the main driver of several studies exploring the potential of low cost sensors for vegetation monitoring (Geipel et al., 2014; Torres-Sanchez et al., 2014; Saberioon et al., 2014; Jannoura et al., 2015). Among the many available visible spectral indices, the Normalized Green-Red Difference Index (NGRDI), Excess Green (ExG) and VEG indices have achieved good accuracy in vegetation mapping. Such vegetation indices may be a cost-effective tool for biomass estimation and for establishing yield variation maps for site-specific agricultural decision-making. A minimal example of how such indices are computed from the raw camera bands is given below.
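As a sketch of how these visible-band indices are obtained, the snippet below computes NGRDI and ExG per pixel from an RGB array using their standard definitions (NGRDI = (G - R)/(G + R); ExG = 2g - r - b on chromatic coordinates). The array names are illustrative.

```python
import numpy as np

def visible_indices(rgb):
    """Compute NGRDI and ExG from an H x W x 3 RGB array (any dtype)."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    eps = 1e-9                                     # guard against division by zero
    ngrdi = (g - r) / (g + r + eps)                # Normalized Green-Red Difference
    total = r + g + b + eps
    rn, gn, bn = r / total, g / total, b / total   # chromatic coordinates
    exg = 2.0 * gn - rn - bn                       # Excess Green
    return ngrdi, exg
```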
Over the last five to eight years, near-infrared (NIR) multispectral and hyperspectral sensors have become more widely available for UAS. Modified off-the-shelf RGB cameras, initially very popular (e.g., Hunt et al., 2010), have now started to be replaced by dedicated multispectral or hyperspectral cameras, as the latter have reduced in cost and weight. For instance, lightweight hyperspectral sensors for UAS are now available from different vendors (e.g., SPECIM; HYSPEX; HeadWall). These offer more defined and discrete spectral responses than modified RGB or multi-band cameras. Multispectral cameras commonly employ multiple lenses, which introduce band-to-band offsets that should be adequately corrected to avoid artefacts in the combined multi-band product (Laliberte et al., 2011; Jhan et al., 2017); a simple image-registration sketch is given below. Furthermore, radiometric calibration and atmospheric corrections are needed to convert the recorded digital numbers (DN) to surface reflectance values, enabling reliable assessment of ground features, comparison of repeated measurements and reliable determination of spectral indices (Lu and He, 2017). Although DN are frequently utilized directly to derive vegetation indices (e.g., NDVI), illumination differences between (and within) surveys, and differing (and unknown) spectral responses between sensors, make it difficult to utilize such data.
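One common first-order correction for band-to-band offsets is a translational co-registration of each band to a reference band. A minimal sketch using scikit-image's phase correlation is shown below; it assumes the offsets are approximately pure translations, which holds only for small lens baselines and distant scenes.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def coregister_bands(bands, ref_index=0):
    """Translate each band onto the reference band via phase correlation.
    bands: list of 2-D arrays of equal shape (one per spectral band)."""
    ref = bands[ref_index].astype(float)
    aligned = []
    for band in bands:
        # estimated shift needed to register this band to the reference
        offset, _, _ = phase_cross_correlation(ref, band.astype(float),
                                               upsample_factor=10)
        aligned.append(nd_shift(band.astype(float), offset, order=1))
    return aligned
```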
Radiometric calibration normally involves in-field measurement of natural or artificial reference targets with a field spectroradiometer (e.g., Brook and Ben-Dor, 2011; Zarco-Tejada et al., 2012; Lu and He, 2017) and requires significant additional effort; a minimal empirical line sketch is given below. Some current multispectral cameras (e.g., Parrot Sequoia, MicaSense RedEdge) include a downwelling irradiance sensor and a calibrated reflectance panel in order to address some of the requirements of radiometric calibration. This is beneficial, but it does not address the full complexity of radiometric calibration, and artefacts will remain. Other aspects, such as bidirectional reflectance (modelled through the bidirectional reflectance distribution function, BRDF) and image vignetting, introduce further uncertainties into image classification. While the most appropriate workflow for dealing with multispectral imagery depends to some extent on the complexity of the subsequent application (e.g., basic vegetation indices or reflectance-based image classification), the growing body of literature and recent sensor developments support the development of best practice guidelines for the environmental UAS community.
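Where reference targets are available, the classical empirical line method fits a per-band linear mapping from DN to surface reflectance. The sketch below fits such a mapping with numpy; the panel reflectances and DN values are invented for illustration and must be replaced with field-spectroradiometer measurements.

```python
import numpy as np

# Field-measured reflectance of three calibration panels (illustrative values)
panel_reflectance = np.array([0.05, 0.22, 0.44])
# Mean image DN extracted over each panel, for one band (illustrative values)
panel_dn = np.array([3100.0, 11200.0, 21900.0])

# Per-band empirical line: reflectance = gain * DN + offset
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

def dn_to_reflectance(dn):
    """Convert a DN array of the same band to surface reflectance."""
    return np.clip(gain * dn + offset, 0.0, 1.0)
```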
Hyperspectral sensors (Table 3) can be briefly mentioned as extensions of the multispectral sensors discussed above, with related considerations of radiometric calibration and atmospheric correction. Over the last five years, there has been increasing interest in hyperspectral imaging sensors (e.g., Lucieer et al., 2014; Honkavaara et al., 2017). While these are still more expensive than multispectral systems, they offer significant potential for quantitative soil, vegetation and crop studies. UAS hyperspectral imagers typically offer contiguous narrow bands in the VIS-NIR portion of the spectrum. Existing cameras include pushbroom and, more recently, frame capture technology. Depending on the capture mechanism, there are typically artefacts related to non-instantaneous (time delay) capture across bands, or physical offsets between bands (Honkavaara et al., 2017). There has also been interest in non-imaging UAS-mounted hyperspectral spectrometers (e.g., Burkart et al., 2015).
In the hyperspectral domain, high radiometric accuracy and accurate reflectance retrieval are key factors in further exploiting this technology (Ben-Dor et al., 2009). Accordingly, practices from manned hyperspectral sensing can be adopted for UAS applications, such as the super-vicarious calibration method suggested by Brook and Ben-Dor (2011, 2015), who used artificial targets to quantify radiometric accuracy and to generate a high quality reflectance data-cube. Lightweight sensors operating in the SWIR region have also recently been introduced for UAS applications (HeadWall).
UAS broadband thermal imaging sensors (see Table 4) measure the brightness temperature of the Earth's surface, typically between 7.5 and 13.5 μm. Key considerations relate to spatial resolution and thermal sensitivity, with the latter now reaching 40-50 mK. Thermal UAS remote sensing also requires consideration of radiometric calibration and accounting for vignetting and other systematic effects, as discussed by Smigaj et al. (2017). An example of a thermal image providing the surface temperature in degrees Celsius over an Aglianico vineyard is given in Figure 2, where one can appreciate the high level of detail offered by this technology in describing a patchy vegetation cover. Note that brightness temperature must still be corrected for surface emissivity to approximate the true surface temperature, as sketched below.
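A first-order emissivity correction can be made via the Stefan-Boltzmann law, as in the sketch below. This neglects reflected downwelling radiance and atmospheric effects, a simplifying assumption adequate only for quick looks, not for rigorous retrievals; the default emissivity is likewise an assumed value.

```python
def brightness_to_surface_temp(t_brightness_c, emissivity=0.98):
    """Approximate surface temperature from brightness temperature (deg C).
    Assumes a grey body and neglects reflected sky radiance."""
    t_b = t_brightness_c + 273.15           # to Kelvin
    t_s = t_b / emissivity ** 0.25          # Stefan-Boltzmann inversion
    return t_s - 273.15                     # back to deg C
```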
LiDAR sensors (see Table 5) are also becoming more commonplace on UAS platforms, as increasingly lightweight systems become available (although a <3 kg maximum take-off weight is still challenging). There is particular interest in UAS LiDAR for forestry applications, especially in relation to classifying and quantifying structural parameters (e.g., forest height, crown dimensions; Sankey et al., 2017).

A comprehensive review of the available cameras and sensors is given in the appendix to guide future studies and activities in this field.
Figure 2. A thermal survey over an Aglianico vineyard in the Basilicata region (southern Italy), overlaid on an RGB orthophoto, obtained by a multicopter carrying both an optical and a FLIR Tau 2 camera. Insets A and B provide magnified portions of the thermal map, in which the vegetation pattern and the distribution of surface temperature can be distinguished.
2.3. Software
Alongside sensor developments, low cost and particularly open source software has been vital in enabling the growth of UAS for environmental and other applications. This includes proprietary structure-from-motion (SfM) software such as Agisoft Photoscan and Pix4D, which is significantly more affordable than most conventional photogrammetric software. UAS-based photogrammetry can produce products of an accuracy similar to those achievable through manned airborne systems (Colomina and Molina, 2014). This has been underpinned by the development of SfM software, which offers a user-friendly and low-cost alternative to conventional digital photogrammetric processing. While this has made photogrammetry more accessible to non-experts, quantification of uncertainty remains an ongoing challenge (James et al., 2017b), because SfM relaxes some of the conventional expectations in terms of image block geometry and data acquisition.
Cloud-based platforms such as DroneDeploy or DroneMapper offer the possibility not only to integrate and share aerial data, but also to derive orthomosaics with light processing workloads. Moreover, open source SfM software has also been developed, including VisualSfM, Bundler, Apero-MicMac and OpenDroneMap. Many open source GIS and image processing packages (e.g., QGIS, GRASS, SAGA GIS, Orfeo Toolbox, ImageJ) support the subsequent exploitation of these data, including applications such as image classification and terrain analysis. Together, these tools make it possible to derive high quality measurements with low cost sensors and software, further widening the potential range of applications (Sona et al., 2014; Ouédraogo et al., 2014; Kaiser et al., 2014).
3. Monitoring Agricultural and Natural Ecosystems
Natural and agricultural ecosystems are influenced by climatic forcing, physical characteristics and management practices that are highly variable in both time and space. Moreover, vegetation state changes may occur within a short time (Manfreda and Caylor, 2013; Manfreda et al., 2017) due to unfavourable growing conditions or climatic extremes (e.g., heat waves, heavy storms). Therefore, in order to capture such features, monitoring systems need to provide accurate information over large areas with a high revisit frequency (Atzberger, 2013). UAS are one such technology enabling new horizons in vegetation monitoring. For instance, the high resolution of UAS imagery has led to a significant increase in overall accuracy in species-level vegetation classification, monitoring of vegetation status, weed infestations, biomass estimation, yield prediction, detection of crop water stress and senescent leaves, and review of herbicide applications and pesticide control.
3.1. Vegetation Monitoring and Precision Agriculture
Precision agriculture (Zhang and Kovacs, 2012) has been the most common environmental monitoring application of UAS. High spatial resolution UAS imagery enables much earlier and more cost-effective detection, diagnosis and correction of agricultural management problems compared to low resolution satellite imagery. UAS may therefore provide the information required to address farmers' needs at the field scale, enabling them to make better management decisions with minimal costs and environmental impact (Huang et al., 2013; Link et al., 2013; Zhang, 2014).
Vegetation state can be evaluated and quantified through different vegetation indices computed from images acquired in the visible, red edge and near-infrared spectral bands, which display a strong correlation with soil coverage, Leaf and Green Area Index (LAI and GAI), Crop Nitrogen Uptake (QN), chlorophyll content, water stress, canopy structure, photosynthesis, yield, and/or growing conditions (e.g., soil moisture) (e.g., Shahbazi, 2014; Helman et al., 2015; Gago et al., 2015; Helman et al., 2017). These vegetation indices can be exploited to monitor biophysical parameters as an alternative to destructive in situ measurements.
Among the many available vegetation indices, the normalized difference vegetation index (NDVI) is the most widely used (Lacaze et al., 1996; Gigante et al., 2009; Helman, 2018). UAS-based NDVI maps are at least comparable to those obtained from satellite observations, which is highly relevant for timely assessment of crop health status with the capacity to provide immediate feedback to the farmer. NDVI surveys performed with UAS, aircraft and satellite demonstrated that low resolution images fail to represent intra-field variability and patterns in fields characterized by small vegetation gradients and high vegetation patchiness (Matese et al., 2015). Moreover, UAS-derived NDVI showed better agreement with ground-based NDVI observations than satellite-derived NDVI across several crop and natural vegetation types (Gay et al., 2009; Primicerio et al., 2012; McGwire et al., 2013; Hmimina et al., 2013). The significant difference between vegetation patterns observed by satellite and UAS is illustrated in Figure 3 for a date-palm plantation; the UAS-based observations can be considered comparable to field observations.
Figure 3. Comparison between a CubeSat NDVI map of a date-palm plantation at 3 m resolution (A) and a UAS-derived NDVI map at 3 cm resolution (B).
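Quantitative comparisons such as that in Figure 3 require bringing the UAS product to the satellite grid. A minimal sketch of block-averaging a fine UAS NDVI raster to a coarser resolution (e.g., 3 cm to 3 m, an aggregation factor of 100) is shown below; the factor and array are illustrative, and real workflows must also handle georeferencing and nodata masking.

```python
import numpy as np

def block_mean(ndvi, factor):
    """Aggregate a 2-D NDVI array by averaging non-overlapping blocks."""
    h, w = ndvi.shape
    h_crop, w_crop = h - h % factor, w - w % factor   # trim ragged edges
    blocks = ndvi[:h_crop, :w_crop].reshape(
        h_crop // factor, factor, w_crop // factor, factor)
    return blocks.mean(axis=(1, 3))

# e.g., UAS NDVI at 3 cm aggregated to ~3 m for comparison with CubeSat:
# coarse = block_mean(uas_ndvi, factor=100)
```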
In the last decade, particular attention has been given to the monitoring of vineyards with UAS because of their high economic value. Johnson et al. (2003) proposed one of the first applications, in which different sensors were used to derive measures related to chlorophyll function, photosynthetic activity, LAI, and plant health status (among other variables) to map vigour differences within fields. More recently, Zarco-Tejada et al. (2012, 2013a, 2013b, 2013c) demonstrated the potential for monitoring specific variables such as the crop water stress index, photosynthetic activity and carotenoid content in vineyards using multispectral, hyperspectral and thermal cameras.
Based upon the authors' experiences, farmers have expressed particular interest in monitoring crop conditions for the quantification of water demand, nitrogen status or infestation treatments. Several of the variables and indices described above may be used for rapid detection of crop pest outbreaks or to map the status of crops.
Monitoring soil water content is critical for efficient irrigation scheduling. Hassan-Esfahani et al. (2015) derived topsoil moisture content using RGB, NIR and thermal bands. The effective amount of water stored in the subsurface can be obtained by exploiting mathematical relationships between surface measurements and root-zone soil moisture, such as the Soil Moisture Analytical Relationship (SMAR) (Manfreda et al., 2014; Baldwin et al., 2017); a simple filter of this kind is sketched below.
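As an illustration of this class of surface-to-root-zone relationships, the following sketch implements a recursive exponential filter of the kind introduced by Wagner and co-workers, used here in the same spirit as SMAR-type approaches; it is not the SMAR formulation itself, and the characteristic time length T is a site-specific assumption.

```python
import numpy as np

def root_zone_index(surface_sm, t_days, T=15.0):
    """Recursive exponential filter mapping surface soil moisture retrievals
    (e.g., from repeated UAS surveys at times t_days) to a root-zone
    soil water index. T is the characteristic time length in days
    (a site-specific assumption)."""
    swi = np.empty(len(surface_sm))
    swi[0], gain = surface_sm[0], 1.0
    for i in range(1, len(surface_sm)):
        dt = t_days[i] - t_days[i - 1]
        gain = gain / (gain + np.exp(-dt / T))          # recursive gain update
        swi[i] = swi[i - 1] + gain * (surface_sm[i] - swi[i - 1])
    return swi
```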
As an example, Sullivan et al. (2007) observed that thermal infrared (TIR) emittance is highly sensitive to canopy response and can be used for monitoring soil water content, stomatal conductance, and canopy cover. TIR has similarly been used for monitoring and estimating soil surface characteristics such as microrelief and rill morphology (de Lima and Abrantes, 2014a), soil water repellency (Abrantes et al., 2017), soil surface macropores (de Lima et al., 2014b) and skin surface soil permeability (de Lima et al., 2014a). Another application is the use of TIR in surface hydrology for estimating overland and rill flow velocities using thermal tracers (de Lima and Abrantes, 2014b; Abrantes et al., 2018).
More specifically, TIR emittance displayed a negative correlation with stomatal conductance and canopy closure, indicating increasing canopy stress as stomatal conductance and canopy closure decreased. An additional strategy is the use of the crop water stress index (CWSI; Jackson et al., 1981; Cohen et al., 2017), calculated from canopy temperature and closely related to leaf water potential, which can be used to determine the required frequency, timing and duration of watering. In this regard, the CWSI, derived from a UAS equipped with a thermal camera, is frequently adopted to quantify the physiological status of plants, and more specifically leaf water potential, in experimental vineyards (Zarco-Tejada et al., 2012; Baluja et al., 2012; Zarco-Tejada et al., 2013b; Gago et al., 2014; Bellvert et al., 2014) and orchards (Gonzalez-Dugo et al., 2013, 2014). The derived CWSI maps can serve as important inputs for precision irrigation. Time series of thermal images can also be used to determine the variation in water status (Santesteban et al., 2017).
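In its simplest form (Jackson et al., 1981), the CWSI normalizes canopy temperature between wet and dry reference baselines; the sketch below assumes these baseline temperatures are already known (e.g., from reference surfaces or meteorological data).

```python
import numpy as np

def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = unstressed, 1 = fully stressed.
    t_canopy : canopy temperature array (deg C), e.g. from UAS thermal imagery
    t_wet    : temperature of a well-watered (non-stressed) reference
    t_dry    : temperature of a non-transpiring (fully stressed) reference"""
    return np.clip((t_canopy - t_wet) / (t_dry - t_wet), 0.0, 1.0)
```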
Hyperspectral and multispectral analyses of simulated data in the VIS-NIR (0.4-1.0 μm) region have shown that soil attributes can be extracted from these spectral regions, particularly those most commonly used by current UAS platforms (Ben-Dor and Banin, 1994, 1996; Soriano-Disla et al., 2014). These studies demonstrated that the VIS-NIR spectral region alone can open up new frontiers in soil mapping (as well as soil moisture content retrieval) using on-board multispectral and hyperspectral UAS sensors, without resorting to the heavier sensors of the SWIR (1-2.5 μm) region. Aldana-Jague et al. (2016) mapped soil surface organic carbon content (<0.5 cm depth) at 12 cm resolution exploiting six bands between 450 and 1050 nm of low-altitude multispectral imaging. D'Oleire-Oltmanns et al. (2012) showed the applicability of UAS for measuring, mapping and monitoring soil erosion at 5 cm resolution, with an accuracy of between 0.009 and 0.027 m in the horizontal and 0.007 m in the vertical direction. Detailed information about soil erosion can enhance proper soil management at the plot scale (Quiquerez et al., 2014).
Such tools were further explored by Zhu et al. (2009), who investigated the ability to quantify differences in soil nitrogen application rates using digital images taken from a UAS, in comparison with ground-based hyperspectral reflectance and chlorophyll content data. They suggested that aerial photography from UAS has the potential to provide input in support of crop decision-making processes, minimizing field sampling efforts, saving both time and money, and enabling accurate assessment of different nitrogen application rates. Such information may therefore serve as input to other agricultural systems, such as tractors or specific drones, that optimise fertilizer management.
Besides monitoring, UAS can also improve agronomic practices. Costa et al. (2012) described an architecture that can be employed to implement a control loop for agricultural applications in which UAS are responsible for spraying chemicals on crops. The application of chemicals is controlled by feedback from a wireless sensor network (WSN) deployed in the crop field. They evaluated an algorithm to adjust the UAS route under changes in wind (intensity and direction) to minimize the waste of pesticides. Pena et al. (2013, 2015) explored the optimization of herbicide applications in weed-crop systems using a series of UAS multispectral images. The authors computed multiple data products, permitting calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance. They showed that the ability to discriminate weeds was significantly affected by the spectral resolution (type of camera), as well as by the spatial (flight altitude) and temporal (date of the study) resolutions of the imagery.
Alongside these technical advantages and constraints, the restrictions imposed by operational rules on the use of UAS in several countries need to be highlighted. As an example, Jeunnette and Hart (2016) developed a parametric numerical model to compare aerial platform options for supporting agriculture in developing countries characterized by highly fragmented fields; manned systems are still more competitive from an operational and cost/efficiency point of view because of the present limitations on the altitude, range and speed of UAS. In particular, UAS become cost-competitive when they are allowed to fly higher than 300 m AGL. Nevertheless, all the applications described highlight the potential of UAS for developing advanced tools for precision agriculture and for vegetation monitoring in general. With time, both technological advances and legislation will evolve and likely converge, further advancing the efficient use of such technologies.
3.2. Monitoring of Natural Ecosystems
As with agricultural ecosystems, the proliferation of UAS-based remote sensing techniques has opened new opportunities for monitoring and managing natural ecosystems (Anderson and Gaston, 2013; Tang and Shao, 2015; Torresan et al., 2017; Ventura et al., 2017). Drones provide options and opportunities to collect data at appropriate spatial and temporal resolutions to describe ecological processes, and allow better surveying of natural ecosystems located in remote, inaccessible or dangerous-to-access sites. For example, some habitats (e.g., peat bogs) can be damaged by on-ground surveys, while drones positioned several meters above the surface can provide a near comparable level of information to that obtained through plot-based measurements (e.g., canopy cover by species). Drones are also useful for undertaking rapid surveys of habitats such as mangroves, where access is often difficult and plot-based surveys take far longer to complete (see Figure 4).
UAS therefore offer the potential to overcome these limitations, and have been applied to monitor a disparate range of habitats and locations, including tropical forests (Paneque-Gálvez et al., 2014), riparian forests (Dunford et al., 2009; Dufour et al., 2013), dryland ecosystems (Cunliffe et al., 2016), boreal forests, and peatlands (Puliti et al., 2015). Pioneering researchers have used UAS to monitor attributes such as plant populations (e.g., Jones et al., 2006; Chabot and Bird, 2012); biodiversity and species richness (e.g., Getzin et al., 2012; Koh and Wich, 2012); plant species invasion (e.g., Michez et al., 2016; Müllerová et al., 2017a); restoration ecology (e.g., Reif and Theel, 2017); disturbances (e.g., Gonçalves et al., 2016; McKenna et al., 2017); phenology (e.g., Klosterman and Richardson, 2017; Müllerová et al., 2017b); pest infestation in forests (Lehmann et al., 2015; Minarik and Langhammer, 2016); and land cover change (e.g., Ahmed et al., 2017).
Many studies have focused on the retrieval of vegetation structural information to support forest assessment and management (e.g., Dandois and Ellis, 2013; Puliti et al., 2015). Information on plant and canopy height can also be obtained from stereo images (Dittmann et al., 2017; Otero et al., 2018), which can be further used to estimate above-ground biomass (see, for example, Figure 4); a minimal canopy height computation is sketched below. 3D maps of the canopy can also be used to distinguish between trunks, branches and foliage, and can be used by logging companies and farmers (Sankey et al., 2017).
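A canopy height model (CHM) is commonly obtained by differencing a digital surface model (DSM) and a digital terrain model (DTM) derived from the photogrammetric point cloud. A minimal sketch is given below, assuming the two rasters are already co-registered numpy arrays on the same grid.

```python
import numpy as np

def canopy_height_model(dsm, dtm, nodata=np.nan):
    """CHM = DSM - DTM, with negative values (noise) clipped to zero.
    dsm, dtm : co-registered 2-D elevation arrays (same grid, metres)."""
    chm = dsm - dtm
    chm = np.where(np.isfinite(chm), np.clip(chm, 0.0, None), nodata)
    return chm
```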
Figure 4. a) RGB image of mangrove forest clearances, Matang Mangrove Forest Reserve, Malaysia, as observed using an RGB digital camera mounted on a DJI Phantom 3; b) RGB orthoimage from which individual (upper canopy) tree crowns can be identified, as well as different mangrove species; and c) the Canopy Height Model (CHM) derived from stereo RGB imagery, with darker green colours representing tall mangroves (typically > 15 m).
UAS represent a promising option for timely, fast and precise monitoring that is important for many plant species, invasive ones in particular (Calviño-Cancela et al., 2014; Michez et al., 2016; Hill et al., 2017; Müllerová et al., 2017a). The flexibility of UAS data acquisition is very important, since plants are often more distinct from the surrounding vegetation at certain times of their growing season (Müllerová et al., 2017b). Besides fast monitoring of newly invaded areas, the UAS methodology enables prediction and modelling of invasion spread, which is driven by a combination of many factors, such as habitat and species characteristics, human dispersal, and disturbances (Rocchini et al., 2015). Legal constraints limiting the use of UAS to unpopulated areas can be especially problematic for invasive species, which tend to prefer urban areas; still, UAS technology can greatly reduce the costs of extensive field campaigns and eradication measures (Lehmann et al., 2017).
UAS are also revolutionizing the management of quasi-natural ecosystems such as restored habitats and managed forests. They have been used to quantify spatial gap patterns in forests in order to support the planning of common forest management practices such as thinning (Getzin et al., 2014), or to support restoration monitoring in uneven habitats at risk. For example, Quilter et al. (2000) used UAS for monitoring streams and riparian restoration projects in inaccessible areas of Chalk Creek (Utah). Knoth et al. (2013) applied a UAS-based NIR remote sensing approach to support the monitoring of restored cut-over bogs. TIR data were also used by Ludovisi et al. (2017) to determine the response of forests to drought in relation to forest-tree breeding programs and genetic improvement.
4. River Systems and Floods
Satellite data are widely used to monitor natural hazards (e.g., floods, earthquakes, volcanic eruptions, wildfires) at national and international scales (Tralli et al., 2005). This popularity is due to their wide coverage, spectral resolution, safety, and rate of update (Gillespie et al., 2007; Joyce et al., 2009). Nevertheless, UAS have also been widely used for rapid assessment following extreme natural events and in the context of humanitarian relief and infrastructure assessment (Stone et al., 2017). According to Quaritsch et al. (2010), UAS should be utilized as a component of a network of sensors for natural disaster management. Although there are a number of technological barriers that must be overcome before UAS can be utilized in a more automated and coordinated manner, their potential for disaster response is significant (Erdelj et al., 2017).
An interesting example is given by the Hurricane and Severe Storm Sentinel (HS3) program launched by NASA (2015), which deployed different high-tech UAS to monitor hurricane formation and evolution. The UAS "catch" data inside the storm (winds and precipitation) and in the surrounding environment using multiple sensors, including a radar scanner, wind LiDAR, a multi-frequency radiometer, and a microwave sounder. Such technology may provide information on hurricanes never measured before.
Given these capabilities, we expect significant advances in the fields of hydrology and hydraulics, where there is considerable potential for the use of UAS in monitoring river systems, overland flows or even urban floods.
4.1. Flow monitoring
River systems and streamflow can be monitored by remotely integrating the techniques of water body observation, vegetation mapping, DEM generation, and hydrological modelling. Satellite sensors in the visible, infrared, and microwave ranges are currently used to monitor rivers and to delineate flood zones (Syvitski et al., 2012; Yilmaz et al., 2010; D'Addabbo et al., 2016). These methods are generally used only over large rivers or areas of inundation, in order to detect changes at the pixel level. UAS can describe river dynamics with a level of detail that is several orders of magnitude greater, and can enable distributed flow measurements over any river system and in difficult-to-access environments.
In this context, the integration of UAS imagery and optical velocimetry techniques has enabled full remote kinematic characterization of water bodies. Optical techniques, such as Large Scale Particle Image Velocimetry (LSPIV; Fujita et al., 1997) and Particle Tracking Velocimetry (PTV; Brevis et al., 2011), are efficient yet non-intrusive flow visualization methods that yield spatially distributed estimates of the surface flow velocity field based on the similarity of image sequences. Proof-of-concept experiments have demonstrated the feasibility of applying LSPIV from manned aerial systems to monitor flood events (Fujita and Hino, 2003; Fujita and Kunita, 2011). More recently, videos recorded from UAS have been analysed with LSPIV to reconstruct the surface flow velocity field of natural stream reaches (Detert and Weitbrecht, 2015; Tauro et al., 2015). This allows a detailed Lagrangian insight into river dynamics that is valuable for calibrating numerical models.
Most of these experimental observations entail a low-cost UAS hovering above the region of interest for a few seconds (the observation time should be adjusted to the flow velocity and camera acquisition frequency). An RGB camera is typically mounted on board, with its optical axis perpendicular to the captured field of view in order to circumvent orthorectification (Tauro et al., 2016a). To facilitate remote photometric calibration, Tauro et al. (2016a) adopted a UAS equipped with a system of four lasers that focus points at known distances in the field of view. In several experimental settings, the accuracy of surface flow velocity estimates from UAS was found to be comparable to (or even better than) that of traditional ground-based LSPIV configurations (Tauro et al., 2016b). In fact, compared to fixed implementations, UAS enable the capture of larger fields of view with diffused rather than direct illumination. Provided detectable features are present on the water surface, such optical image velocimetry techniques can measure flow velocity fields over extended regions rather than pointwise, and at temporal resolutions comparable to or even better than Acoustic Doppler Velocimetry (ADV) (Tauro et al., 2017). The core of such processing is a window-wise cross-correlation between consecutive frames, as sketched below.
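As a minimal illustration of the image-velocimetry principle (not the full LSPIV pipeline, which also involves orthorectification, filtering and sub-pixel peak fitting), the sketch below estimates the displacement of one interrogation window between two consecutive frames by cross-correlation; window size, frame interval and pixel size are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate

def window_velocity(win_a, win_b, dt, pixel_size_m):
    """Estimate surface velocity (m/s) from two interrogation windows
    (2-D grayscale arrays) acquired dt seconds apart.
    Returns (v_row, v_col): velocity along image rows and columns."""
    a = win_a.astype(float) - win_a.mean()   # zero-mean windows
    b = win_b.astype(float) - win_b.mean()
    corr = correlate(b, a, mode='same')      # peak at displacement of b vs a
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2       # zero-lag position
    d_row, d_col = np.array(peak) - center   # displacement in pixels
    return d_row * pixel_size_m / dt, d_col * pixel_size_m / dt
```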
Most platforms offer both piloted and GPS waypoint navigation up to a 10 km range (although this may be subject to national regulations) and are quite stable in windy conditions. In this context, UAS technology is expected to considerably aid flood monitoring and mapping. Flood observation is a considerable challenge for space-borne passive imagery, mostly due to the presence of dense cloud cover, closed vegetation canopies, and the satellite revisit time and viewing angle (Joyce et al., 2009; Sanyal and Lu, 2004). Although synthetic aperture radar (SAR) satellite sensors (e.g., Sentinel-1, TerraSAR-X, RADARSAT-2) can overcome these visibility limitations, they are unable to provide the sub-metre spatial resolution necessary for a detailed understanding of flood routing and susceptibility. Applying UAS with an appropriate flight mode may overcome some of these issues, allowing rapid and safe monitoring of inundations and measurement of flood hydrological parameters (Perks et al., 2016). Moreover, hyperspectral sensors can also be used to extend the range of water monitoring applications, for example to sediment concentration, chlorophyll distribution, algal bloom status, submerged vegetation mapping, bathymetry, and chemical and organic waste contamination (Flynn and Chapra, 2014; Klemas, 2015).
5. Final Remarks and Challenges
UAS-based remote sensing provides new, advanced procedures to monitor key environmental variables, including vegetation status, soil moisture content and streamflow. A detailed description of such variables may increase our capacity to describe water resource availability and to assist agricultural and ecosystem management. The present manuscript provides an overview of some of the recent applications in the field of UAS-based environmental monitoring. The wide range of applications testifies to the great potential of these techniques but, at the same time, the variety of methodologies adopted is evidence that there is still room for significant improvement. The variety of vehicles and sensors, and the specificity of each case study, have stimulated the proliferation of a huge number of specific algorithms addressing flight planning, image registration, calibration and correction, and the derivation of specific indices or variables; yet there are no comprehensive comparative studies able to select the appropriate procedure for a specific need.
Despite the rapid development of software procedures, there is a pressing need to standardize the workflow for the operational use of UAS. The high spatial resolution of UAS data generates high demands on data storage and processing capacity. Traditional procedures for collecting ground-truth data or ground control points for satellite imagery do not provide sufficient positional accuracy, especially in complex terrain (Müllerová et al., 2017b). Legal constraints restricting UAS data acquisition can limit some potential applications, particularly in urban environments. There are also technical limits, such as weather constraints (wind, rain), high elevations or very hot environments, that can be challenging for most devices and sensors (see, e.g., Wigmore and Bryan, 2017).
Nevertheless, technology and scientific research have a clear path to follow, one already traced by manned aerial photogrammetry and Earth observation from satellites. Such observational practices have already addressed several of the problems that UAS-based observations are facing. The miniaturization of technology and sensors will, with time, increase the reliability of UAS observations, reducing several of the limitations related to their use.
The first and most critical limitation in the use of UAS is the limited flight time, which directly affects the possible extent of the investigated area. This problem is currently managed by mission planning capable of handling multiple flights, but the technology is offering new solutions that will extend flight endurance up to several hours, making the use of UAS more and more competitive. For instance, new developments in the battery industry suggest that the relatively short flying time imposed by battery capacity will be significantly improved in the future (Langridge and Edwards, 2017). In this context, another innovation introduced in the most recent vehicles is an integrated energy supply system connected to on-board solar panels, which extends the typical flight endurance from a maximum of 40-50 minutes up to 5 hours.
The second critical issue regards the impact of the Ground Sample Distance (GSD) on the quality of the surveys. This limitation can be addressed by implementing 3D flight paths that follow the relief, in order to keep the observations' GSD uniform. At present, only a few software packages (e.g., UgCS, eMotion 3) use digital terrain models to adjust the flight height of the mission to the relief in order to maintain a uniform GSD.
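Such terrain-following planning reduces, in essence, to simple camera geometry: the flying height above ground that yields a target GSD is fixed by the focal length and pixel pitch, and a digital terrain model supplies the ground elevation under each waypoint. The sketch below illustrates the idea; the camera parameters and the dtm_elevation lookup are placeholder assumptions, not a reproduction of any specific package.

```python
# Terrain-following altitude planning for a uniform ground sample distance.
# GSD = pixel_pitch * height_above_ground / focal_length (thin-lens approx.)

PIXEL_PITCH_UM = 4.5    # sensor pixel pitch in micrometres (assumed camera)
FOCAL_MM = 35.0         # lens focal length in millimetres
TARGET_GSD_CM = 2.0     # desired ground sample distance in centimetres

def height_for_gsd(gsd_cm, focal_mm, pitch_um):
    """Height above ground (m) that produces the requested GSD."""
    return (gsd_cm / 100.0) * (focal_mm / 1000.0) / (pitch_um / 1e6)

def waypoint_altitudes(waypoints, dtm_elevation):
    """Absolute altitude per (x, y) waypoint so the GSD stays uniform.

    dtm_elevation(x, y) is a placeholder for a digital terrain model lookup.
    """
    h_agl = height_for_gsd(TARGET_GSD_CM, FOCAL_MM, PIXEL_PITCH_UM)  # ~156 m
    return [dtm_elevation(x, y) + h_agl for (x, y) in waypoints]

# Example with a flat 210 m plateau as a stand-in DTM:
print(waypoint_altitudes([(0, 0), (50, 0)], lambda x, y: 210.0))
```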
The third critical issue regards image registration, correction and calibration. The vulnerability of UAS to weather conditions (wind, rain) and the geometric and radiometric limitations of current lightweight sensors have stimulated the development of new algorithms for image mosaicking and correction. In this context, the development of open-source and commercial SfM software has allowed the mosaicking issue to be properly addressed, but radiometric correction and calibration remain an open question that may find potential solutions in Earth observation experience.
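One practical stop-gap borrowed from airborne and satellite practice is the empirical line method: reflectance panels of known albedo are imaged in-scene, and a per-band linear model from digital numbers to reflectance is fitted and applied. The sketch below shows the idea with NumPy; the panel values are assumed, and the method is only one of several calibration strategies discussed in the literature.

```python
# Empirical line radiometric calibration: fit reflectance = gain*DN + offset
# per band from ground panels of known reflectance, then apply to the image.
import numpy as np

# Assumed mean digital numbers over dark/grey/bright panels in one band,
# together with their laboratory-measured reflectances.
panel_dn = np.array([312.0, 2105.0, 3890.0])
panel_reflectance = np.array([0.03, 0.22, 0.44])

gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)  # least-squares fit

def dn_to_reflectance(dn):
    """Convert raw digital numbers of this band to surface reflectance."""
    return np.clip(gain * dn + offset, 0.0, 1.0)

print(dn_to_reflectance(np.array([500.0, 1500.0, 3000.0])))
```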
Vegetation state and distribution can be measured using RGB, multispectral, hyperspectral and thermal cameras, each of which allows information to be derived with some drawback. For instance, multispectral, hyperspectral and thermal cameras can provide a more appropriate description of the vegetation, but at the expense of spatial resolution and with additional calibration needs and requirements. Soil moisture and river flow can likewise be measured using different sensors and algorithms, but a comprehensive assessment of the performance of each of these methods and procedures is strongly needed.
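As a simple illustration of what a calibrated multispectral frame enables, the sketch below computes NDVI from co-registered red and near-infrared reflectance arrays; the inputs are assumed to be already orthorectified and radiometrically corrected.

```python
# NDVI from co-registered red and near-infrared reflectance arrays.
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index, guarded against zero division."""
    red = np.asarray(red, dtype=np.float64)
    nir = np.asarray(nir, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Dense canopy reflects strongly in NIR and weakly in red -> NDVI close to 1.
print(ndvi([0.05, 0.20], [0.45, 0.25]))   # -> [0.8, 0.111...]
```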
The wide range of experiences described herein highlights the large variability in the strategies, methodologies and sensors adopted for each specific environmental variable monitored.
Finally, UAS offer a complexity comparable to that of satellites, but this sector has received far fewer resources to fill the existing gaps in the technology. This is also the reason why there is ample room for further improvements in the technology and in the use of such methods. The first and most important improvement is connected to satellite techniques themselves, which may largely benefit from the use of highly detailed UAS data (see Figure 5).
There is a growing need to define harmonized approaches able to channel the efforts of all these studies and to identify the optimal strategy for UAS-based monitoring. The aim is to define a clear and referenced workflow covering the planning and acquisition of the data through to the generation of maps. In particular, we envisage the need to stimulate a comparative experiment able to assess the reliability of different procedures and combinations of algorithms, in order to identify the most appropriate methodology for environmental monitoring under different hydroclimatic conditions.
Figure 5. UAS vs satellite monitoring.
The recently funded COST Action HARMONIOUS aims at stimulating joint activities to facilitate a more common strategy in environmental monitoring. This Action should enhance observational capabilities and also improve model parameterization across a range of fields. The Action is structured around five working groups (WGs) that will tackle different aspects of the use of UAS technologies, with the aim of identifying the optimal strategy for data processing, monitoring vegetation status, monitoring soil water content, monitoring river systems and discharge, and, finally, harmonizing the outcomes of these studies. The structure of the network, with the leaders of each activity, is shown in Figure 6. The aim is to stimulate, within the next few years, a number of field experiments oriented at benchmarking the existing procedures and algorithms for monitoring the variables of interest mentioned above.
[Figure 5 graphic: a qualitative comparison of UAS, manned systems and satellites in terms of spatial resolution versus spatial coverage (from plot scale to global coverage), together with the number of vehicles, systems, sensors and algorithms, system complexity, expected advances, and research investments in the last years.]
In the coming four years, we will organize workshops and training courses, promote scientific missions and design cooperative experiments that should address the following objectives:
- Establish standardized protocols for the necessary pre-processing of UAS data (geometric correction of image orthomosaics, developing and integrating practical measures for radiometric correction and reflectance retrieval);
- Improve the morphological representation of micro-topography, plots/fields, basins, parcels and watercourses using UAS-based digital photogrammetry and LiDAR surveys;
- Improve standard procedures for environmental monitoring to support precision agriculture and the protection of ecosystems;
- Enhance soil property retrieval, with a major emphasis on soil moisture monitoring through the combined use of thermal and VIS/NIR images and spectral-based modelling;
- Understand how field measurements of vegetation properties and soil moisture scale up through UAS-based measurements to satellite estimates;
- Define a flow velocity and discharge monitoring procedure that provides streamflow measurements in open channels, creeks, rivers and floodplains;
- Identify a new standard procedure for hydrological monitoring that allows key river basin components to be monitored with a level of detail that may support the use of the most recent hydrological models;
- Endorse the UAS utilization chain from mission planning to the final product.
Figure 6. Structure and composition of the research network of the COST Action HARMONIOUS.
The integration of different techniques, including traditional instruments, fixed and mobile camera surveys, satellite observations and geomorphological analyses, is anticipated to allow a better characterization of river basins, with a spatial and temporal coverage higher than that offered by traditional techniques, improving the knowledge of hydraulic, ecological and hydrological dynamics. Moreover, the definition of clear and specific procedures may also support the definition of new legislation at the European scale, removing some of the current restrictions that limit the potential use of UAS in a wider range of contexts.
Appendix A: Available sensors and cameras
Given the variety of sensors available for UAS applications, we consider it extremely useful to provide an overview of the available cameras and sensors and their characteristics. In the following, we summarize the most common optical cameras (Table 1), multispectral cameras (Table 2), hyperspectral cameras (Table 3), thermal cameras (Table 4) and laser scanners (Table 5). The present tables expand the list of sensors provided by Casagrande et al. (2017).
Table 1. List of optical cameras suitable for UAS and their main characteristics.

| Manufacturer and model | Sensor type | Resolution (MPx) | Format type | Sensor size (mm2) | Pixel pitch (μm) | Weight (kg) | Frame rate (fps) | Max shutter speed (s-1) | Approx. price ($) |
|---|---|---|---|---|---|---|---|---|---|
| Canon EOS 5DS | CMOS | 51 | FF | 36.0 x 24.0 | 4.1 | 0.930 | 5.0 | 8000 | 3400 |
| Sony Alpha 7R II | CMOS | 42 | FF MILC | 35.9 x 24.0 | 4.5 | 0.625 | 5.0 | 8000 | 3200 |
| Pentax 645D | CCD | 40 | FF | 44.0 x 33.0 | 6.1 | 1.480 | 1.1 | 4000 | 3400 |
| Nikon D750 | CMOS | 24 | FF | 35.9 x 24.0 | 6.0 | 0.750 | 6.5 | 4000 | 2000 |
| Nikon D7200 | CMOS | 24 | SF | 23.5 x 15.6 | 3.9 | 0.675 | 6.0 | 8000 | 1100 |
| Sony Alpha a6300 | CMOS | 24 | SF MILC | 23.5 x 15.6 | 3.9 | 0.404 | 11.0 | 4000 | 1000 |
| Pentax K-3 II | CMOS | 24 | SF | 23.5 x 15.6 | 3.9 | 0.800 | 8.3 | 8000 | 800 |
| Canon EOS 7D Mark II | CMOS | 20 | SF | 22.3 x 14.9 | 4.1 | 0.910 | 10.0 | 8000 | 1500 |
| Panasonic Lumix DMC GX8 | CMOS | 20 | SF MILC | 17.3 x 13.0 | 3.3 | 0.487 | 10.0 | 8000 | 1000 |
| Ricoh GXR A16 | CMOS | 16 | SF | 23.6 x 15.7 | 4.8 | 0.550 | 2.5 | 3200 | 650 |
Table 2. List of multispectral cameras available on the market for UAS and their main characteristics.

| Manufacturer and model | Resolution (Mpx) | Size (mm) | Pixel size (μm) | Weight (kg) | Number of spectral bands | Spectral range (nm) |
|---|---|---|---|---|---|---|
| Tetracam MiniMCA-6 | 1.3 | 131 x 78 x 88 | 5.2 x 5.2 | 0.7 | 6 | 450-1000 |
| Tetracam ADC micro | 3.2 | 75 x 59 x 33 | 3.2 x 3.2 | 0.9 | 6 | 520-920 |
| Quest Innovations Condor-5 ICX 285 | 7 | 150 x 130 x 177 | 6.45 x 6.45 | 1.4 | 5 | 400-1000 |
| Parrot Sequoia | 1.2 | 59 x 41 x 28 | 3.75 x 3.75 | 0.72 | 4 | 550-810 |
| MicaSense RedEdge | N/A | 120 x 66 x 46 | N/A | 0.18 | 5 | 475-840 |
| Sentera Quad | 1.2 | 76 x 62 x 48 | 3.75 | 0.170 | 4 | 400-825 |
| Sentera High Precision NDVI and NDRE | 1.2 | 25.4 x 33.8 x 37.3 | 3.75 | 0.030 | 2 | 525-890 |
| Sentera Multispectral Double 4K | 12.3 | 59 x 41 x 44.5 | N/A | 0.080 | 5 | 386-860 |
| SLANTRANGE 3P NDVI | N/A | 146 x 69 x 57 | N/A | 0.350 | 4 | 410-950 |
| Mappir | 3.2 | 34 x 34 x 40 | N/A | 0.045 | 1-6 | 405-945 |
Table 3. List of hyperspectral cameras for UAS and their main characteristics.

| Manufacturer and model | Sensor/lens | Size (mm) | Pixel size (μm) | Weight (kg) | Spectral range (nm) | Spectral bands and resolution |
|---|---|---|---|---|---|---|
| Rikola Ltd. hyperspectral camera | CMOS | 5.6 x 5.6 | 5.5 | 0.6 | 500-900 | 40 bands, 10 nm |
| Headwall Photonics Micro-hyperspec X-series NIR | InGaAs | 9.6 x 9.6 | 30 | 1.025 | 900-1700 | 62 bands, 12.9 nm |
| BaySpec OCI-UAV-1000/2000 | C-mount | 10 x 10 x 10 | N/A | 0.127/0.218 | 600-1000 | 100 bands, 5 nm / 20-12-15 nm |
| HySpex Mjolnir V-1240 | N/A | 25 x 17.5 x 17 | 0.27 mrad | 4.0 | 400-1000 | 200 bands, 3 nm |
| HySpex Mjolnir S-620 | N/A | 25.4 x 17.5 x 17 | 0.54 mrad | 4.5 | 970-2500 | 300 bands, 5.1 nm |
| Specim AISA KESTREL | push-broom | 99 x 215 x 240 | N/A | 2.3 | 600-1640 | up to 350 bands, 3-8 nm |
| Corning microHSI 410 SHARK | CCD/CMOS | 136 x 87 x 70.35 | 11.7 | 0.68 | 400-1000 | 300 bands, 2 nm |
| Resonon Pika L | N/A | 10.0 x 12.5 x 5.3 | 5.86 | 0.6 | 400-1000 | 281 bands, 2.1 nm |
Table 4. Representative thermal cameras suitable for UAS.

| Manufacturer and model | Resolution (px) | Sensor size (mm2) | Pixel pitch (μm) | Weight (kg) | Spectral range (μm) | Thermal sensitivity (mK) |
|---|---|---|---|---|---|---|
| FLIR Vue Pro 640 | 640 x 512 | 10.8 x 8.7 | 17 | <0.115 | 7.5-13.5 | 50 |
| FLIR Vue Pro 336 | 336 x 256 | 5.7 x 4.4 | 17 | <0.115 | 7.5-13.5 | 50 |
| FLIR Tau2 640 | 640 x 512 | N/A | 17 | <0.112 | 7.5-13.5 | 50 |
| FLIR Tau2 336 | 336 x 256 | N/A | 17 | <0.112 | 7.5-13.5 | 50 |
| Thermoteknix Miricle 307 K | 640 x 480 | 16.0 x 12.0 | 25 | <0.170 | 8.0-12.0 | 50 |
| Thermoteknix Miricle 110 K | 384 x 288 | 9.6 x 7.2 | 25 | <0.170 | 8.0-12.0 | 50/70 |
| Workswell WIRIS 640 | 640 x 512 | 16.0 x 12.8 | 25 | <0.400 | 7.5-13.5 | 30/50 |
| Workswell WIRIS 336 | 336 x 256 | 8.4 x 6.4 | 25 | <0.400 | 7.5-13.5 | 30/50 |
| YUNCGOETEU | 160 x 120 | 81 x 108 x 138 | 12 | 0.278 | 8-14 | <50 |
Table 5. List of laser scanners for UAS and their main characteristics.

| Manufacturer and model | Scanning pattern | Range (m) | Weight (kg) | Angular res. (deg) | FOV (deg) | Laser class and λ (nm) | Frequency (kp/s) |
|---|---|---|---|---|---|---|---|
| ibeo Automotive Systems IBEO LUX | 4 parallel scanning lines | 200 | 1 | (H) 0.125, (V) 0.8 | (H) 110, (V) 3.2 | Class A, 905 | 22 |
| Velodyne HDL-32E | 32 laser/detector pairs | 100 | 2 | (H)-(V) 1.33 | (H) 360, (V) 41 | Class A, 905 | 700 |
| RIEGL VQ-820-GU | 1 scanning line | >1000 | 25.5 | (H) 0.01, (V) N/A | (H) 60, (V) N/A | Class 3B, 532 | 200 |
| Hokuyo UTM-30LX-EW | 1,080 distances in a plane | 30 | 0.37 | (H) 0.25, (V) N/A | (H) 270, (V) N/A | Class 1, 905 | 200 |
| Velodyne Puck Hi-Res | Dual returns | 100 | 0.590 | (H)-(V) 0.1-0.4 | (H) 360, (V) 20 | Class A, 903 | N/A |
| RIEGL VUX-1UAV | Parallel scan lines | 150 | 3.5 | 0.001 | 330 | Class A, NIR | 200 |
| Routescene UAV LidarPod | 32 laser/detector pairs | 100 | 1.3 | (H)-(V) 1.33 | (H) 360, (V) 41 | Class A, 905 | N/A |
| Quanergy M8-1 | 8 laser/detector pairs | 150 | 0.9 | 0.03-0.2 | (H) 360, (V) 20 | Class A, 905 | N/A |
Acknowledgements: The present work has been funded by the COST Action CA16219 "HARMONIOUS - Harmonization of UAS techniques for agricultural and natural ecosystems monitoring". B. Tóth acknowledges financial support by the Hungarian National Research, Development and Innovation Office (NRDI) under grant KH124765. J. Müllerová was supported by projects GA17-13998S and RVO67985939.
References

Abrantes, J.R.C.B., J.L.M.P. de Lima, S.A. Prats, J.J. Keizer, 2017. Assessing soil water repellency spatial variability using a thermographic technique: an exploratory study using a small-scale laboratory soil flume. Geoderma, 287, 98-104. (doi: 10.1016/j.geoderma.2016.08.014)

Abrantes, J.R.C.B., R.B. Moruzzi, A. Silveira, J.L.M.P. de Lima, 2018. Comparison of thermal, salt and dye tracing to estimate shallow flow velocities: Novel triple tracer approach. Journal of Hydrology, 557, 362-377. (doi: 10.1016/j.jhydrol.2017.12.048)

Adão, T., Hruška, J., Pádua, L., Bessa, J., Peres, E., Morais, R., Sousa, J.J., 2017. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sensing, 9(11), 1110. (doi: 10.3390/rs9111110)

Ahmed, O.S., Shemrock, A., Chabot, D., Dillon, C., Williams, G., Wasson, R., Franklin, S.E., 2017. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. International Journal of Remote Sensing, 38(8-10), 2037-2052.

Ai, M., Hu, Q., Li, J., Wang, M., Yuan, H., Wang, S., 2015. A robust photogrammetric processing method of low-altitude UAV images. Remote Sensing, 7(3), 2302-2333.

Akar, O., 2017. Mapping land use with using Rotation Forest algorithm from UAV images. European Journal of Remote Sensing, 50(1), 269-279.

Aldana-Jague, E., Heckrath, G., Macdonald, A., van Wesemael, B., Van Oost, K., 2016. UAS-based soil carbon mapping using VIS-NIR (480-1000 nm) multi-spectral imaging: Potential and limitations. Geoderma, 275, 55-66.

Alvarez-Taboada, F., C. Paredes, J. Julián-Pelaz, 2017. Mapping of the Invasive Species Hakea sericea Using Unmanned Aerial Vehicle (UAV) and WorldView-2 Imagery and an Object-Oriented Approach. Remote Sensing, 9(9), 913. (doi: 10.3390/rs9090913)

Anderson, K., Gaston, K.J., 2013. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Frontiers in Ecology and the Environment, 11, 138-146. (doi: 10.1890/120150)

Atzberger, C., 2013. Advances in remote sensing of agriculture: Context description, existing operational monitoring systems and major information needs. Remote Sensing, 5, 949-981.

Baldwin, D., Manfreda, S., Keller, K., Smithwick, E.A.H., 2017. Predicting root zone soil moisture with soil properties and satellite near-surface moisture data at locations across the United States. Journal of Hydrology.

Baluja, J., M.P. Diago, P. Balda, R. Zorer, M. Meggio, F. Morales, J. Tardaguila, 2012. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrigation Science, 30, 511-522.

Bellvert, J., P.J. Zarco-Tejada, J. Girona, E. Fereres, 2014. Mapping crop water stress index in a 'Pinot-noir' vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precision Agriculture, 15, 361-376.

Ben-Dor, E., Banin, A., 1994. Visible and near-infrared (0.4-1.1 μm) analysis of arid and semiarid soils. Remote Sensing of Environment, 48(3), 261-274.

Ben-Dor, E., Banin, A., 1996. Evaluation of several soil properties using convolved TM spectra. In: Monitoring Soils in the Environment with Remote Sensing and GIS, ORSTOM Éditions, Paris, 135-149.

Ben-Dor, E., Chabrillat, S., Demattê, J.A.M., Taylor, G.R., Hill, J., Whiting, M.L., Sommer, S., 2009. Using imaging spectroscopy to study soil properties. Remote Sensing of Environment, 113, S38-S55.

Berni, J., Zarco-Tejada, P., Suarez, L., Gonzalez-Dugo, V., Fereres, E., 2008. Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVII, 6 p.

Brevis, W., Y. Niño, G.H. Jirka, 2011. Integrating cross-correlation and relaxation algorithms for particle tracking velocimetry. Experiments in Fluids, 50(1), 135-147.

Brook, A., Ben-Dor, E., 2011. Supervised vicarious calibration (SVC) of hyperspectral remote-sensing data. Remote Sensing of Environment, 115(6), 1543-1555.

Brook, A., Ben-Dor, E., 2015. Supervised vicarious calibration (SVC) of multi-source hyperspectral remote-sensing data. Remote Sensing, 7(5), 6196-6223.

Brook, A., Polinova, M., Ben-Dor, E., 2018. Fine tuning of the SVC method for airborne hyperspectral sensors: The BRDF correction of the calibration nets targets. Remote Sensing of Environment, 204, 861-871.

Bryson, M., A. Reid, F. Ramos, S. Sukkarieh, 2010. Airborne Vision-Based Mapping and Classification of Large Farmland Environments. Journal of Field Robotics, 27(5), 632-655. (doi: 10.1002/rob.20343)

Bueren, S.K., A. Burkart, A. Hueni, U. Rascher, M.P. Tuohy, I.J. Yule, 2015. Deploying four optical UAV-based sensors over grassland: challenges and limitations. Biogeosciences, 12, 163-175.

Burkart, A., Aasen, H., Alonso, L., Menz, G., Bareth, G., Rascher, U., 2015. Angular dependency of hyperspectral measurements over wheat characterized by a novel UAV based goniometer. Remote Sensing, 7, 725-746.

Calviño-Cancela, M., R. Mendez-Rial, J.R. Reguera-Salgado, J. Martín-Herrero, 2014. Alien plant monitoring with ultralight airborne imaging spectroscopy. PLoS ONE, 9(7), e102381.

Casagrande, G., Sik, A., Szabó, G. (Eds.), 2017. Small Flying Drones: Applications for Geographic Observation. Springer.

Chabot, D., Bird, D.M., 2012. Evaluation of an off-the-shelf unmanned aircraft system for surveying flocks of geese. Waterbirds, 35(1), 170-174.

Cohen, Y., V. Alchanatis, Y. Saranga, O. Rosenberg, E. Sela, A. Bosak, 2017. Mapping water status based on aerial thermal imagery: comparison of methodologies for upscaling from a single leaf to commercial fields. Precision Agriculture, 18, 801-822. (doi: 10.1007/s11119-016-9484-3)

Colomina, I., Molina, P., 2014. Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS Journal of Photogrammetry and Remote Sensing, 92, 79-97. (doi: 10.1016/j.isprsjprs.2013.02.013)

Costa, F.G., J. Ueyama, T. Braun, G. Pessin, F.S. Osorio, P.A. Vargas, 2012. The use of unmanned aerial vehicles and wireless sensor network in agricultural applications. Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2012), 22-27 July, pp. 5045-5048.

Cunliffe, A.M., Brazier, R.E., Anderson, K., 2016. Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry. Remote Sensing of Environment, 183, 129-143.

D'Addabbo, A., A. Refice, G. Pasquariello, F. Lovergine, D. Capolongo, S. Manfreda, 2016. A Bayesian Network for Flood Detection Combining SAR Imagery and Ancillary Data. IEEE Transactions on Geoscience and Remote Sensing, 54(6), 3612-3625. (doi: 10.1109/TGRS.2016.2520487)

d'Oleire-Oltmanns, S., I. Marzolff, K.D. Peter, J.B. Ries, 2012. Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sensing, 4, 3390-3416.

Dandois, J.P., E.C. Ellis, 2013. High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sensing of Environment, 136, 259-276.

de Lima, J.L.M.P., J.R.C.B. Abrantes, 2014a. Can infrared thermography be used to estimate soil surface microrelief and rill morphology? CATENA, 113, 314-322. (doi: 10.1016/j.catena.2013.08.011)

de Lima, J.L.M.P., J.R.C.B. Abrantes, 2014b. Using a thermal tracer to estimate overland and rill flow velocities. Earth Surface Processes and Landforms, 39(10), 1293-1300. (doi: 10.1002/esp.3523)

de Lima, J.L.M.P., J.R.C.B. Abrantes, V.P. Silva Jr, A.A.A. Montenegro, 2014a. Prediction of skin surface soil permeability by infrared thermography: a soil flume experiment. Quantitative Infrared Thermography Journal. (doi: 10.1080/17686733.2014.945325)

de Lima, J.L.M.P., J.R.C.B. Abrantes, V.P. Silva Jr, M.I.P. de Lima, A.A.A. Montenegro, 2014b. Mapping soil surface macropores using infrared thermography: an exploratory laboratory study. The Scientific World Journal, 2014, 845460, 8 pages. (doi: 10.1155/2014/845460)

Detert, M., V. Weitbrecht, 2015. A low-cost airborne velocimetry system: proof of concept. Journal of Hydraulic Research, 53(4), 532-539.

Dittmann, S., Thiessen, E., Hartung, E., 2017. Applicability of different non-invasive methods for tree mass estimation: A review. Forest Ecology and Management, 398, 208-215.

Drusch, M., U. Del Bello, S. Carlier, O. Colin, V. Fernandez, F. Gascon, B. Hoersch, et al., 2012. Sentinel-2: ESA's Optical High-Resolution Mission for GMES Operational Services. Remote Sensing of Environment, 120, 25-36. (doi: 10.1016/j.rse.2011.11.026)

Dufour, S., Bernez, I., Betbeder, J., Corgne, S., Hubert-Moy, L., Nabucet, J., ... Trollé, C., 2013. Monitoring restored riparian vegetation: how can recent developments in remote sensing sciences help? Knowledge and Management of Aquatic Ecosystems, (410), 10.

Dunford, R., Michel, K., Gagnage, M., Piégay, H., Trémelo, M.L., 2009. Potential and constraints of Unmanned Aerial Vehicle technology for the characterization of Mediterranean riparian forest. International Journal of Remote Sensing, 30(19), 4915-4935.

Dvořák, P., Müllerová, J., Bartaloš, T., Brůna, J., 2015. Unmanned aerial vehicles for alien plant species detection and monitoring. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 40(1), 83.

Eltner, A., Schneider, D., 2015. Analysis of Different Methods for 3D Reconstruction of Natural Surfaces from Parallel-Axes UAV Images. The Photogrammetric Record, 30, 279-299. (doi: 10.1111/phor.12115)

Erdelj, M., M. Król, E. Natalizio, 2017. Wireless sensor networks and multi-UAV systems for natural disaster management. Computer Networks, 124, 72-86. (doi: 10.1016/j.comnet.2017.05.021)

Esposito, F., G. Rufino, A. Moccia, P. Donnarumma, M. Esposito, V. Magliulo, 2007. An Integrated Electro-Optical Payload System for Forest Fires Monitoring from Airborne Platform. In: Proceedings of the IEEE Aerospace Conference, Big Sky, MT: IEEE, 1-13.

Feng, Q., J. Liu, J. Gong, 2015. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sensing, 7(1), 1074-1094.

Flynn, K.F., S.C. Chapra, 2014. Remote sensing of submerged aquatic vegetation in a shallow non-turbid river using an unmanned aerial vehicle. Remote Sensing, 6, 12815-12836.

Frankenberger, J.R., C. Huang, K. Nouwakpo, 2008. Low-altitude digital photogrammetry technique to assess ephemeral gully erosion. Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2008), 07-11 July 2008, Boston, Massachusetts, IV, 117-120.

Fujita, I., M. Muste, A. Kruger, 1997. Large-scale particle image velocimetry for flow analysis in hydraulic engineering applications. Journal of Hydraulic Research, 36(3), 397-414.

Fujita, I., T. Hino, 2003. Unseeded and seeded PIV measurements of river flows video from a helicopter. Journal of Visualization, 6(3), 245-252.

Fujita, I., Y. Kunita, 2011. Application of aerial LSPIV to the 2002 flood of the Yodo River using a helicopter mounted high density video camera. Journal of Hydro-Environment Research, 5(4), 323-331.

Gago, J., D. Douthe, I. Florez-Sarasa, J.M. Escalona, J. Galmes, A.R. Fernie, J. Flexas, H. Medrano, 2014. Opportunities for improving leaf water use efficiency under climate change conditions. Plant Science, 226, 108-119.

Gago, J., Douthe, C., Coopman, R., Gallego, P., Ribas-Carbo, M., Flexas, J., ... Medrano, H., 2015. UAVs challenge to assess water stress for sustainable agriculture. Agricultural Water Management, 153, 9-19.

Gay, A.P., Stewart, T.P., Angel, R., Easey, M., Eves, A.J., Thomas, N.J., ... Kemp, A.I., 2009. Developing unmanned aerial vehicles for local and flexible environmental and agricultural monitoring. In: Proceedings of the RSPSoc 2009 Annual Conference, Leicester, pp. 471-476.

Geipel, J., J. Link, W. Claupein, 2014. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sensing, 6, 10335-10355.

Getzin, S., R.S. Nuske, K. Wiegand, 2014. Using unmanned aerial vehicles (UAV) to quantify spatial gap patterns in forests. Remote Sensing, 6(8), 6988-7004.

Getzin, S., Wiegand, K., Schöning, I., 2012. Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods in Ecology and Evolution, 3(2), 397-404.

Gigante, V., P. Milella, V. Iacobellis, S. Manfreda, I. Portoghese, 2009. Influences of Leaf Area Index estimations on the soil water balance predictions in Mediterranean regions. Natural Hazards and Earth System Sciences, 9, 979-991. (doi: 10.5194/nhess-9-979-2009)

Gillespie, T.W., J. Chu, E. Frankenberg, D. Thomas, 2007. Assessment and Prediction of Natural Hazards from Satellite Imagery. Progress in Physical Geography, 31(5), 459-470. (doi: 10.1177/0309133307083296)

Gonçalves, J., Henriques, R., Alves, P., Sousa-Silva, R., Monteiro, A.T., Lomba, Â., ... Honrado, J., 2016. Evaluating an unmanned aerial vehicle-based approach for assessing habitat extent and condition in fine-scale early successional mountain mosaics. Applied Vegetation Science, 19(1), 132-146.

Gonzalez-Dugo, V., P. Zarco-Tejada, D. Goldhamer, E. Fereres, 2014. Improving the precision of irrigation in a pistachio farm using an unmanned airborne thermal system. Irrigation Science, 33(1), 43-52.

Gonzalez-Dugo, V., P. Zarco-Tejada, E. Nicolas, P.A. Nortes, J.J. Alarcon, D.S. Intrigliolo, E. Fereres, 2013. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precision Agriculture, 14(6), 660-678.

Hand, E., 2015. Startup Liftoff. Science, 348(6231), 172-177. (doi: 10.1126/science.348.6231.172)

Hassan-Esfahani, L., A. Torres-Rua, A. Jensen, M. McKee, 2015. Assessment of Surface Soil Moisture Using High Resolution Multi-Spectral Imagery and Artificial Neural Networks. Remote Sensing, 7, 2627-2646.

Helman, D., 2018. Land surface phenology: What do we really 'see' from space? Science of the Total Environment, 618, 665-673. (doi: 10.1016/j.scitotenv.2017.07.237)

Helman, D., Givati, A., Lensky, I.M., 2015. Annual evapotranspiration retrieved from satellite vegetation indices for the Eastern Mediterranean at 250 m spatial resolution. Atmospheric Chemistry and Physics, 15, 12567-12579.

Helman, D., Lensky, I.M., Osem, Y., Rohatyn, S., Rotenberg, E., Yakir, D., 2017. A biophysical approach using water deficit factor for daily estimations of evapotranspiration and CO2 uptake in Mediterranean environments. Biogeosciences, 14, 3909-3926.

Hervouet, A., R. Dunford, H. Piegay, B. Belletti, M.L. Tremelo, 2011. Analysis of post-flood recruitment patterns in braided channel rivers at multiple scales based on an image series collected by unmanned aerial vehicles, ultralight aerial vehicles, and satellites. GIScience and Remote Sensing, 48, 50-73.

Hill, D.J., C. Tarasoff, G.E. Whitworth, J. Baron, J.L. Bradshaw, J.S. Church, 2017. Utility of unmanned aerial vehicles for mapping invasive plant species: a case study on yellow flag iris (Iris pseudacorus L.). International Journal of Remote Sensing, 38(8-10), 2083-2105. (doi: 10.1080/01431161.2016.1264030)

Hmimina, G., Dufrene, E., Pontailler, J.Y., Delpierre, N., Aubinet, M., Caquet, B., de Grandcourt, A.S., Burban, B.T., Flechard, C., Granier, A., 2013. Evaluation of the potential of MODIS satellite data to predict vegetation phenology in different biomes: An investigation using ground-based NDVI measurements. Remote Sensing of Environment, 132, 145-158.

Huang, Y., S.J. Thomson, W.C. Hoffmann, Y. Lan, B.K. Fritz, 2013. Development and prospect of unmanned aerial vehicle technologies for agricultural production management. International Journal of Agricultural and Biological Engineering, 6(3), 1-10.

Hung, C., Z. Xu, S. Sukkarieh, 2014. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a UAV. Remote Sensing, 6, 12037-12054.

Hunt, E.R., W.D. Hively, S.J. Fujikawa, D.S. Linden, C.S.T. Daughtry, G.W. McCarty, 2010. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sensing, 2(1), 290-305. (doi: 10.3390/rs2010290)

Jackson, R.D., S.B. Idso, R.J. Reginato, 1981. Canopy temperature as a crop water stress indicator. Water Resources Research, 17, 1133-1138.

James, M.R., S. Robson, 2014. Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surface Processes and Landforms, 39, 1413-1420. (doi: 10.1002/esp.3609)

James, M.R., S. Robson, M.W. Smith, 2017b. 3-D uncertainty-based topographic change detection with structure-from-motion photogrammetry: precision maps for ground control and directly georeferenced surveys. Earth Surface Processes and Landforms, 42(12), 1769-1788. (doi: 10.1002/esp.4125)

James, M.R., S. Robson, S. d'Oleire-Oltmanns, U. Niethammer, 2017a. Optimising UAV topographic surveys processed with structure-from-motion: ground control quality, quantity and bundle adjustment. Geomorphology, 280, 51-66. (doi: 10.1016/j.geomorph.2016.11.021)

Jannoura, R., K. Brinkmann, D. Uteau, C. Bruns, R.G. Joergensen, 2015. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosystems Engineering, 129, 341-351.

Jensen, T., A. Apan, F. Young, L. Zeller, 2007. Detecting the Attributes of a Wheat Crop Using Digital Imagery Acquired from a Low-Altitude Platform. Computers and Electronics in Agriculture, 59(1-2), 66-77. (doi: 10.1016/j.compag.2007.05.004)

Jeunnette, M.N., D.P. Hart, 2016. Remote sensing for developing world agriculture: opportunities and areas for technical development. Proc. SPIE 9998, Remote Sensing for Agriculture, Ecosystems, and Hydrology XVIII, 99980Y. (doi: 10.1117/12.2241321)

Jhan, J.-P., J.-Y. Rau, N. Haala, M. Cramer, 2017. Investigation of parallax issues for multi-lens multispectral camera band co-registration. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLII-2/W6, 157-163. (doi: 10.5194/isprs-archives-XLII-2-W6-157-2017)

Johnson, L.F., S. Herwitz, S. Dunagan, B. Lobitz, D. Sullivan, R. Slye, 2003. Collection of Ultra High Spatial and Spectral Resolution Image Data over California Vineyards with a Small UAV. In: Proceedings of the 30th International Symposium on Remote Sensing of Environment, Honolulu, HI, 845-849.

Jones, G.P., L.G. Pearlstine, H.F. Percival, 2006. An assessment of small unmanned aerial vehicles for wildlife research. Wildlife Society Bulletin, 34(3), 750-758.

Joyce, K.E., S.E. Belliss, S.V. Samsonov, S.J. McNeill, P.J. Glassey, 2009. A Review of the Status of Satellite Remote Sensing and Image Processing Techniques for Mapping Natural Hazards and Disasters. Progress in Physical Geography, 33(2), 183-207. (doi: 10.1177/0309133309339563)

Klemas, V.V., 2015. Coastal and Environmental Remote Sensing from Unmanned Aerial Vehicles: An Overview. Journal of Coastal Research, 31(5), 1260-1267.

Klosterman, S., Richardson, A.D., 2017. Observing Spring and Fall Phenology in a Deciduous Forest with Aerial Drone Imagery. Sensors, 17(12), 2852.

Knoth, C., Klein, B., Prinz, T., Kleinebecker, T., 2013. Unmanned aerial vehicles as innovative remote sensing platforms for high-resolution infrared imagery to support restoration monitoring in cut-over bogs. Applied Vegetation Science, 16(3), 509-517.

Koh, L.P., Wich, S.A., 2012. Dawn of drone ecology: low-cost autonomous aerial vehicles for conservation. Tropical Conservation Science, 5(2), 121-132.

Laliberte, A.S., Goforth, M.A., Steele, C.M., Rango, A., 2011. Multispectral Remote Sensing from Unmanned Aircraft: Image Processing Workflows and Applications for Rangeland Environments. Remote Sensing, 3(11), 2529-2551. (doi: 10.3390/rs3112529)

Langridge, M., L. Edwards, 2017. Future batteries, coming soon: Charge in seconds, last months and power over the air. Gadgets (online, accessed 13 February 2017).

Lehmann, J.R.K., Nieberding, F., Prinz, T., Knoth, C., 2015. Analysis of unmanned aerial system-based CIR images in forestry - A new perspective to monitor pest infestation levels. Forests, 6(3), 594-612.

Lehmann, J.R., Prinz, T., Ziller, S.R., Thiele, J., Heringer, G., Meira-Neto, J.A., Buttschardt, T.K., 2017. Open-source processing and analysis of aerial imagery acquired with a low-cost unmanned aerial system to support invasive plant management. Frontiers in Environmental Science, 5, 44.

Li, N., D. Zhou, F. Duan, S. Wang, Y. Cui, 2010. Application of unmanned airship image system and processing techniques for identifying of fresh water wetlands at a community scale. Proceedings of the IEEE 18th Geoinformatics International Conference, 18-20 June, Beijing, China, 5 p.

Link, J., D. Senner, W. Claupein, 2013. Developing and evaluating an aerial sensor platform (ASP) to collect multispectral data for deriving management decisions in precision farming. Computers and Electronics in Agriculture, 94, 20-28.

Liu, Z., J. Wu, H. Yang, B. Li, Y. Zhang, S. Yang, 2009. Developing unmanned airship onboard multispectral imagery system for quick-response to drinking water pollution. Proceedings of SPIE 7494, MIPPR 2009: Multispectral Image Acquisition and Processing (J.K. Udupa, N. Sang, L.G. Nyul, H.T. Yichang, editors), China. (doi: 10.1117/12.833451)

Lucieer, A., Z. Malenovský, T. Veness, L. Wallace, 2014. HyperUAS - imaging spectroscopy from a multirotor unmanned aircraft system. Journal of Field Robotics, 31(4), 571-590. (doi: 10.1002/rob.21508)

Ludovisi, R., F. Tauro, R. Salvati, S. Khoury, G. Scarascia Mugnozza, A. Harfouche, 2017. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought. Frontiers in Plant Science, 8, 1681. (doi: 10.3389/fpls.2017.01681)

Malos, J., B. Beamish, L. Munday, P. Reid, C. James, 2013. Remote Monitoring of Subsurface Heatings in Opencut Coal Mines. In: Proceedings of the 13th Coal Operators' Conference, Wollongong: University of Wollongong, 227-231.

Manfreda, S., K.K. Caylor, S. Good, 2017. An Ecohydrological framework to explain shifts in vegetation organization across climatological gradients. Ecohydrology, 10(3), 1-14. (doi: 10.1002/eco.1809)

Manfreda, S., K.K. Caylor, 2013. On The Vulnerability of Water Limited Ecosystems to Climate Change. Water, 5(2), 819-833. (doi: 10.3390/w5020819)

Manfreda, S., L. Brocca, T. Moramarco, F. Melone, J. Sheffield, 2014. A physically based approach for the estimation of root-zone soil moisture from surface measurements. Hydrology and Earth System Sciences, 18, 1199-1212. (doi: 10.5194/hess-18-1199-2014)

Matese, A., P. Toscano, S. Filippo Di Gennaro, L. Genesio, F. Vaccari, J. Primicerio, C. Belli, A. Zaldei, R. Bianconi, B. Gioli, 2015. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sensing, 7, 2971-2990.

McCabe, M.F., B. Aragon, R. Houborg, J. Mascaro, 2017b. CubeSats in hydrology: Ultrahigh-resolution insights into vegetation dynamics and terrestrial evaporation. Water Resources Research, 53, 10017-10024. (doi: 10.1002/2017WR022240)

McCabe, M.F., M. Rodell, D.E. Alsdorf, D.G. Miralles, R. Uijlenhoet, W. Wagner, A. Lucieer, R. Houborg, N.E.C. Verhoest, T.E. Franz, J. Shi, H. Gao, E.F. Wood, 2017a. The future of Earth observation in hydrology. Hydrology and Earth System Sciences, 21, 3879-3914. (doi: 10.5194/hess-21-3879-2017)

McGwire, K.C., M.A. Weltz, J.A. Finzel, C.E. Morris, L.F. Fenstermaker, D.S. McGraw, 2013. Multiscale Assessment of Green Leaf Cover in a Semi-Arid Rangeland with a Small Unmanned Aerial Vehicle. International Journal of Remote Sensing, 34(5), 1615-1632. (doi: 10.1080/01431161.2012.723836)

McKenna, P., P.D. Erskine, A.M. Lechner, S. Phinn, 2017. Measuring fire severity using UAV imagery in semi-arid central Queensland, Australia. International Journal of Remote Sensing, 38(14), 4244-4264.

Merino, L., J.R. Martinez-de-Dios, A. Ollero, 2015. Cooperative unmanned aerial systems for fire detection, monitoring and extinguishing. In: Handbook of Unmanned Aerial Vehicles (K.P. Valavanis and G.J. Vachtsevanos, editors), Springer, New York, pp. 2693-2722.

Mesas-Carrascosa, F.J., Rumbao, I.C., Berrocal, J.A.B., Porras, A.G.F., 2014. Positional quality assessment of orthophotos obtained from sensors on board multi-rotor UAV platforms. Sensors, 14(12), 22394-22407.

Michez, A., H. Piégay, L. Jonathan, H. Claessens, P. Lejeune, 2016. Mapping of riparian invasive species with supervised classification of Unmanned Aerial System (UAS) imagery. International Journal of Applied Earth Observation and Geoinformation, 44, 88-94.

Minařík, R., J. Langhammer, 2016. Use of a multispectral UAV photogrammetry for detection and tracking of forest disturbance dynamics. International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences, 41.

Müllerová, J., Brůna, J., Bartaloš, T., Dvořák, P., Vítková, M., Pyšek, P., 2017b. Timing is important: unmanned aircraft versus satellite imagery in plant invasion monitoring. Frontiers in Plant Science, 8, 887. (doi: 10.3389/fpls.2017.00887)

Müllerová, J., T. Bartaloš, J. Brůna, P. Dvořák, M. Vítková, 2017a. Unmanned aircraft in nature conservation: an example from plant invasions. International Journal of Remote Sensing, 38(8-10), 2177-2198. (doi: 10.1080/01431161.2016.1275059)

NASA, 2015. Hurricane and Severe Storm Sentinel (HS3) Mission. URL: http://www.nasa.gov/mission_pages/hurricanes/missions/hs3/news/hs3.html (last accessed: 20 February 2015).

Niethammer, U., M.R. James, S. Rothmund, J. Travelletti, M. Joswig, 2012. UAV-based remote sensing of the Super-Sauze landslide: Evaluation and results. Engineering Geology, 128, 2-11.

Otero, V., Van De Kerchove, R., Satyanarayana, B., Martínez-Espinosa, C., Amir Bin Fisol, M., Rodila Bin Ibrahim, M., Sulong, I., Mohd-Lokman, H., Lucas, R., Dahdouh-Guebas, F., 2018. Managing mangrove forests from the sky: forest inventory using field data and Unmanned Aerial Vehicle (UAV) imagery in the Matang Mangrove Forest Reserve, Peninsular Malaysia. Forest Ecology and Management, 411, 35-45.

Pajares, G., 2015. Overview and Current Status of Remote Sensing Applications Based on Unmanned Aerial Vehicles (UAVs). Photogrammetric Engineering & Remote Sensing, 81(4), 281-329.

Paneque-Gálvez, J., M.K. McCall, B.M. Napoletano, S.A. Wich, L.P. Koh, 2014. Small drones for community-based forest monitoring: An assessment of their feasibility and potential in tropical areas. Forests, 5(6), 1481-1507.

Pena, J.M., J. Torres-Sanchez, A. Serrano-Perez, A.I. de Castro, F. Lopez-Granados, 2015. Quantifying Efficacy and Limits of Unmanned Aerial Vehicle (UAV) Technology for Weed Seedling Detection as Affected by Sensor Resolution. Sensors, 15, 5609-5626.

Pena, J.M., J. Torres-Sanchez, A.I. de Castro, M. Kelly, F. Lopez-Granados, 2013. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE, 8(10), e77151.

Peppa, M., J.P. Mills, P. Moore, P.E. Miller, J.C. Chambers, 2016. Accuracy assessment of a UAV-based landslide monitoring system. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLI-B5, 895-902. (doi: 10.5194/isprsarchives-XLI-B5-895-2016)

Perks, M.T., Russell, A.J., Large, A.R.G., 2016. Technical Note: Advances in flash flood monitoring using unmanned aerial vehicles (UAVs). Hydrology and Earth System Sciences, 20(10), 4005-4015.

Primicerio, J., S.F. Di Gennaro, E. Fiorillo, L. Genesio, E. Lugato, A. Matese, F.P. Vaccari, 2012. A Flexible Unmanned Aerial Vehicle for Precision Agriculture. Precision Agriculture, 13(4), 517-523. (doi: 10.1007/s11119-012-9257-6)

Puliti, S., H.O. Ørka, T. Gobakken, E. Næsset, 2015. Inventory of small forest areas using an unmanned aerial system. Remote Sensing, 7(8), 9632-9654.

Quaritsch, M., K. Kruggl, D. Wischounig-Strucl, S. Bhattacharya, M. Shah, B. Rinner, 2010. Networked UAVs as aerial sensor network for disaster management applications. Elektrotechnik & Informationstechnik, 127(3), 56-63.

Quilter, M.C., Anderson, V.J., 2000. Low altitude/large scale aerial photographs: A tool for range and resource managers. Rangelands, 22(2), 13-17.

Quiquerez, A., Chevigny, E., Allemand, P., Curmi, P., Petit, C., Grandjean, P., 2014. Assessing the impact of soil surface characteristics on vineyard erosion from very high spatial resolution aerial images (Côte de Beaune, Burgundy, France). Catena, 116, 163-172.

Reid, A., F. Ramos, S. Sukkarieh, 2011. Multi-class classification of vegetation in natural environments using an unmanned aerial system. Proceedings of the IEEE International Conference on Robotics and Automation, 09-13 May, Shanghai, China, pp. 2953-2959.

Reif, M.K., Theel, H.J., 2017. Remote sensing for restoration ecology: Application for restoring degraded, damaged, transformed, or destroyed ecosystems. Integrated Environmental Assessment and Management, 13(4), 614-630.

Rocchini, D., Andreo, V., Förster, M., Garzon-Lopez, C.X., Gutierrez, A.P., Gillespie, T.W., ... Marcantonio, M., 2015. Potential of remote sensing to predict species invasions: A modelling perspective. Progress in Physical Geography, 39(3), 283-309.

Saberioon, M.M., M.S.M. Amina, A.R. Anuar, A. Gholizadeh, A. Wayayokd, S. Khairunniza-Bejo, 2014. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. International Journal of Applied Earth Observation and Geoinformation, 32, 35-45.

Samseemoung, G., P. Soni, H.P.W. Jayasuriya, V.M. Salokhe, 2012. An application of low altitude remote sensing (LARS) platform for monitoring crop growth and weed infestation in a soybean plantation. Precision Agriculture, 13, 611-627.

Sankey, T., J. Donager, J. McVay, J.B. Sankey, 2017. UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA. Remote Sensing of Environment, 195, 30-43. (doi: 10.1016/j.rse.2017.04.007)

Sanyal, J., X.X. Lu, 2004. Application of Remote Sensing in Flood Management with Special Reference to Monsoon Asia: A Review. Natural Hazards, 33(2), 283-301. (doi: 10.1023/B:NHAZ.0000037035.65105.95)

Shahbazi, M., Théau, J., Ménard, P., 2014. Recent applications of unmanned aerial imagery in natural resource management. GIScience & Remote Sensing, 51(4).

Sieberth, T., R. Wackrow, J.H. Chandler, 2016. Automatic detection of blurred images in UAV image sets. ISPRS Journal of Photogrammetry and Remote Sensing, 122, 1-16. (doi: 10.1016/j.isprsjprs.2016.09.010)

Singh, K.K., A.E. Frazier, 2018. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. International Journal of Remote Sensing. (doi: 10.1080/01431161.2017.1420941)

Smigaj, M., R. Gaulton, J.C. Suarez, S.L. Barr, 2017. Use of miniature thermal cameras for detection of physiological stress in conifers. Remote Sensing, 9(9). (doi: 10.3390/rs9090957)

Soriano-Disla, J.M., L.J. Janik, R.A. Viscarra Rossel, L.M. Macdonald, M.J. McLaughlin, 2014. The performance of visible, near-, and mid-infrared reflectance spectroscopy for prediction of soil physical, chemical, and biological properties. Applied Spectroscopy Reviews, 49(2), 139-186.

Stone, H., D. D'Ayala, S. Wilkinson, 2017. The use of emerging technology in post-disaster reconnaissance missions. EEFIT Report, Institution of Structural Engineers, London, 25 pp.

Sullivan, D.G., J.P. Fulton, J.N. Shaw, G. Bland, 2007. Evaluating the sensitivity of an unmanned thermal infrared aerial system to detect water stress in a cotton canopy. Transactions of the American Society of Agricultural Engineers, 50(6), 1955-1962.

Syvitski, J.P.M., I. Overeem, G.R. Brakenridge, M. Hannon, 2012. Floods, Floodplains, Delta Plains - A Satellite Imaging Approach. Sedimentary Geology, 267-268, 1-14. (doi: 10.1016/j.sedgeo.2012.05.014)

Tahar, K.N., A. Ahmad, W. Akib, 2011. Unmanned aerial vehicle technology for low cost landslide mapping. Proceedings of the 11th South East Asian Survey Congress and 13th International Surveyors Congress, Kuala Lumpur, pp. 22-31.

Tahar, K.N., A. Ahmad, W. Akib, W. Mohd, 2012. A new approach on production of slope map using autonomous unmanned aerial vehicle. International Journal of Physical Sciences, 7(42), 5678-5686.

Tang, L., G. Shao, 2015. Drone remote sensing for forestry research and practices. Journal of Forestry Research, 26(4), 791-797.

Tauro, F., A. Petroselli, E. Arcangeletti, 2016b. Assessment of drone-based surface flow observations. Hydrological Processes, 30(7), 1114-1130.

Tauro, F., R. Piscopia, S. Grimaldi, accepted. Streamflow observations from cameras: Large Scale Particle Image Velocimetry or Particle Tracking Velocimetry? Water Resources Research. (doi: 10.1002/2017WR020848)

Tauro, F., C. Pagano, P. Phamduy, S. Grimaldi, M. Porfiri, 2015. Large-scale particle image velocimetry from an unmanned aerial vehicle. IEEE/ASME Transactions on Mechatronics, 20(6), 3269-3275.

Tauro, F., M. Porfiri, S. Grimaldi, 2016a. Surface flow measurements from drones. Journal of Hydrology, 540, 240-245.

Torres-Sanchez, J., J.M. Pena, A.I. de Castro, F. Lopez-Granados, 2014. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture, 103, 104-113.

Torresan, C., Berton, A., Carotenuto, F., Di Gennaro, S.F., Gioli, B., Matese, A., Miglietta, F., Vagnoli, C., Zaldei, A., Wallace, L., 2017. Forestry applications of UAVs in Europe: a review. International Journal of Remote Sensing, 38, 2427-2447.

Tralli, D.M., Blom, R.G., Zlotnicki, V., Donnellan, A., Evans, D.L., 2005. Satellite Remote Sensing of Earthquake, Volcano, Flood, Landslide and Coastal Inundation Hazards. ISPRS Journal of Photogrammetry and Remote Sensing, 59(4), 185-198. (doi: 10.1016/j.isprsjprs.2005.02.002)

Urbahs, A., Jonaite, I., 2013. Features of the use of unmanned aerial vehicles for agriculture applications. Aviation, 17(4), 170-175.

Uto, K., Seki, H., Saito, G., Kosugi, Y., 2013. Characterization of rice paddies by a UAV-mounted miniature hyperspectral sensor system. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 6, 851-860.

Ventura, D., Bonifazi, A., Gravina, M.F., Ardizzone, G.D., 2017. Unmanned Aerial Systems (UASs) for Environmental Monitoring: A Review with Applications in Coastal Habitats. In: Aerial Robots - Aerodynamics, Control and Applications (O.D. Lopez Mejia, editor), InTech. (doi: 10.5772/intechopen.69598)

van der Wal, T., B. Abma, A. Viguria, E. Previnaire, P.J. Zarco-Tejada, P. Serruys, E. van Valkengoed, P. van der Voet, 2013. Fieldcopter: Unmanned aerial systems for crop monitoring services. In: Precision Agriculture '13 (J.V. Stafford, editor), pp. 169-165.

Watts, A.C., V.G. Ambrosia, E.A. Hinkley, 2012. Unmanned aircraft systems in remote sensing and scientific research: Classification and Considerations of Use. Remote Sensing, 4, 1671-1692.

Wawrzyniak, V., Piegay, H., Allemand, P., Vaudor, L., Grandjean, P., 2013. Prediction of water temperature heterogeneity of braided rivers using very high resolution thermal infrared (TIR) images. International Journal of Remote Sensing, 34(13), 4812-4831.

Whitehead, K., C.H. Hugenholtz, 2014. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 1: a review of progress and challenges. Journal of Unmanned Vehicle Systems, 2, 69-85. (doi: 10.1139/juvs-2014-0006)

Whitehead, K., Hugenholtz, C.H., Myshak, S., Brown, O., LeClair, A., Tamminga, A., ... Eaton, B., 2014. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 2: scientific and commercial applications. Journal of Unmanned Vehicle Systems, 2(3), 86-102. (doi: 10.1139/juvs-2014-0007)

Wigmore, O., Bryan, M., 2017. Monitoring tropical debris-covered glacier dynamics from high-resolution unmanned aerial vehicle photogrammetry, Cordillera Blanca, Peru. The Cryosphere, 11(6), 2463-2480. (doi: 10.5194/tc-11-2463-2017)

Witte, B.M., Singler, R.F., Bailey, S.C.C., 2017. Development of an Unmanned Aerial Vehicle for the Measurement of Turbulence in the Atmospheric Boundary Layer. Atmosphere, 8, 195.

Yilmaz, K.K., R.F. Adler, Y. Tian, Y. Hong, H.F. Pierce, 2010. Evaluation of a Satellite-Based Global Flood Monitoring System. International Journal of Remote Sensing, 31(14), 3763-3782. (doi: 10.1080/01431161.2010.483489)

Zarco-Tejada, P.J., A. Catalina, M.R. Gonzalez, P. Martin, 2013a. Relationships between net photosynthesis and steady-state chlorophyll fluorescence retrieved from airborne hyperspectral imagery. Remote Sensing of Environment, 136, 247-258.

Zarco-Tejada, P.J., Gonzalez-Dugo, V., Williams, L.E., Suarez, L., Berni, J.A.J., Goldhamer, D., Fereres, E., 2013e. A PRI-based water stress index combining structural and chlorophyll effects: Assessment using diurnal narrow-band airborne imagery and the CWSI thermal index. Remote Sensing of Environment, 138, 38-50.

Zarco-Tejada, P.J., M.L. Guillen-Climent, R. Hernandez-Clemente, A. Catalina, M.R. Gonzalez, P. Martin, 2013c. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agricultural and Forest Meteorology, 171-172, 281-294.

Zarco-Tejada, P.J., Suarez, L., Gonzalez-Dugo, V., 2013b. Spatial resolution effects on chlorophyll fluorescence retrieval in a heterogeneous canopy using hyperspectral imagery and radiative transfer simulation. IEEE Geoscience and Remote Sensing Letters, 10(4), 937-941.

Zarco-Tejada, P.J., V. Gonzalez-Dugo, J.A.J. Berni, 2012. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sensing of Environment, 117, 322-337.

Zhang, C., Kovacs, J.M., 2012. The application of small unmanned aerial systems for precision agriculture: a review. Precision Agriculture, 13, 693. (doi: 10.1007/s11119-012-9274-5)

Zhang, C., Walters, D., Kovacs, J.M., 2014. Applications of low altitude remote sensing in agriculture upon farmer requests - A case study in northeastern Ontario, Canada. PLoS ONE, 9(11), e112894.

Zhu, J., Wang, K., Deng, J., Harmon, T., 2009. Quantifying Nitrogen Status of Rice Using Low Altitude UAV-Mounted System and Object-Oriented Segmentation Methodology. In: Proceedings of the ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, San Diego, CA: ASME, 1-7.