ULTRALIGHT RADAR FOR SMALL AND MICRO-UAV NAVIGATION
A. F. Scannapieco a*, A. Renga a, G. Fasano a, A. Moccia a
a DII - Department of Industrial Engineering, University of Naples Federico II, Piazzale Tecchio, 80 - 80125 Naples, Italy -
(antoniofulvio.scannapieco, alfredo.renga, giancarmine.fasano, antonio.moccia)@unina.it
* Corresponding author
KEY WORDS: UAV, Navigation, Radar
ABSTRACT:
This paper presents a radar approach to the navigation of small and micro Unmanned Aerial Vehicles (UAV) in environments challenging
for common sensors. A technique based on radar odometry is briefly explained and schemes for complete integration with other sensors
are proposed. The focus of the paper is set on ultralight radars and on the interpretation of the outputs of such sensors when dealing with
autonomous navigation in complex scenarios. The experimental setup used to analyse the proposed approach comprises one multi-rotor
UAV and one ultralight commercial radar. Results from flight tests involving both forward-only motion and mixed motion are presented
and analysed, providing a reference for understanding radar outputs in complex scenarios. The radar odometry solution is compared
with the ground truth provided by a GPS sensor.
1. INTRODUCTION
Autonomous navigation is one of the most investigated fields in Unmanned Aerial Vehicle (UAV) research. It typically relies on fused measurements from both an Inertial Measurement Unit (IMU) and a GPS sensor, the drift of the IMU being removed via a Kalman Filter (Quist et al., 2016a). However, the GPS signal is often neither available nor reliable during operations involving small or micro UAV. This represents an issue for navigation, since the IMU housed onboard the platform may produce fast-growing errors due to its limited performance. Common approaches to the problem of GPS-denied navigation exploit electro-optical sensors and involve algorithms based on visual odometry (VO) (Nister et al., 2004) and LIDAR (Zhang and Singh, 2017). The effectiveness of these techniques can be hindered by difficulties with installation onboard micro-UAV, as when dealing with LIDAR, and especially by adverse visibility conditions, such as smoke or dust. Radar provides information relevant to navigation but is independent of illumination conditions. Furthermore, the current level of miniaturization and the increasing interest from different fields, e.g. the aerospace and automotive sectors, are accelerating the integration of ultralight radars onboard small and micro UAV in terms of size, weight and power (SWaP) (Moses et al., 2011; Fasano et al., 2017).
Radar odometry for UAV navigation has been recently proposed, among others, by Kauffman et al. (2013), Quist et al. (2016a, 2016b), and Scannapieco et al. (2017). In particular, Kauffman et al. (2013) simulated a two-dimensional navigation solution based on data from both a side-looking radar housed on a fixed-wing UAV and an IMU. The approaches in (Quist et al., 2016a) and (Quist et al., 2016b) are tailored to fixed-wing UAV and to data acquired by a high-performance Synthetic Aperture Radar (SAR) flown on a Cessna aircraft. An approach oriented towards small and micro UAV, which can move in different directions with different speeds, hover, or even exhibit attitude rotations only, at very low altitude in a GPS-challenging scenario, is presented in (Scannapieco et al., 2017).
That work is oriented to existing commercial ultralight radars and to environments that can be significantly cluttered, hindering the reliable extraction of many strong and stable scatterers. Indeed, challenges for radar navigation may arise when a large number of radar reflectors are present in the scene. In addition, the relative attitude between the radar antennas and the physical objects, as well as the operational wavelength, can affect the visibility of some targets (Knott, 1990).
The aims of this work are to present a strategy for the navigation of small and micro UAV with ultralight radars in challenging scenarios and to show how radar outputs, which are not always straightforward to interpret, differ depending on the scene.
The paper is organized as follows: Section 2 illustrates the principles of radar-aided navigation; Section 3 provides and explains the results of the experimental campaigns; finally, Section 4 draws conclusions.
2. RADAR-AIDED NAVIGATION
2.1 Radar Odometry
Radar odometry exploits information on fixed and strong targets in the scene to retrieve the ownship motion of the platform. Given a radar model, the main steps before applying odometric algorithms are target detection and multiple-target tracking (MTT) (Scannapieco et al., 2017). In this respect, it is worth mentioning that an advantage of radar-based odometry over vision-based systems is the direct access to range information, which prevents scale drift phenomena.
Currently, most lightweight radars exploit Frequency Modulated Continuous Wave (FMCW) technology, owing to inherent features of this scheme that allow small and light high-range-resolution sensors with limited energy consumption (Scannapieco et al., 2015). Therefore, the radar model used in this work is an FMCW radar with a single transmitting (Tx) antenna and two receiving (Rx) antennas separated in the azimuth direction. Two-dimensional information, i.e. range and azimuth angle of targets, is retrieved from the observation of the scene via phase interferometry. This mode of operation differs from both 2D LIDAR and 3D LIDAR (Scannapieco et al., 2017), and it is worth highlighting that range and angular information are provided with very different accuracies and that phase interferometry assumes a single target at each range (Scannapieco et al., 2017).
Target detection develops in two phases: range-bearing estimation and feature extraction. At each time step n, a Fast Fourier Transform (FFT) is applied to the output of the radar to extract the range content. The resulting complex discrete frequency-domain signal for the a-th channel can be expressed as
$S_a(f_r, n) = A_a(f_r, n)\,\exp\{j\,\phi_a(f_r, n)\}$    (1)

where
$f_r$ = range frequency
$A_a(f_r, n)$ = magnitude component
$\phi_a(f_r, n)$ = phase component
Frequency is directly proportional to range in FMCW systems.
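As a minimal sketch of this step, assuming a deramped FMCW architecture with real-valued beat samples and hypothetical parameters (ADC sample rate, swept bandwidth, sweep duration; none of these values are reported in the paper), the range profile of equation (1) could be computed as follows:

```python
import numpy as np

C = 3.0e8  # speed of light [m/s]

def range_profile(beat_samples, fs, bandwidth, sweep_time):
    """Compute the complex range profile S_a(f_r, n) of one FMCW sweep.

    beat_samples : 1-D array of deramped (beat) samples for one Rx channel
    fs           : ADC sample rate [Hz] (hypothetical parameter)
    bandwidth    : swept RF bandwidth B [Hz] (hypothetical parameter)
    sweep_time   : sweep duration T [s] (hypothetical parameter)
    """
    n = len(beat_samples)
    # FFT of the windowed beat signal extracts the range content (Eq. 1)
    spectrum = np.fft.rfft(beat_samples * np.hanning(n))
    beat_freq = np.fft.rfftfreq(n, d=1.0 / fs)
    # In FMCW, beat frequency is proportional to range: R = c * f_b * T / (2 * B)
    ranges = C * beat_freq * sweep_time / (2.0 * bandwidth)
    magnitude = np.abs(spectrum)   # A_a(f_r, n)
    phase = np.angle(spectrum)     # phi_a(f_r, n)
    return ranges, magnitude, phase
```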
Then, when both receiving channels are enabled, the phase components can be processed via the formula

$\theta(f_r, n) = \arcsin\!\left(\dfrac{\lambda\,\Delta\phi(f_r, n)}{2\pi L}\right)$    (2)

to estimate the bearing angle, where $\Delta\phi(f_r, n)$ is the phase difference between the two receiving channels. In equation (2), $\lambda$ is the wavelength and $L$ is the separation between the phase centres of the antennas. Since the separation between the receivers is very small compared with the range resolution, the magnitude components of the two signals (1) are averaged non-coherently to achieve partial clutter suppression, thus obtaining the value $A(f_r, n)$.
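A minimal sketch of the interferometric bearing estimation of equation (2) and of the non-coherent magnitude averaging, assuming the two complex range profiles are already available (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def bearing_and_magnitude(s1, s2, wavelength, baseline):
    """Estimate bearing via phase interferometry (Eq. 2) and average magnitudes.

    s1, s2     : complex range profiles of the two Rx channels for one sweep
    wavelength : radar wavelength lambda [m]
    baseline   : separation L between the antenna phase centres [m]
    """
    # Interferometric phase difference at each range bin
    dphi = np.angle(s1 * np.conj(s2))
    # Eq. (2); clipping guards against noise pushing |argument| above 1
    arg = np.clip(wavelength * dphi / (2.0 * np.pi * baseline), -1.0, 1.0)
    theta = np.arcsin(arg)
    # Non-coherent average of the two magnitudes for partial clutter suppression
    magnitude = 0.5 * (np.abs(s1) + np.abs(s2))
    return theta, magnitude
```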
Once the range and bearing contents are available, the extraction of relevant information and the rejection of clutter are demanded to a one-dimensional Constant False Alarm Rate (CFAR) filter (Scannapieco et al., 2017). The Ordered Statistics CFAR (OS-CFAR) (Rohling, 1983) performs well under different operating conditions. In the OS-CFAR, the power content of the cells within a sliding window, whose size depends on the application, is first rank-ordered according to increasing magnitude (Rohling, 1983). The k-th ordered statistic $X_{(k)}$ is assumed as the noise level. For each Cell Under Test (CUT), the OS-CFAR detector compares the power level of the CUT itself with the noise level times a scaling factor $\alpha_{OS}$, and a target is declared present if

$X_{CUT} \geq \alpha_{OS}\,X_{(k)}$    (3)
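The following is a minimal sketch of a one-dimensional OS-CFAR detector implementing equation (3); the window size, guard cells, rank k and scaling factor are illustrative placeholders, since the paper does not report the values actually used:

```python
import numpy as np

def os_cfar(power, n_ref=16, n_guard=2, k=24, alpha=6.0):
    """1-D OS-CFAR over a power profile (squared magnitudes).

    n_ref   : reference cells on each side of the CUT (2*n_ref cells total)
    n_guard : guard cells excluded on each side of the CUT
    k       : rank of the ordered statistic taken as noise level, k <= 2*n_ref
    alpha   : scaling factor set by the desired false alarm probability
    """
    detections = []
    for cut in range(n_ref + n_guard, len(power) - n_ref - n_guard):
        left = power[cut - n_guard - n_ref : cut - n_guard]
        right = power[cut + n_guard + 1 : cut + n_guard + 1 + n_ref]
        ref = np.sort(np.concatenate((left, right)))
        noise = ref[k - 1]                 # k-th ordered statistic X_(k)
        if power[cut] >= alpha * noise:    # Eq. (3)
            detections.append(cut)
    return detections
```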
The extraction of strong scatterers leads to a sparse representation, unlike passive cameras (Scaramuzza and Fraundorfer, 2011) or active RGB-D sensors (Vetrella et al., 2015), which both provide spatially dense information.
The output of target detection is then fed into the MTT algorithm. The MTT algorithm proposed here works in three steps (Scannapieco et al., 2017). First, a Global Nearest Neighbours (GNN) algorithm associates the new measurements with the correct available tracks; all measurements that fall outside the uncertainty ellipse, centred around the estimated measurements, are not considered for association via Munkres' algorithm. Then, track handling strategies evaluate the status of new, tentative, and firm tracks. Finally, an Extended Kalman Filter (EKF) provides new estimates of the range and bearing of each track. A minimal sketch of the association step is given below.
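The sketch below performs gated GNN association, using SciPy's linear_sum_assignment as the Munkres solver; the common innovation covariance and the chi-square gate value are assumptions made for illustration:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Munkres / Hungarian solver

BIG = 1.0e6  # cost assigned to pairs falling outside the gate

def gnn_associate(predicted, measured, innov_cov_inv, gate=9.21):
    """Associate detections with tracks in (range, bearing) space.

    predicted     : (T, 2) predicted measurements of the existing tracks
    measured      : (M, 2) new detections from the CFAR stage
    innov_cov_inv : (2, 2) inverse innovation covariance (assumed common)
    gate          : chi-square threshold, 9.21 ~ 99% for 2 degrees of freedom
    """
    T, M = len(predicted), len(measured)
    cost = np.full((T, M), BIG)
    for t in range(T):
        for m in range(M):
            d = measured[m] - predicted[t]
            maha = d @ innov_cov_inv @ d   # squared Mahalanobis distance
            if maha <= gate:               # only gated pairs may be assigned
                cost[t, m] = maha
    rows, cols = linear_sum_assignment(cost)
    return [(t, m) for t, m in zip(rows, cols) if cost[t, m] < BIG]
```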
At this point the tracks can be used by an odometer based on Singular Value Decomposition (SVD) to obtain the rotation matrix and translation vector and hence retrieve the ownship motion of the platform. It is worth noting that the proposed approach uses radar data to estimate the horizontal motion and the heading angle; roll and pitch can instead be estimated with the onboard inertial sensors. The estimate of the height above ground level (AGL) can be obtained at each time step n from the magnitude vector $A(f_r, n)$. Indeed, the first peak higher than a certain threshold, related to the thermal noise, represents the first ground echo. The knowledge of the range of the ground echo $R_g$ and of the tilt angle of the sensor $\theta_t$ leads to the AGL as

$h_{AGL} = R_g \sin(\theta_t + 0.5\,\theta_{el})$    (4)

where
$\theta_{el}$ = beamwidth in elevation.
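A minimal sketch of the SVD-based motion estimation and of the AGL computation of equation (4), assuming the tracked targets have already been converted from (range, bearing) to Cartesian coordinates; the reflection guard via the determinant is a standard step in SVD rigid registration, not explicitly described in the paper:

```python
import numpy as np

def svd_odometry(prev_pts, curr_pts):
    """Estimate 2-D rotation R and translation t such that curr ~ R @ prev + t.

    prev_pts, curr_pts : (N, 2) matched target positions from consecutive scans
    """
    p0 = prev_pts.mean(axis=0)
    c0 = curr_pts.mean(axis=0)
    H = (prev_pts - p0).T @ (curr_pts - c0)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = c0 - R @ p0
    return R, t  # the platform motion is the inverse of the scene motion

def agl_height(ranges, magnitude, threshold, tilt, elev_beamwidth):
    """Height above ground level from the first ground echo (Eq. 4)."""
    first = int(np.argmax(magnitude > threshold))  # first peak above threshold
    return ranges[first] * np.sin(tilt + 0.5 * elev_beamwidth)
```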
Figure 1. Schematic for loosely-coupled sensor fusion.
Figure 2. Schematic for tightly-coupled sensor fusion.
2.2 Fusion with Inertial Sensors
Radar-only navigation is feasible (Scannapieco et al., 2017); however, fusion with Inertial Navigation Systems (INS) and other sensors could enhance the navigation capabilities of small UAV. A first fusion approach is loosely-coupled sensor fusion: the radar navigation solution, i.e. the variation of the pose provided by radar odometry, is computed separately and fused with the INS navigation solution in an EKF. The solution of the EKF also depends on the AGL and on other possible sensors. The loosely-coupled fusion schematic is shown in Fig. 1. In the tightly-coupled fusion scheme (see Fig. 2), on the contrary, no direct radar navigation solution is produced, but the range and bearing contents of the tracks are used directly in the EKF. Despite being more complex than loosely-coupled fusion, this solution can more easily exploit information such as range rate and angular rate to augment the navigation results.
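The following is a minimal sketch of the measurement update in the loosely-coupled scheme of Fig. 1, assuming a hypothetical three-state pose [north, east, heading] and treating the radar odometry solution as a direct measurement of that pose; the actual filter design is not detailed in the paper:

```python
import numpy as np

def loosely_coupled_update(x, P, z, R_radar):
    """EKF measurement update fusing the radar odometry solution with the INS.

    x       : (3,) INS-predicted state [north, east, heading]
    P       : (3, 3) state covariance after the INS prediction step
    z       : (3,) pose computed by radar odometry (hypothetical measurement)
    R_radar : (3, 3) radar odometry measurement covariance
    """
    H = np.eye(3)                    # direct measurement of the state
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R_radar        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_upd = x + K @ y
    P_upd = (np.eye(3) - K @ H) @ P
    return x_upd, P_upd
```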
3. EXPERIMENTAL RESULTS
This section illustrates the main results obtained during two experimental campaigns. The focus is on understanding radar data to assist navigation.
3.1 Setup and scenes
The radar front-end used during the experimental campaigns is the 24-GHz FMCW SENTIRE radar by IMST (IMST, 2015). Patch antennas, a single transmitting one and two receiving ones separated in the azimuth direction, complete the sensor. The angular interval in which the bearing angle can be measured unambiguously is approximately 120°, that is, ±60° around the boresight direction.
The micro UAV adopted for the flight tests is a customised version of the 3DR X8+ octocopter. The X8+ is a flexible platform able to carry auxiliary onboard systems up to a maximum payload weight of 800 g (or close to 1 kg with reduced flight-time capabilities). The onboard payload for the radar sensing tests includes the radar system, an Odroid XU4 embedded CPU running the Linux operating system, a DC-DC converter, a dedicated battery for the radar, an auxiliary GPS receiver (Ublox LEA-6T) with raw data capabilities, and an associated GPS patch antenna. The radar is located at the front to minimize obstructions and disturbances to/from the other electronic components. The Odroid XU4, the DC-DC converter, the dedicated radar battery, and the auxiliary GPS system have been installed on a plate under the drone belly.
The test strategy relies on data acquisition for offline processing. Dedicated acquisition software has been developed in Python to store all the data with an accurate time-tag based on the CPU clock. The CPU time-tag is also provided for the GPS measurements, including GPS time, gathered with very small latency, enabling accurate synchronization of the data acquired from the different sensors.
Two different test sites have been selected for the experimental campaigns. The first one is a complex GPS-denied and cluttered environment and can be described as an urban canyon (see Fig. 3). The radar is mounted in a forward-looking configuration and a Xiaomi YI camera, slightly tilted, also provides optical images. Fig. 4 illustrates the setup; it is worth noting that the rotor blades were added just before the flight. The second test site, instead, is a less challenging environment containing man-made objects such as poles, wired nets, and a car (see Fig. 5). In this case the radar is slightly pitched down (20°), as shown in Fig. 6. This mounting solution keeps forward-looking capabilities but also improves the illumination of ground targets.
Figure 3. First campaign: scene.
Figure 4. Setup for first campaign. IMST radar is mounted in
forward-looking position.
Figure 5. Second campaign: scene.
Figure 6. Setup for second campaign. IMST radar is slightly
pitched down.
3.2 Results
The first result, shown in Fig. 7, is the radargram measured during the first campaign, in which the UAV moves forward almost continuously: it stops to hover at around 60 s and experiences a 180° rotation at around 140 s. From the radargram it is also possible to retrieve information on the nature of the targets. Indeed, the slope of the range curves indicates the relative range rate, and targets with different relative range rates are moving differently with respect to the platform. This information is important for navigation, since the proposed approach relies on fixed targets. A complete understanding of the scene is obtained after target tracking, as shown in Fig. 8. Indeed, in forward-only motion the targets get closer in range and their bearing moves from the centre to the side of the beam before they disappear. In addition, the sudden rotation shows up as the steep angular rate at around 140 s. Each colour is associated with a single track.
The second scene, on the contrary, involves motion with both forward-looking and side-looking observation. This is important because the methods presented here are intended for platforms that can experience any kind of motion. The radargram in Fig. 9 shows returns from omnidirectionally scattering targets, i.e. the poles, the metallic net and the car. At around 100 s a complete turn is performed and the side-looking observation then begins. From the image of the scene one might expect to also see the corner reflectors. However, they were mounted in the scene in a particular configuration to highlight that the observation angle also matters when dealing with radar. Again, target tracking provides relevant information (see Fig. 10). The targets seen by the radar are on the left, therefore they are the metallic poles. After the rotation, during side-looking observation, the range histories are hyperbolas and the bearing angle varies almost linearly, since the targets move from right to left. In the last part of the campaign, forward motion is performed again, with the poles to the right of the radar.
Figure 7. Magnitude plot as function of range and time for first campaign.
Figure 8. Output of MTT algorithm for first campaign. Each colour corresponds to a single track.
Figure 9. Magnitude plot as function of range and time for second campaign.
Figure 10. Output of MTT algorithm for second campaign. Each colour corresponds to a single track.
Finally, the radar odometry results from the forward-motion segment of the second campaign are shown in Fig. 11, together with the GPS ground truth. The trajectory is expressed in a North-East reference frame. The odometer follows the motion of the platform and its trajectory lies within the uncertainty range of the GPS. Therefore, the results can be considered acceptable for navigation purposes.
4. CONCLUSIONS
In this paper, the navigation of mini- and micro-UAV in challenging scenarios supported by an ultralight radar sensor has been discussed. Algorithms for radar-odometry navigation have been provided and sensor fusion strategies anticipated. In addition, the results of two experimental campaigns in target scenarios have been thoroughly analysed. The results indicate that the analysis of target detection outputs and MTT outputs, despite not being as straightforward as visual-based information, can lead to precise knowledge of the motion of the platform. Moreover, these results could serve as a basis and as an aid for the comprehension of radar outputs in different scenarios.
Finally, radar-only odometry has been tested, showing acceptable accuracy. The main challenges are due to the discrimination between rotation and translation. When the platform rotates with no translation, or with a very small translation, the current SVD-based odometer has difficulty differentiating between translation and rotation. Indeed, variations of the angles within the very small update interval are perceived as a combination of both forward- and cross-range motion. Data from an IMU could, however, assist the odometer in interpreting the radar measurements.
Figure 11. UAV trajectory estimated by radar-only odometry.
GPS track is also shown for comparison.
5. ACKNOWLEDGEMENTS
This research was carried out in the frame of Programme STAR, financially supported by the University of Naples Federico II (UniNA) and Compagnia di San Paolo, and in the framework of the "Programma per il finanziamento della ricerca di Ateneo" funded by UniNA.

REFERENCES
Fasano G., Renga A., Vetrella A.R., Ludeno G., Catapano I., and
Soldovieri F., 2017. Proof of Concept of Micro-UAV based
Radar Imaging. Unmanned Aircraft Systems (ICUAS), 2017
International Conference on.
IMST, 2015. Available online: http://www.radar-sensor.com/products/radar-modules/sr-1200/
Kauffman K., Raquet J., Morton Y.T.J., and Garmatyuk D., 2013. Real-time UWB-OFDM radar-based navigation in unknown terrain, IEEE Transactions on Aerospace and Electronic Systems, 49(3), pp. 1453–1466.
Knott E.F., 1990. "Radar Cross Section" in Radar Handbook,
McGraw Hill, New York.
Moses A. A., Rutherford M. J., Kontitsis M., and Valavanis K.P.,
2011. UAV-borne X-band radar for MAV collision avoidance,
Proc. SPIE 8045, Unmanned Systems Technology XIII, 80450U.
Nister D., Naroditsky O., and Bergen J., 2004. Visual odometry,
Computer Vision and Pattern Recognition, 2004. CVPR 2004.
Proceedings of the 2004 IEEE Computer Society Conference on,
1, pp. I-652-I-659.
Quist E. and Beard R., 2016a. Radar Odometry on Fixed-Wing Small Unmanned Aircraft, IEEE Transactions on Aerospace and Electronic Systems, 52(1), pp. 396–410.

Quist E., Niedfeldt P., and Beard R., 2016b. Radar Odometry with Recursive-RANSAC, IEEE Transactions on Aerospace and Electronic Systems, 52(4), pp. 1618–1630.

Rohling H., 1983. Radar CFAR Thresholding in Clutter and Multiple Target Situations, IEEE Transactions on Aerospace and Electronic Systems, AES-19(4), pp. 608–621.
Scannapieco A.F., Renga A., and Moccia A., 2015. Preliminary Study of a Millimeter Wave FMCW InSAR for UAS Indoor Navigation, Sensors, 15(2), pp. 2309–2335.

Scannapieco A.F., Renga A., Fasano G., and Moccia A., 2017. Experimental analysis of radar odometry by commercial ultralight radar sensor for miniaturized UAS, Journal of Intelligent & Robotic Systems, submitted.
Scaramuzza D. and Fraundorfer F., 2011. Visual Odometry Part
I: The First 30 Years and Fundamentals, IEEE Robotics &
Automation Magazine.
Vetrella A.R., Savvaris A., Fasano G., and Accardo D., 2015. RGB-D camera-based quadrotor navigation in GPS-denied and low light environments using known 3D markers, Unmanned Aircraft Systems (ICUAS), 2015 International Conference on, pp. 185–192.

Zhang J. and Singh S., 2017. Low-drift and real-time lidar odometry and mapping, Autonomous Robots.