Vision Based UAV Attitude Estimation: Progress and Insights.

Journal of Intelligent and Robotic Systems (Impact Factor: 0.83). 01/2012; 65:295-308. DOI: 10.1007/s10846-011-9588-y
Source: DBLP

ABSTRACT Unmanned aerial vehicles (UAVs) are increasingly replacing manned systems in situations that are dangerous, remote, or difficult
for manned aircraft to access. Their control tasks are empowered by computer vision technology. Visual sensors are widely
used for stabilization, as primary or at least secondary sensors, so UAV stabilization by attitude estimation from visual
sensors is a very active research area. Vision-based techniques are proving their effectiveness and robustness in handling
this problem. In this work a comprehensive review of vision-based UAV attitude estimation approaches is given, starting
from horizon-based methods and moving on to vanishing-point, optical-flow, and stereoscopic techniques. A novel segmentation
approach for UAV attitude estimation based on polarization is proposed. Our future insights for attitude estimation from
uncalibrated catadioptric sensors are also discussed.

    ABSTRACT: In computer vision systems, unpredictable image corruption can significantly impair usability. Image recovery methods for partial image damage, particularly in moving scenarios, can be crucial for restoring corrupted images. In such situations, image fusion techniques can be applied to combine information captured at different instants and from different points of view to recover damaged regions. In this article we propose a technique for temporal and spatial image fusion, based on fuzzy classification, which allows partial image recovery after unexpected defects without user intervention. The method uses image alignment techniques and redundant information from previous images to create fuzzy confidence maps. These maps are then used to detect damaged pixels and recover them using information from previous frames.
    2013 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2013); 07/2013
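The confidence-map idea above can be illustrated with a minimal sketch: assuming frames are already aligned, per-pixel confidence is derived from the intensity difference against the previous frame via a fuzzy (Gaussian) membership, and low-confidence pixels are restored from the previous frame. Function names, the membership shape, and the `spread`/`threshold` parameters are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def fuzzy_recover(current, previous, spread=30.0, threshold=0.5):
    """Recover damaged pixels in `current` from an aligned `previous` frame.

    Confidence is ~1 where the frames agree and decays smoothly (Gaussian
    fuzzy membership) as the per-pixel difference grows. Pixels whose
    confidence falls below `threshold` are treated as damaged and replaced.
    Illustrative sketch only; parameters are assumptions.
    """
    diff = np.abs(current.astype(float) - previous.astype(float))
    confidence = np.exp(-(diff / spread) ** 2)   # fuzzy membership in [0, 1]
    damaged = confidence < threshold             # defuzzified damage mask
    recovered = current.copy()
    recovered[damaged] = previous[damaged]       # fill from previous frame
    return recovered, damaged

# usage: a frame with a corrupted 2x2 block restored from the previous frame
prev = np.full((8, 8), 100, dtype=np.uint8)
curr = prev.copy()
curr[2:4, 2:4] = 255                             # simulated damage
fixed, mask = fuzzy_recover(curr, prev)
```

In a moving scenario the alignment step (not shown) would warp the previous frame into the current view before the comparison.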
    ABSTRACT: This work presents an optimized visual fuzzy servoing system for an obstacle avoidance task with an unmanned aerial vehicle. The cross-entropy method was used to tune the gains of the controller inputs. The optimization was carried out in the ROS-Gazebo 3D simulation environment with extension software developed for this work. Once the optimal controller was obtained, a set of real tests was performed with a quadcopter to evaluate the behavior of the controller, with excellent results. Only the visual information from the quadrotor's front camera was used: the image is processed off-board and the result is sent to the fuzzy logic controller, which issues commands to modify the orientation of the aircraft.
    Fuzzy Systems (FUZZ-IEEE), 2012 IEEE International Conference on; 06/2012
    ABSTRACT: The aim of this paper is to present a method for integrating measurements from inertial sensors (gyroscopes and accelerometers), GPS, and a video system to estimate the position and attitude of a UAV (Unmanned Aerial Vehicle). Inertial sensors are widely used for aircraft navigation because they are a low-cost and compact solution, but their measurements suffer from several errors that cause a rapid divergence of position and attitude estimates. To avoid divergence, inertial sensors are usually coupled with other systems such as a GNSS (Global Navigation Satellite System). This paper examines the possibility of also coupling the inertial sensors with a camera. A camera is generally installed on board UAVs for surveillance purposes; compared with GNSS it offers several advantages, such as greater accuracy and a higher data rate, and it can be used in urban areas or, more generally, wherever multipath effects preclude the use of GNSS. A camera, coupled with a video processing system, can provide attitude and position (up to a scale factor), but it has a lower data rate than inertial sensors, and its measurements have latencies that can compromise the performance and effectiveness of the flight control system. Integrating inertial sensors with a camera exploits the best features of both systems, providing better position and attitude estimation.
    Information Fusion (FUSION), 2012 15th International Conference on; 01/2012
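The inertial/vision coupling described above can be sketched in its simplest form as a complementary filter: the drifting but high-rate gyro propagates the attitude, and drift-free camera fixes pull the estimate back. This is a generic illustration of the fusion idea, not the paper's actual integration scheme; all names and the `alpha` blending parameter are assumptions.

```python
def complementary_filter(gyro_rates, cam_angles, dt, alpha=0.98):
    """Fuse gyro angular rates (high rate, drifting) with camera attitude
    fixes (low rate, drift-free) on one axis.

    Each step propagates the angle with the gyro rate, then blends in the
    camera measurement. `alpha` close to 1 trusts the gyro short-term
    while the camera bounds long-term drift. Illustrative sketch only.
    """
    angle = cam_angles[0]                  # initialize from first vision fix
    for rate, cam in zip(gyro_rates, cam_angles):
        predicted = angle + rate * dt      # propagate with the gyro
        angle = alpha * predicted + (1 - alpha) * cam  # correct with vision
    return angle

# usage: with zero gyro rates and a constant camera fix, the estimate
# settles at the camera value instead of drifting
est = complementary_filter([0.0] * 10, [0.1] * 10, dt=0.01)
```

A full implementation along the paper's lines would instead use a Kalman-style filter handling all three attitude axes, position, the vision scale factor, and measurement latency.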

