Conference Paper

UAV Altitude Estimation by Mixed Stereoscopic Vision

MIS Lab., Univ. of Picardie Jules Verne, Amiens, France
DOI: 10.1109/IROS.2010.5652254 Conference: Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on
Source: IEEE Xplore


Altitude is one of the most important parameters for an Unmanned Aerial Vehicle (UAV), especially during critical maneuvers such as landing or steady flight. In this paper, we present a mixed stereoscopic vision system, made of a fish-eye camera and a perspective camera, for altitude estimation. Contrary to classical stereoscopic systems based on feature matching, we propose a plane-sweeping approach to estimate the altitude and consequently detect the ground plane. Since a homography exists between the two views once the sensor is calibrated and the attitude is estimated from the fish-eye camera, the algorithm consists in searching for the altitude that verifies this homography. We show that this approach is robust and accurate, and that a CPU implementation allows real-time estimation. Experimental results on real sequences from a small UAV demonstrate the effectiveness of the approach.
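As a rough illustration of the plane-sweeping idea described in the abstract — a sketch, not the authors' implementation — the code below searches over candidate altitudes for the one whose induced plane homography best maps ground-plane points from one calibrated view to the other. For simplicity it assumes two perspective cameras with known intrinsics `K1`, `K2` and relative pose `(R, t)` (in the paper one view is a fish-eye, which would require its own projection model); all function and variable names are illustrative.

```python
import numpy as np

def plane_homography(K1, K2, R, t, n, d):
    """Homography induced by the plane n . X = d between two views.

    Convention: a point X1 in camera-1 coordinates maps to camera 2 as
    X2 = R @ X1 + t; for points on the plane n . X1 = d this collapses to
    the homography H = K2 (R + t n^T / d) K1^{-1} in pixel coordinates.
    """
    return K2 @ (R + np.outer(t, n) / d) @ np.linalg.inv(K1)

def sweep_altitude(pts1, pts2, K1, K2, R, t, n, candidates):
    """Return the candidate altitude whose homography best maps pts1 onto pts2.

    pts1, pts2: 2xN arrays of matched pixel coordinates on the ground plane.
    """
    best_d, best_err = None, np.inf
    ones = np.ones(pts1.shape[1])
    for d in candidates:
        H = plane_homography(K1, K2, R, t, n, d)
        p = H @ np.vstack([pts1, ones])      # warp points with candidate H
        p = p[:2] / p[2]                     # back to inhomogeneous pixels
        err = np.mean(np.linalg.norm(p - pts2, axis=0))
        if err < best_err:
            best_d, best_err = d, err
    return best_d
```

In the paper the score is photometric (warped image similarity) rather than point reprojection error, but the sweep structure — hypothesize an altitude, form the induced homography, score the agreement — is the same.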



Available from: Cédric Demonceaux, Oct 04, 2015
    • "Eynard et al. [10] applied a stereo system combining an omnidirectional and a perspective camera, where the attitude data were computed from the omnidirectional camera images. The approach is interesting, but such a system requires consistently good images from both cameras to ensure good results. "
    ABSTRACT: Localization of small-size Unmanned Air Vehicles (UAVs) such as quadrotors in Global Positioning System (GPS)-denied environments, such as indoors, has been achieved using various techniques. Most indoor experiments requiring UAV localization have used cameras or ultrasonic sensors installed indoors, or have modified the indoor environment with Infra-Red (IR) and visual markers. While these systems offer high localization accuracy, they are expensive and less practical in real situations. In this paper, a system consisting of a stereo camera embedded on a quadrotor UAV (QUAV) is proposed for indoor localization. The optical flow data from the stereo camera are fused with attitude and acceleration data from our sensors for a better estimate of the quadrotor location. The quadrotor altitude is estimated using Scale Invariant Feature Transform (SIFT) feature stereo matching, in addition to the altitude computed from optical flow. To avoid latency due to computational time, image processing and quadrotor control are assigned to separate threads and cores. The altitude estimation of our QUAV outperforms single-camera QUAVs thanks to stereo triangulation, which in turn improves the x-y position estimate when fused with optical flow.
    Applied Mechanics and Materials 07/2014; 629(2014):270-277. DOI:10.4028/ · 0.15 Impact Factor
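The stereo triangulation that the excerpt above credits for the improved altitude estimate rests on the textbook depth-from-disparity relation for a rectified stereo pair; a minimal sketch (parameter names are illustrative, not taken from the cited paper):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: camera separation in meters;
    disparity_px: horizontal pixel shift of the matched feature between views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700 px focal length, a 12 cm baseline, and a 42 px disparity, the matched point lies 2 m away; for a downward-looking pair over flat ground, that depth is the altitude.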
    • "In order to avoid feature matching, we propose a plane-sweeping algorithm [4]. If we suppose that the UAV navigates over a flat floor at an altitude d, we can show that there exists a homography H between the perspective and fish-eye cameras ([11]): "
    ABSTRACT: This paper presents combined applications of omnidirectional vision in aerial robotics. Omnidirectional vision is first used to compute attitude, altitude and motion, in both rural and urban environments. Secondly, a combination of omnidirectional and perspective cameras makes it possible to estimate the altitude. Finally, we present a stereo system consisting of an omnidirectional camera and a laser pattern projector, which can compute altitude and attitude in poorly illuminated to dark environments. We demonstrate that an omnidirectional camera, in conjunction with other sensors, is a suitable choice for UAV applications across different operating environments and illumination conditions.
    Journal of Intelligent and Robotic Systems 01/2013; 69(1-4). DOI:10.1007/s10846-012-9752-z · 1.18 Impact Factor
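The plane-induced homography invoked in the first excerpt above has a standard closed form; written out (with K_p, K_f denoting the perspective and fish-eye intrinsic parameters, (R, t) the relative pose, and n the ground normal — notation assumed here, not taken from the paper), the altitude d is the only unknown the sweep searches over:

```latex
% Homography induced by the ground plane n^T X = d between the two views
% with relative pose (R, t); plane sweeping searches over the altitude d.
H(d) \;=\; K_p \left( R + \frac{t\, n^{\top}}{d} \right) K_f^{-1}
```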
    • "In each view, the number of tracked points varies between 50 and 200. We then assume that the perspective view points toward the ground, whose segmentation is known via plane-sweeping [12]. Fig. 6 shows the final trajectory of the estimated motion. "