Conference Paper

UAV Altitude Estimation by Mixed Stereoscopic Vision

MIS Lab., Univ. of Picardie Jules Verne, Amiens, France
DOI: 10.1109/IROS.2010.5652254 Conference: Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on
Source: IEEE Xplore


Altitude is one of the most important parameters for an Unmanned Aerial Vehicle (UAV) to know, especially during critical maneuvers such as landing or steady flight. In this paper, we present a mixed stereoscopic vision system made of a fish-eye camera and a perspective camera for altitude estimation. Contrary to classical stereoscopic systems based on feature matching, we propose a plane-sweeping approach to estimate the altitude and consequently detect the ground plane. Since a homography exists between the two views, and since the sensor is calibrated and the attitude is estimated from the fish-eye camera, the algorithm then consists in searching for the altitude that satisfies this homography. We show that this approach is robust and accurate, and that a CPU implementation allows real-time estimation. Experimental results on real sequences from a small UAV demonstrate the effectiveness of the approach.
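The plane-sweeping idea described in the abstract can be sketched as follows. This is a minimal synthetic illustration, not the authors' implementation: assuming calibrated intrinsics `K1`, `K2`, a known relative pose `(R, t)`, and a ground-plane normal `n` (in practice obtained from the fish-eye attitude estimate), the plane-induced homography H(d) = K2 (R + t nᵀ/d) K1⁻¹ is evaluated over candidate altitudes d, keeping the one that best transfers points (in the real system, image intensities) between the two views. All names and values here are illustrative.

```python
import numpy as np

def plane_homography(K1, K2, R, t, n, d):
    # Plane-induced homography from camera 1 to camera 2 for the
    # ground plane {X : n.T @ X = d} expressed in camera-1 coordinates.
    return K2 @ (R + np.outer(t, n) / d) @ np.linalg.inv(K1)

def sweep_altitude(p1, p2, K1, K2, R, t, n, candidates):
    # Return the candidate altitude whose homography best transfers
    # homogeneous pixels p1 (3xN, camera 1) onto p2 (3xN, camera 2).
    best_d, best_err = None, np.inf
    for d in candidates:
        H = plane_homography(K1, K2, R, t, n, d)
        q = H @ p1
        q = q / q[2]                          # dehomogenize
        err = np.sum((q[:2] - p2[:2]) ** 2)   # transfer error for this depth
        if err < best_err:
            best_d, best_err = d, err
    return best_d

# Synthetic check: camera 1 looks straight down at a ground plane
# z = d_true; camera 2 is translated sideways by a 0.2 m baseline.
np.random.seed(0)
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.2, 0.0, 0.0])
n, d_true = np.array([0.0, 0.0, 1.0]), 1.5
X1 = np.vstack([np.random.uniform(-1, 1, (2, 40)),
                np.full((1, 40), d_true)])    # 3-D points on the plane
p1 = K @ X1; p1 = p1 / p1[2]                  # projections in camera 1
X2 = R @ X1 + t[:, None]
p2 = K @ X2; p2 = p2 / p2[2]                  # projections in camera 2
candidates = np.arange(0.5, 3.01, 0.05)
d_hat = sweep_altitude(p1, p2, K, K, R, t, n, candidates)
print(round(d_hat, 2))                        # → 1.5
```

Because all points truly lie on the plane at d = 1.5 m, the transfer error vanishes only at the correct altitude; with real images, the squared point error would be replaced by a photometric similarity score between the warped views.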


Available from: Cédric Demonceaux
    • "The proposed solutions are based on artificial ground-located landmarks [12] [25], optical flow [5] [27], stereoscopic vision [11] [16] [20], and machine learning [8]. Most of these methods assume that the camera's viewpoint is oriented to the ground and that the camera parameters are known. "
    ABSTRACT: This work addresses the problem of camera elevation estimation from a single photograph in an outdoor environment. We introduce a new benchmark dataset of one-hundred thousand images with annotated camera elevation called Alps100K. We propose and experimentally evaluate two automatic data-driven approaches to camera elevation estimation: one based on convolutional neural networks, the other on local features. To compare the proposed methods to human performance, an experiment with 100 subjects is conducted. The experimental results show that both proposed approaches outperform humans and that the best result is achieved by their combination.
    Full-text · Conference Paper · Sep 2015
    • "Eynard et al. [10] applied a stereo camera that combined an omnidirectional and a perspective camera, where the attitude data were computed from the omnidirectional camera images. The approach is interesting, but such a system requires consistently good images from both cameras to ensure good results. "
    ABSTRACT: Localization of small-size Unmanned Air Vehicles (UAVs) such as quadrotors in Global Positioning System (GPS)-denied environments such as indoors has been done using various techniques. Most indoor experiments that require localization of UAVs used cameras or ultrasonic sensors installed indoors, or applied indoor environment modifications such as patching Infra-Red (IR) and visual markers. While these systems have high accuracy for UAV localization, they are expensive and have less practicality in real situations. In this paper a system consisting of a stereo camera embedded on a quadrotor UAV (QUAV) for indoor localization is proposed. The optical flow data from the stereo camera are then fused with attitude and acceleration data from our sensors to get a better estimate of the quadrotor location. The quadrotor altitude is estimated using Scale Invariant Feature Transform (SIFT) feature stereo matching in addition to the altitude computed using optical flow. To avoid latency due to computational time, image processing and quadrotor control are processed in separate threads with core allocation. The performance of our QUAV altitude estimation is better compared to single-camera embedded QUAVs thanks to the stereo camera triangulation, which leads to better estimation of the x-y position using optical flow when fused together.
    Full-text · Article · Jul 2014 · Applied Mechanics and Materials
    • "In order to avoid a matching, we propose a plane-sweeping algorithm [4]. If we suppose that the UAV navigates over a flat floor at an altitude d, we can show that there exists a homography H between the perspective and fish-eye cameras ([11]): "
    ABSTRACT: This paper presents combined applications of omnidirectional vision, focusing on aerial robotics. Omnidirectional vision is first used to compute the attitude, altitude and motion, not only in rural environments but also in urban spaces. Secondly, a combination of omnidirectional and perspective cameras permits estimation of the altitude. Finally, we present a stereo system consisting of an omnidirectional camera with a laser pattern projector that enables calculation of the altitude and attitude in poorly illuminated to dark environments. We demonstrate that an omnidirectional camera in conjunction with other sensors is a suitable choice for UAV applications, not only in different operating environments but also in various illumination conditions.
    Full-text · Article · Jan 2013 · Journal of Intelligent and Robotic Systems