Conference Paper

UAV altitude estimation by mixed stereoscopic vision

MIS Lab., Univ. of Picardie Jules Verne, Amiens, France
DOI: 10.1109/IROS.2010.5652254
Conference: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Source: IEEE Xplore

ABSTRACT: Altitude is one of the most important parameters for an Unmanned Aerial Vehicle (UAV) to know, especially during critical maneuvers such as landing or steady flight. In this paper, we present a mixed stereoscopic vision system, made of a fish-eye camera and a perspective camera, for altitude estimation. Contrary to classical stereoscopic systems based on feature matching, we propose a plane-sweeping approach to estimate the altitude and consequently detect the ground plane. Since the sensor is calibrated and the attitude is estimated from the fish-eye camera, a homography induced by the ground plane exists between the two views; the algorithm then consists in searching for the altitude that satisfies this homography. We show that this approach is robust and accurate, and that a CPU implementation allows real-time estimation. Experimental results on real sequences from a small UAV demonstrate the effectiveness of the approach.
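
To make the plane-sweeping idea concrete, here is a minimal sketch in Python (NumPy + OpenCV). It treats both views as pinhole cameras with intrinsics K1 and K2 and a known relative pose (R, t) from calibration, with the ground-plane normal n in the first camera's frame supplied by the attitude estimate; the paper's actual fish-eye projection model and scoring details differ, and all function names here are illustrative, not the authors' implementation.

```python
import cv2
import numpy as np

def plane_homography(K1, K2, R, t, n, d):
    """Homography (image 1 -> image 2) induced by the plane
    n . X = d expressed in camera 1's frame, for two pinhole
    cameras with relative pose X2 = R @ X1 + t."""
    H = K2 @ (R + np.outer(t, n) / d) @ np.linalg.inv(K1)
    return H / H[2, 2]

def sweep_altitude(img1, img2, K1, K2, R, t, n, d_candidates):
    """Plane sweep: warp img1 onto img2 for each candidate
    altitude and keep the one with the best photometric fit.
    Images are assumed grayscale (uint8)."""
    h, w = img2.shape[:2]
    best_d, best_score = None, np.inf
    for d in d_candidates:
        H = plane_homography(K1, K2, R, t, n, d)
        warped = cv2.warpPerspective(img1, H, (w, h))
        valid = warped > 0  # crude mask for pixels that received data
        if valid.sum() < 1000:  # skip degenerate warps
            continue
        diff = warped[valid].astype(np.float32) - img2[valid].astype(np.float32)
        score = np.mean(diff ** 2)  # mean squared intensity error
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```

A call such as sweep_altitude(im1, im2, K1, K2, R, t, n, np.linspace(0.5, 20.0, 80)) would return the candidate altitude (in metres) whose induced homography best registers the two views; n would come from the fish-eye attitude estimate, e.g. the world "up" axis rotated into the camera frame.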

  • ABSTRACT: New software simultaneously calibrates conventional and omnidirectional cameras in the same stereoscopic rig. A stereo (or stereoscopic) rig is a device with two or more cameras that makes it possible to simulate human binocular vision and its ability to capture 3D images. The type and number of cameras used depends on the intended application. In the last decade, omnidirectional cameras (standard cameras that point to curved mirrors or use a fisheye lens) have attracted interest because of their wide field of view. However, conventional (perspective) cameras, which have a more limited angle of vision, are still useful. Their images have very high spatial resolution compared to omnidirectional images because, generally, the same number of pixels is used in both cases, although the field of view is smaller in perspective cameras. Therefore, creating a stereo rig with both a perspective and an omnidirectional camera has the potential to merge high spatial resolution with a wide field of view (see Figure 1). Systems that combine different types of cameras on the same rig are called hybrid stereoscopic systems. Their main application fields are video surveillance and localization or navigation in robotics. For instance, a stereo rig with perspective and fisheye cameras can be mounted on an unmanned aerial vehicle (UAV, see Figure 2) to estimate its altitude using images from both cameras, through a method called a plane-sweeping algorithm [1]. On the other hand, the attitude of the aircraft, that is, its orientation relative to a reference line or plane, is determined from the fisheye view using the horizon line or straight lines in urban environments (such as the edge of a building). The field of view of the fisheye lens is wide enough that it can capture the ground, which is projected in the center of the image, and the horizon line, which appears on the border of the picture. Image processing can then detect this reference line and give the attitude of the UAV. Recently, we combined both views to obtain a precise and robust estimation of UAV motion [2]. To retrieve 3D information on the environment and motion of a hybrid stereo rig, the device has to be calibrated first, a process that consists in determining the cameras' intrinsic and extrinsic parameters. The relative pose (rotation and translation) between the cameras constitutes the extrinsic variables to be estimated. Intrinsic parameters, on the other hand, are those related to the projection models of the cameras in question. When capturing an image, each point of a 3D scene is projected onto the picture through the lens. The projection model is the geometric relationship that relates a point in the scene to a point on the image. Consequently, different types of projection are required depending on the camera, the lens, and possibly the mirror.
    Figure 1. Example of a hybrid stereo rig: perspective and fisheye cameras.
    Figure 2. A perspective-and-fisheye stereo rig embedded on an unmanned aerial vehicle and its images.
    SPIE Newsroom, 06/2011.
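
As a rough illustration of why a hybrid rig needs a per-camera projection model, the sketch below contrasts a pinhole (perspective) projection with an equidistant fisheye projection. The equidistant model is only one common choice; the calibration software described above may use a different fisheye model (e.g., a unified sphere model), so this is an assumption for illustration only.

```python
import numpy as np

def project_pinhole(X, K):
    """Perspective (pinhole) model: x = K @ (X / Z); return (u, v)."""
    x = X / X[2]
    return (K @ x)[:2]

def project_equidistant_fisheye(X, f, c):
    """Equidistant fisheye model: image radius r = f * theta,
    where theta is the angle between X and the optical axis
    and c is the principal point (2-vector)."""
    theta = np.arccos(X[2] / np.linalg.norm(X))
    phi = np.arctan2(X[1], X[0])
    return c + f * theta * np.array([np.cos(phi), np.sin(phi)])
```

Note that the pinhole model diverges as theta approaches 90 degrees, while the equidistant model maps such rays to a finite radius, which is what lets a fisheye image contain both the ground and the horizon line.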
  • ABSTRACT: Small Unmanned Aerial Vehicles (UAVs) are typically powered by LiPo batteries. The batteries have their own dynamics, which change during discharge. Classical approaches to altitude control assume a time-invariant system and therefore fail. Adaptive controllers require an identified system model, which is often unavailable. Battery dynamics can be characterized and used for a battery-model-based controller. This controller is useful in situations where no feedback from the actuators (such as RPM or thrust) is available. After measuring the battery dynamics for two distinct types of batteries, a controller is designed and experimentally verified, showing consistent performance during a whole discharge test.
    Keywords: Unmanned Aerial Vehicles (UAV), Vertical Take-Off and Landing (VTOL), quadrotor, hexarotor, multirotor, altitude control, battery monitoring and modelling.
    Proc. of the International Conference on Unmanned Aircraft Systems (ICUAS), 01/2013.
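
A minimal sketch of that idea, assuming a simple linear thrust-voltage relationship (an assumption of this sketch, not a result from the paper): a PID altitude loop whose output is rescaled by the measured pack voltage, so the effective thrust command stays consistent as the battery discharges even without RPM or thrust feedback.

```python
import numpy as np

class BatteryCompensatedAltitudeController:
    """PID altitude controller with a battery-model feed-forward term.
    All names and the linear thrust ~ voltage model are illustrative."""

    def __init__(self, kp, ki, kd, v_nominal):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.v_nominal = v_nominal  # nominal pack voltage, e.g. 11.1 V
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, z_ref, z, v_battery, dt):
        """Return a normalized throttle command in [0, 1]."""
        err = z_ref - z
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Feed-forward: if thrust scales roughly with voltage, boost
        # the command as the pack voltage sags during discharge.
        u *= self.v_nominal / max(v_battery, 1e-3)
        return float(np.clip(u, 0.0, 1.0))
```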
  • ABSTRACT: This paper presents combined applications of omnidirectional vision to aerial robotics. Omnidirectional vision is first used to compute the attitude, altitude, and motion, not only in rural environments but also in urban spaces. Secondly, a combination of omnidirectional and perspective cameras makes it possible to estimate the altitude. Finally, we present a stereo system consisting of an omnidirectional camera with a laser pattern projector that enables computation of the altitude and attitude in poorly illuminated to dark environments. We demonstrate that an omnidirectional camera, in conjunction with other sensors, is a suitable choice for UAV applications, not only in different operating environments but also in various illumination conditions.
    Journal of Intelligent and Robotic Systems, 01/2013; 69(1-4).
