Conference Paper

Fusion of inertial, vision, and air pressure sensors for MAV navigation

DOI: 10.1109/MFI.2008.4648040 · Conference: 2008 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2008)
Source: IEEE Xplore

ABSTRACT Traditional methods for navigating miniature unmanned aerial vehicles (MAVs) fuse Global Positioning System (GPS) and inertial measurement unit (IMU) information. However, many of the flight scenarios envisioned for MAVs (in urban terrain, indoors, in hostile (jammed) environments, etc.) are not conducive to using GPS. Navigation in GPS-denied areas can be performed using an IMU alone, but the size, weight, and power constraints of MAVs severely limit the quality of IMUs that can be placed on board, making IMU-only navigation extremely inaccurate. In this paper, we introduce a system for fusing information from two additional sensors (an electro-optical camera and a differential air pressure sensor) with the IMU to improve the navigation abilities of the MAV. We discuss some important implementation issues that must be addressed when fusing information from these sensors. Results demonstrate an improvement of at least 10x in final position accuracy when the sensor information is fused as outlined in this paper.
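The abstract does not state which estimator the authors use, so the following Python sketch is only a minimal illustration of the general idea: a differential (dynamic) pressure reading is converted to airspeed via Bernoulli's relation and then blended with an IMU-propagated speed estimate in a scalar Kalman-style update. All function names, noise values, and the constant air density are illustrative assumptions, not taken from the paper.

```python
import numpy as np

RHO_AIR = 1.225  # assumed constant air density at sea level [kg/m^3]

def airspeed_from_differential_pressure(dp_pa):
    """Convert differential (dynamic) pressure [Pa] to airspeed [m/s] via Bernoulli.

    q = 0.5 * rho * v^2  =>  v = sqrt(2 q / rho); negative readings are clamped to zero.
    """
    return np.sqrt(2.0 * max(dp_pa, 0.0) / RHO_AIR)

def scalar_kalman_update(x, P, z, R):
    """One scalar Kalman measurement update: estimate x with variance P, measurement z with variance R."""
    K = P / (P + R)          # Kalman gain
    x_new = x + K * (z - x)  # corrected estimate
    P_new = (1.0 - K) * P    # reduced uncertainty
    return x_new, P_new

# Illustrative fusion of an IMU-propagated forward speed with the pressure-derived airspeed.
# The numbers below are made up for the example.
v_imu, P_imu = 13.4, 4.0                             # speed integrated from accelerometers; large drift variance
v_air = airspeed_from_differential_pressure(110.0)   # ~13.4 m/s for 110 Pa of dynamic pressure
v_fused, P_fused = scalar_kalman_update(v_imu, P_imu, v_air, R=0.5)
print(f"pressure airspeed = {v_air:.2f} m/s, fused speed = {v_fused:.2f} m/s")
```

Note that a pressure-derived airspeed differs from ground speed by the wind; in a full navigation filter it would enter as one aiding measurement alongside the vision and inertial terms rather than replace them.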

  • ABSTRACT: Inertial-visual sensor systems are becoming more and more popular in mobile robotics. They allow for global and drift-free localization at high dynamics. Cameras and inertial measurement units (IMUs) are complementary sensors that mutually enhance each other if correctly fused. However, this complementary nature poses major challenges for IMU-to-camera registration. Several solutions for computing the spatial alignment are described in the literature. In this work, we stress the importance of temporal alignment and compare two methods for determining the temporal displacement between sensor measurements. The presented temporal registration can be used as an independent preprocessing step without any knowledge of the spatial relation. Furthermore, we present closed-form methods to initialize the angular alignment of the IMU and the camera, which can also be applied to setups with gyroscopes only. If high accuracy is required, this result can be used to initialize any filter or batch-optimization method to improve convergence and reduce processing time. Simulations and experiments illustrate the presented methods and underline the importance of temporal alignment. (A minimal sketch of cross-correlation-based temporal alignment follows this list.)
    2011 IEEE International Conference on Robotics and Biomimetics (ROBIO); 01/2011
  • ABSTRACT: This paper presents a cost-effective approach to designing an embedded autopilot system for a four-rotor aerial vehicle. The system is targeted at autonomous-control research and consists of an onboard system and a ground station. The onboard system interfaces with the vehicle and provides a data link for sensor measurements and control data. The ground station provides a user-friendly interface for logging, monitoring, and controlling the vehicle. The paper details the system design and integration.
    01/2010;
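The ROBIO 2011 abstract above compares two methods for determining the temporal displacement between IMU and camera measurements but does not describe them here. As a generic baseline only (not necessarily either of that paper's methods), the time offset can be estimated by cross-correlating the gyroscope angular-rate magnitude with the rotation rate derived from successive camera poses; the sketch below, with assumed function and variable names, shows that idea on synthetic data.

```python
import numpy as np

def estimate_time_offset(imu_rate, cam_rate, dt):
    """Estimate how much the camera stream lags the IMU stream, in seconds.

    Both inputs are 1-D angular-rate magnitude arrays already resampled to a
    common sample period dt [s]. A positive result means the camera signal is
    delayed relative to the IMU signal.
    """
    a = (imu_rate - imu_rate.mean()) / (imu_rate.std() + 1e-12)  # normalize IMU signal
    b = (cam_rate - cam_rate.mean()) / (cam_rate.std() + 1e-12)  # normalize camera signal
    corr = np.correlate(b, a, mode="full")   # cross-correlation over all integer lags
    lags = np.arange(-a.size + 1, b.size)    # lag axis matching numpy's 'full' output
    return lags[np.argmax(corr)] * dt        # best lag converted to seconds

# Synthetic check: the camera signal is the IMU signal delayed by 5 samples (0.05 s at 100 Hz).
np.random.seed(0)
t = np.arange(0, 10, 0.01)
imu = np.abs(np.sin(2 * np.pi * 0.7 * t)) + 0.05 * np.random.randn(t.size)
cam = np.roll(imu, 5)
print(f"estimated offset: {estimate_time_offset(imu, cam, 0.01):.3f} s")  # ~0.050
```

In a real setup the camera rotation rate would come from relative pose estimates between consecutive frames, and the recovered offset would then feed the spatial-alignment or filter-initialization steps the abstract mentions.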