Map-aided localization in sparse global positioning system environments using vision and particle filtering.

Journal of Field Robotics (Impact Factor: 2.15). 09/2011; 28:619-643. DOI: 10.1002/rob.20395
Source: DBLP

ABSTRACT: A map-aided localization approach using vision, inertial sensors when available, and a particle filter is proposed and empirically evaluated. The approach, termed PosteriorPose, uses a Bayesian particle filter to augment global positioning system (GPS) and inertial navigation solutions with vision-based measurements of nearby lanes and stop lines referenced against a known map of environmental features. These map-relative measurements are shown to improve the quality of the navigation solution when GPS is available, and to keep the navigation solution converged during extended GPS blackouts. Measurements are incorporated with careful hypothesis testing and error modeling to account for non-Gaussian and multimodal errors committed by GPS and vision-based detection algorithms. Using a set of data collected with Cornell's autonomous car, including a measure of truth via a high-precision differential corrections service, an experimental investigation of important design elements of the PosteriorPose estimator is conducted. The algorithm is shown to statistically outperform a tightly coupled GPS/inertial navigation solution both in full GPS coverage and in extended GPS blackouts. Statistical performance is also studied as a function of road type, filter likelihood models, bias models, and filter integrity tests. © 2011 Wiley Periodicals, Inc.
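The core idea of the abstract — weighting particles by both absolute GPS likelihoods and map-relative vision likelihoods so the pose estimate stays converged through a GPS blackout — can be sketched with a minimal 1D bootstrap particle filter. This is an illustrative toy, not the PosteriorPose implementation: the lane position `LANE_X`, the noise levels, and the blackout schedule are all invented for the example.

```python
import math
import random

random.seed(0)

LANE_X = 3.0   # known map position of the lane boundary (hypothetical map)
TRUE_X = 1.0   # true lateral position of the vehicle
N = 1000       # number of particles

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Initialize particles from a diffuse prior.
particles = [random.uniform(-5.0, 5.0) for _ in range(N)]

for step in range(50):
    # Predict: random-walk motion model with process noise.
    particles = [p + random.gauss(0.0, 0.05) for p in particles]

    # Simulated measurements: GPS drops out after step 10.
    gps = TRUE_X + random.gauss(0.0, 1.0) if step < 10 else None
    lane_offset = (LANE_X - TRUE_X) + random.gauss(0.0, 0.2)  # vision: offset to mapped lane

    # Weight particles by the measurement likelihoods.
    weights = []
    for p in particles:
        w = gauss_pdf(LANE_X - p, lane_offset, 0.2)   # map-relative vision likelihood
        if gps is not None:
            w *= gauss_pdf(p, gps, 1.0)               # absolute GPS likelihood
        weights.append(w)
    total = sum(weights)
    weights = [w / total for w in weights]

    # Systematic resampling.
    cum, acc = [], 0.0
    for w in weights:
        acc += w
        cum.append(acc)
    start = random.random() / N
    new_particles, i = [], 0
    for k in range(N):
        u = start + k / N
        while i < N - 1 and cum[i] < u:
            i += 1
        new_particles.append(particles[i])
    particles = new_particles

estimate = sum(particles) / N
print(f"estimate after GPS blackout: {estimate:.2f} (truth {TRUE_X})")
```

Because the vision measurement is referenced against a known map feature, it carries absolute position information, which is what keeps the filter from drifting once GPS is gone.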

  • ABSTRACT: Accurate localization of moving sensors is essential for many fields, such as robot navigation and urban mapping. In this paper, we present a framework for GPS-supported visual Simultaneous Localization and Mapping with Bundle Adjustment (BA-SLAM) using a rigorous sensor model for a panoramic camera. The rigorous model does not introduce systematic errors, thus representing an improvement over the widely used ideal sensor model. The proposed SLAM does not require additional restrictions, such as loop closing, or additional sensors, such as expensive inertial measurement units. In this paper, the problems of the ideal sensor model for a panoramic camera are analysed, and a rigorous sensor model is established. GPS data are then introduced for global optimization and georeferencing. Using the rigorous sensor model with the geometric observation equations of BA, a GPS-supported BA-SLAM approach that combines ray observations and GPS observations is then established. Finally, our method is applied to a set of vehicle-borne panoramic images captured in a campus environment, and several ground control points (GCPs) are used to check the localization accuracy. The results demonstrate that our method can reach an accuracy of several centimetres.
    Sensors 01/2012; 13(1):119-36. · 2.05 Impact Factor
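The combination of ray (bearing) observations and GPS observations in a single least-squares problem can be illustrated with a minimal 2D Gauss-Newton sketch. The landmark positions, noise standard deviations, and the single GPS fix below are invented for the example; a real BA-SLAM system would jointly estimate many camera poses and scene points using a full panoramic sensor model rather than a single 2D position.

```python
import math

# Known 2D landmark positions from the map (hypothetical values).
landmarks = [(5.0, 2.0), (1.0, 6.0), (4.0, 5.0)]
true_c = (1.0, 2.0)

# Noise-free bearing observations, for a deterministic sketch.
bearings = [math.atan2(ly - true_c[1], lx - true_c[0]) for lx, ly in landmarks]
gps_obs = (1.3, 1.8)          # a single noisy GPS fix
sigma_b, sigma_g = 0.01, 1.0  # bearing vs. GPS standard deviations

def wrap(a):
    """Wrap an angle difference into (-pi, pi]."""
    return (a + math.pi) % (2.0 * math.pi) - math.pi

c = list(gps_obs)  # initialize the camera position at the GPS fix
for _ in range(10):
    # Accumulate the normal equations H * dc = -g for the stacked residuals.
    H = [[0.0, 0.0], [0.0, 0.0]]
    g = [0.0, 0.0]
    for (lx, ly), z in zip(landmarks, bearings):
        dx, dy = lx - c[0], ly - c[1]
        r2 = dx * dx + dy * dy
        res = wrap(math.atan2(dy, dx) - z) / sigma_b   # whitened bearing residual
        J = (dy / r2 / sigma_b, -dx / r2 / sigma_b)    # d(bearing)/d(camera position)
        for i in range(2):
            g[i] += J[i] * res
            for j in range(2):
                H[i][j] += J[i] * J[j]
    # GPS residuals are linear in the camera position.
    for i in range(2):
        res = (c[i] - gps_obs[i]) / sigma_g
        g[i] += res / sigma_g
        H[i][i] += 1.0 / sigma_g ** 2
    # Solve the 2x2 system H * dc = -g by direct inversion.
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    dc0 = -(H[1][1] * g[0] - H[0][1] * g[1]) / det
    dc1 = -(-H[1][0] * g[0] + H[0][0] * g[1]) / det
    c[0] += dc0
    c[1] += dc1

print(f"estimated position: ({c[0]:.3f}, {c[1]:.3f})")
```

Because the bearing observations are far more precise than the GPS fix here, the solution converges essentially to the bearing-only triangulation, with GPS contributing the global georeference.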
  • ABSTRACT: A Gaussian sum filter (GSF) with extended Kalman filter (EKF) components is proposed as an approach to localizing an autonomous vehicle in an urban environment with limited GPS availability. The GSF uses vehicle-relative vision-based measurements of known map features, coupled with inertial navigation solutions, to accomplish localization in the absence of GPS. The vision-based measurements have multimodal measurement likelihood functions that are well represented as weighted sums of Gaussian densities. The GSF is used because it can represent the posterior distribution of the vehicle pose more efficiently (fewer terms, less computational complexity) than a corresponding bootstrap particle filter, owing to the filter's interaction with the measurement hypothesis tests. The expectation-maximization algorithm is used offline to determine the representational efficiency of the particle filter in terms of an effective number of Gaussian densities. In comparison, the GSF, which uses an iterative condensation procedure after each iteration of the filter to maintain real-time capabilities, is shown through a series of in-depth empirical studies to maintain a representation of the posterior distribution more accurately than the particle filter, using 37 min of data recorded from Cornell University's autonomous vehicle driven in an urban environment, including a 32 min GPS blackout. © 2012 Wiley Periodicals, Inc.
    Journal of Field Robotics 03/2012; 29(2):240-257. · 2.15 Impact Factor
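The measurement update of such a filter — propagate each Gaussian component through a Kalman update, reweight it by its marginal measurement likelihood, then condense the mixture to bound its size — can be sketched in 1D. The condensation step below is simple weight-based pruning, a stand-in for the paper's iterative condensation procedure, and all numbers are illustrative.

```python
import math

def gauss(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2.0 * math.pi * var)

def gsf_update(mixture, z, r):
    """One scalar measurement update of a Gaussian sum; each term is (w, mu, var)."""
    updated = []
    for w, mu, var in mixture:
        s = var + r                            # innovation variance
        k = var / s                            # Kalman gain
        updated.append((w * gauss(z, mu, s),   # reweight by marginal likelihood
                        mu + k * (z - mu),     # component mean update
                        (1.0 - k) * var))      # component variance update
    total = sum(w for w, _, _ in updated)
    return [(w / total, mu, var) for w, mu, var in updated]

def condense(mixture, max_terms):
    """Keep the heaviest components and renormalize (a simple pruning scheme)."""
    kept = sorted(mixture, reverse=True)[:max_terms]
    total = sum(w for w, _, _ in kept)
    return [(w / total, mu, var) for w, mu, var in kept]

# Bimodal prior: the vision detection could correspond to either of two mapped lanes.
prior = [(0.5, 0.0, 4.0), (0.5, 6.0, 4.0)]
post = condense(gsf_update(prior, z=5.5, r=1.0), max_terms=1)
mean = sum(w * mu for w, mu, _ in post)
print(f"posterior mean: {mean:.2f}")
```

The measurement resolves the lane ambiguity: the component near the observation absorbs nearly all the weight, and condensation discards the unsupported hypothesis.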
  • ABSTRACT: This paper presents a robot navigation system capable of online self-reconfiguration according to the needs imposed by the various contexts present in heterogeneous environments. The ability to cope with heterogeneous environments is key for a robust deployment of service robots in truly demanding scenarios. In the proposed system, flexibility is present at the several layers composing the robot's navigation system. At the lowest layer, appropriate locomotion modes are selected according to the environment's local context. At the highest layer, appropriate motion and path planning strategies are selected according to the environment's global context. While the local context is obtained directly from the robot's sensory input, the global context is inferred from semantic labels registered off-line on geo-referenced maps. The proposed system leverages the well-known Robot Operating System (ROS) framework for the implementation of the major navigation system components. The system was successfully validated in experiments approximately 1 km in length on INTROBOT, an all-terrain industrial-grade robot equipped with four independently steered wheels.
    Proc. of the IEEE Intl. Symp. on Industrial Electronics (ISIE), IEEE Press; 01/2013
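The layered selection described in that abstract — a planning strategy chosen from the map's semantic label, a locomotion mode chosen from locally sensed terrain — reduces to a dispatch pattern. The sketch below uses entirely hypothetical labels, planner names, and mode names, since the paper's actual taxonomy is not given in the abstract.

```python
# Hypothetical mappings; the paper's real contexts and strategies may differ.
PLANNERS = {"indoor": "grid_astar", "urban": "lattice_planner", "off_road": "rrt"}
LOCOMOTION = {"flat": "ackermann", "rough": "independent_steer", "narrow": "crab"}

def reconfigure(semantic_label, terrain_context):
    """Pick the high-level planner from the geo-referenced map label and the
    low-level locomotion mode from the robot's local sensory context."""
    planner = PLANNERS.get(semantic_label, "grid_astar")   # conservative fallback
    mode = LOCOMOTION.get(terrain_context, "ackermann")
    return planner, mode

print(reconfigure("off_road", "rough"))
```

Keeping the two lookups independent mirrors the paper's separation of concerns: the global context can change the planner without forcing a locomotion-mode switch, and vice versa.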