Conference Paper

Visual path following using only monocular vision for urban environments.

IRISA/INRIA, Rennes
DOI: 10.1109/IROS.2007.4399322 Conference: 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 29 - November 2, 2007, Sheraton Hotel and Marina, San Diego, California, USA
Source: DBLP

ABSTRACT This document provides a summary of a short video with the same title. The video shows the French intelligent transportation vehicle CyCab performing visual path following using only monocular vision. All phases of the process are shown with a spoken commentary. In the teaching phase, the user drives the robot manually while images from the camera are stored. Key images, with their corresponding image features, are stored as a map together with 2D and 3D local information. In the navigation phase, CyCab follows the learned path by tracking the image features projected from the map and applying a simple visual servoing control law.
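
The paper does not publish its implementation, so the following Python sketch only illustrates the teach-and-repeat idea described in the abstract. All names (KeyImage, steering_command, follow_path, the gain, and the camera/tracker/vehicle callables) are assumptions, and the control rule shown is a generic image-based visual-servoing law, not necessarily the one used on CyCab.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class KeyImage:
    """One node of the taught path: a stored frame plus its 2D features."""
    image: np.ndarray        # grayscale frame recorded during teaching
    features: np.ndarray     # (N, 2) pixel positions of the key features

def steering_command(current_feats, key_feats, gain=0.5, focal_px=700.0):
    """Generic image-based visual-servoing rule (an assumption, not the exact
    law used on CyCab): steer proportionally to the mean horizontal offset
    between the tracked features and their positions in the key image."""
    error_px = float(np.mean(current_feats[:, 0] - key_feats[:, 0]))
    return -gain * error_px / focal_px      # steering angle, in radians

def follow_path(path, grab_frame, track, send_steering, switch_px=5.0):
    """Navigation phase: servo on each key image in turn. `grab_frame`,
    `track` and `send_steering` stand in for the camera driver, the feature
    tracker and the vehicle interface (all hypothetical here)."""
    for key in path:
        while True:
            frame = grab_frame()
            feats = track(frame, key.features)          # (N, 2) tracked positions
            send_steering(steering_command(feats, key.features))
            # Switch to the next key image once the features are close enough.
            if np.mean(np.abs(feats[:, 0] - key.features[:, 0])) < switch_px:
                break
```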

Related publications:

  • ABSTRACT: Autonomous cars will likely play an important role in the future. A vision system designed to support outdoor navigation for such vehicles has to deal with large dynamic environments, changing imaging conditions, and temporary occlusions by other moving objects. This paper presents a novel appearance-based navigation framework relying on a single perspective vision sensor, aimed at resolving the above issues. The solution is based on a hierarchical environment representation created during a teaching stage, when the robot is controlled by a human operator. At the top level, the representation contains a graph of key images with extracted 2D features, enabling robust navigation by visual servoing. The information stored at the bottom level allows the locations of currently invisible features to be predicted efficiently, so that their tracking can be (re-)started. The outstanding property of the proposed framework is that it enables robust and scalable navigation without requiring a globally consistent map, even in interconnected environments. This result has been confirmed by realistic off-line experiments and successful real-time navigation trials in public urban areas. (A minimal sketch of this two-level representation is given after this list.)
    2007 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2007), Minneapolis, Minnesota, USA, 18-23 June 2007.
  • ABSTRACT: In this paper, a complete system for outdoor robot navigation that uses only monocular vision is presented. The robot is first guided along a path by a human. During this learning step, the robot records a video sequence. From this sequence, a three-dimensional map of the trajectory and the environment is built. Once this map has been computed, the robot is able to follow the same trajectory by itself. Experimental results obtained with an urban electric vehicle are shown and compared to ground truth. (The off-line map-building step is illustrated in a sketch after this list.)
    2005 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005).
  • ABSTRACT: This paper addresses the design of a control law for vision-based robot navigation. The proposed method is based on a topological representation of the environment. Within this context, a learning stage enables a graph to be built in which nodes represent views acquired by the camera, and edges denote the possibility for the robotic system to move from one image to another. A path-finding algorithm then gives the robot a collection of views describing the environment it has to go through in order to reach its desired position. This article focuses on the control law used to control the robot's motion online. The particularity of this control law is that it does not require any reconstruction of the environment, and does not force the robot to converge towards each intermediary position in the path. Landmarks matched between consecutive views of the path are considered as successive features that the camera has to observe within its field of view. An original visual servoing control law, using specific features, ensures that the robot navigates within the visibility path. Simulation results demonstrate the validity of the proposed approach. (The path-finding and steering steps are sketched after this list.)
    2006 IEEE International Conference on Robotics and Automation (ICRA 2006).
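
The two-level representation described in the first item above can be pictured with the following sketch. The class and function names (KeyView, build_view_graph, predict_lost_features) are illustrative assumptions, not the authors' code; the reprojection step only shows one plausible way the bottom-level local 3D information could be used to predict, and re-initialise tracking of, features that are currently out of sight.

```python
from dataclasses import dataclass
import numpy as np
import networkx as nx

@dataclass
class KeyView:
    feats_2d: np.ndarray     # (N, 2) feature positions used for visual servoing (top level)
    points_3d: np.ndarray    # (N, 3) local 3D estimates of the same features (bottom level)

def build_view_graph(key_views):
    """Top level: a graph of key images; an edge means the robot can servo
    directly from one view to the next along the taught path."""
    g = nx.Graph()
    for i, kv in enumerate(key_views):
        g.add_node(i, view=kv)
        if i > 0:
            g.add_edge(i - 1, i)
    return g

def predict_lost_features(view, R, t, K):
    """Bottom level: predict where currently invisible features should
    reappear by reprojecting their local 3D points with the current relative
    camera pose (R, t) and intrinsics K, so the tracker can be (re-)started."""
    cam = R @ view.points_3d.T + t.reshape(3, 1)   # 3xN points in the camera frame
    proj = K @ cam
    return (proj[:2] / proj[2]).T                  # (N, 2) predicted pixel positions
```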
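
For the IROS 2005 system in the second item, the off-line map-building step amounts to reconstructing 3D landmarks from the learning video. The sketch below is only an illustration of that idea using OpenCV triangulation between two key frames with estimated poses; the function name and the choice of OpenCV are assumptions, not the authors' reconstruction pipeline.

```python
import numpy as np
import cv2

def triangulate_landmarks(K, pose_a, pose_b, pts_a, pts_b):
    """K: 3x3 intrinsics; pose_a, pose_b: 3x4 [R|t] matrices of two key
    frames estimated from the learning video; pts_a, pts_b: (N, 2) matched
    pixel coordinates. Returns the (N, 3) triangulated 3D points."""
    P_a = K @ pose_a                              # 3x4 projection matrices
    P_b = K @ pose_b
    pts_h = cv2.triangulatePoints(P_a, P_b,
                                  pts_a.T.astype(np.float64),
                                  pts_b.T.astype(np.float64))   # 4xN homogeneous
    return (pts_h[:3] / pts_h[3]).T
```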
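
For the ICRA 2006 approach in the last item, the following sketch illustrates the two ideas of its abstract: path finding over the topological view graph, and a steering rule driven only by landmarks matched with the next key view. The graph library, function names, and the proportional rule are assumptions for illustration, not the paper's control law.

```python
import numpy as np
import networkx as nx

def views_to_goal(view_graph, current_view, goal_view):
    """Path finding on the topological graph: the result is the ordered
    collection of key views the robot has to traverse."""
    return nx.shortest_path(view_graph, current_view, goal_view)

def steer_towards_next_view(current_pts, next_view_pts, gain=0.5):
    """Qualitative steering rule in the spirit of the paper: keep the
    landmarks shared with the next key view inside the field of view by
    reducing the horizontal offset between their current positions and
    their positions in that view, without converging exactly onto each
    intermediary view and without any 3D reconstruction."""
    offset = float(np.mean(next_view_pts[:, 0] - current_pts[:, 0]))
    return gain * offset     # sign convention is an arbitrary assumption
```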
