Localization of Sensors and Objects in Distributed Omnidirectional Vision

Source: CiteSeer


application systems based on simple computer vision techniques. In particular, a practical approach that has recently attracted attention is to use multiple vision sensors with simple visual processing.

  • Source
    • "For these reasons, our proposed technique uses pedestrians moving through the environment as features for calibrating sensor positions. Similar work in multi-sensor localization using pedestrians as reference features has been performed with omnidirectional cameras [19], although these techniques cannot be directly applied due to fundamental differences in the nature of LRF and video data. "
    ABSTRACT: Laser range finders (LRF’s) are non-invasive sensors which can be used for high-precision and anonymous tracking of pedestrians in social environments. Such sensor networks can be used in robotics to assist in navigation and human–robot interaction. Typically, multiple LRF’s are used together for such tasks, and the relative positions of these sensors must be precisely calibrated. We propose a technique for estimating relative LRF positions using observations of social groups in the pedestrian flow as keypoint features for determining coarse estimates of relative sensor offsets. The most likely offset is estimated using a generalized Hough transform and used to identify sets of possible shared observations of individual pedestrians between pairs of sensors. Outliers are rejected using the RANSAC technique, and the resulting shared observations from each sensor pair are combined into a constraint matrix for the sensor network, which is solved using least-squares minimization. Results show calibration accuracy of sensor positions within 34 mm and 0.51°, and an analysis of pedestrian data collected from ubiquitous networks in three public and commercial spaces shows that the proposed calibration technique enables pedestrian tracking within 11 cm accuracy.
    Advanced Robotics 02/2014; 28(9). DOI:10.1080/01691864.2013.879272 · 0.57 Impact Factor
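The calibration pipeline in the abstract above ends by combining shared pedestrian observations into a constraint matrix solved by least-squares minimization. As a minimal illustrative sketch (not the paper's implementation), the core least-squares step for a single sensor pair can be written as a closed-form 2D rigid alignment (Kabsch/Procrustes) between matched pedestrian positions; all function and variable names here are hypothetical:

```python
import numpy as np

def estimate_relative_pose(pts_a, pts_b):
    """Estimate the 2D rotation R and translation t mapping sensor B's
    frame onto sensor A's frame from matched pedestrian observations.

    pts_a, pts_b: (N, 2) arrays of the same pedestrians seen by each sensor.
    Returns (R, t) minimizing sum ||R @ b_i + t - a_i||^2 (2D Kabsch)."""
    ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)
    A, B = pts_a - ca, pts_b - cb
    # SVD of the cross-covariance gives the optimal rotation;
    # the sign term guards against a reflection solution.
    U, _, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ca - R @ cb
    return R, t

# Synthetic check: apply a known offset, then recover it.
rng = np.random.default_rng(0)
pts_b = rng.uniform(-5.0, 5.0, size=(20, 2))
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([1.5, -0.7])
pts_a = pts_b @ R_true.T + t_true
R, t = estimate_relative_pose(pts_a, pts_b)
```

In the paper's setting, the Hough-transform and RANSAC stages would supply the matched point sets; the pairwise estimates are then stacked into a network-wide least-squares problem rather than solved pair by pair.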
  • Source
    • "Another family of techniques often used for sensor localization utilizes natural or artificial landmarks in the environment to triangulate the positions of sensors. A thorough analysis of multi-sensor localization problems, together with proposed techniques for qualitative localization that could be applied to LRFs, is presented in [21], although the main focus there is on localization and tracking with omnidirectional cameras. Other research has focused on localizing sensor networks based on distances and connectivity between the nodes, where the sensors themselves are landmarks [22]. "
    ABSTRACT: Laser range finders are a non-invasive tool which can be used for anonymously tracking the motion of people and robots in real-world environments with high accuracy. Based on a commercial system we have developed, this paper addresses two practical issues of using networks of portable laser range finders in field environments. We first describe a technique for automated calibration of sensor positions and orientations, by using velocity-based matching of observed human trajectories to define constraints between the sensors. We then propose a mechanism for detecting when a sensor has been moved out of alignment, which can be used to alert an operator of the condition and automatically exclude erroneous data from tracking calculations. After describing our techniques for solving these problems, we demonstrate the effectiveness of our calibration and error detection systems in live trials with our real-time system, as well as offline tests based on scan data recorded from field trials.
    2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan; 01/2010
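The abstract above uses velocity-based matching of observed trajectories to decide when two sensors are watching the same pedestrian. One way to see why velocity works is that per-step speed is invariant to the unknown rigid offset between sensor frames, so speed profiles can be compared directly. A minimal sketch under that assumption (hypothetical names, not the paper's code):

```python
import numpy as np

def speed_profile(traj, dt=1.0):
    """Per-step speeds of an (N, 2) trajectory.
    Invariant to rotation and translation of the sensor frame."""
    return np.linalg.norm(np.diff(traj, axis=0), axis=1) / dt

def match_score(traj_a, traj_b):
    """Normalized correlation of two equal-length speed profiles.
    A score near 1 suggests both sensors observe the same pedestrian."""
    sa, sb = speed_profile(traj_a), speed_profile(traj_b)
    sa = (sa - sa.mean()) / (sa.std() + 1e-9)
    sb = (sb - sb.mean()) / (sb.std() + 1e-9)
    return float(np.mean(sa * sb))

# Same pedestrian seen from a rotated, translated sensor frame
# should match almost perfectly.
rng = np.random.default_rng(1)
traj = np.cumsum(rng.normal(size=(30, 2)), axis=0)
theta = np.deg2rad(45.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
score = match_score(traj, traj @ R.T + np.array([3.0, -2.0]))
```

Matched trajectory pairs found this way can then define the inter-sensor constraints used for calibration, as the abstract describes.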
  • Source
    ABSTRACT: Wireless Multimedia Sensor Networks (WMSNs) are gaining popularity among researchers over the past few years. Knowledge of the geographic locations of the sensor nodes is very important in a WMSN. In this paper we propose a new algorithm which uses the connectivity information, the estimated distance information among the sensor nodes, as well as the vision images to find the location of the sensor nodes to enable sensor calibration. We achieve this by solving an ID association problem in the WMSN. We then generate local maps for nodes in immediate vicinity and merge them together to get a global map. We demonstrate the effectiveness of our proposed approach through computer simulation.
    Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on; 11/2010
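The WMSN abstract above localizes nodes from estimated inter-node distances and connectivity. The basic range-based step can be sketched as 2D multilateration: given distances from an unknown node to known anchor nodes, subtracting the first range equation from the others linearizes the problem into ordinary least-squares. This is a generic sketch of that technique, not the paper's algorithm:

```python
import numpy as np

def locate_node(anchors, dists):
    """Estimate a node's 2D position from distances to known anchors.

    Linearize ||x - a_i||^2 = d_i^2 against the first anchor:
      2 (a_i - a_0) . x = d_0^2 - d_i^2 + ||a_i||^2 - ||a_0||^2
    and solve the resulting linear system by least squares."""
    anchors, dists = np.asarray(anchors, float), np.asarray(dists, float)
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Synthetic check with exact (noise-free) distances.
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [5.0, 5.0]])
x_true = np.array([1.0, 2.0])
dists = np.linalg.norm(anchors - x_true, axis=1)
est = locate_node(anchors, dists)
```

In the paper, such range-based estimates are combined with connectivity information and vision data to build local maps that are merged into a global one.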
