Conference Paper

LISTEN: Non-interactive Localization in Wireless Camera Sensor Networks.

DOI: 10.1109/RTSS.2010.15 Conference: Proceedings of the 31st IEEE Real-Time Systems Symposium, RTSS 2010, San Diego, California, USA, November 30 - December 3, 2010
Source: DBLP


Recent advances in application fields increasingly demand the use of wireless camera sensor networks (WCSNs), for which localization is a crucial task enabling various location-based services. Most existing localization approaches for WCSNs are essentially interactive, i.e., they require interaction among the nodes throughout the localization process. As a result, they are costly to realize in practice, vulnerable to sniffer attacks, and inefficient in both energy consumption and computation. In this paper we propose LISTEN, a non-interactive localization approach. Using LISTEN, every camera sensor node only needs to silently listen to the beacon signals from a mobile beacon node and capture a few images until it determines its own location. We design the movement trajectory of the mobile beacon node so that all nodes are guaranteed to be localized successfully. We have implemented LISTEN and evaluated it through extensive experiments. The experimental results demonstrate that it is accurate, efficient, and suitable for WCSNs that consist of low-end camera sensors.
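The abstract describes each camera node passively overhearing position-stamped beacon packets and capturing a few images before computing its own location. A minimal sketch of that idea, assuming a bearing-only formulation: the node measures the absolute bearing to the beacon in each captured image, pairs it with the beacon's broadcast position, and solves a small least-squares problem for its own coordinates. The function names and the estimator below are illustrative assumptions, not the paper's actual algorithm.

```python
import math

def localize(observations):
    """Estimate the camera node's (x, y) position from passive observations.

    observations: list of (bx, by, bearing), where (bx, by) is the position
    the mobile beacon broadcast, and `bearing` is the absolute angle (radians)
    from the camera to the beacon, measured from a captured image.

    Each observation constrains the camera to the line through the beacon
    with direction `bearing`:
        sin(t) * x - cos(t) * y = sin(t) * bx - cos(t) * by
    We accumulate the 2x2 normal equations and solve for (x, y).
    """
    a11 = a12 = a22 = r1 = r2 = 0.0
    for bx, by, theta in observations:
        s, c = math.sin(theta), math.cos(theta)
        d = s * bx - c * by          # right-hand side of the line constraint
        a11 += s * s
        a12 += -s * c
        a22 += c * c
        r1 += s * d
        r2 += -c * d
    det = a11 * a22 - a12 * a12      # nonzero if bearings are not all parallel
    x = (a22 * r1 - a12 * r2) / det
    y = (a11 * r2 - a12 * r1) / det
    return x, y

# Example: camera at (1, 2) overhears a beacon at (3, 2) (bearing 0) and
# later at (1, 5) (bearing pi/2); two non-parallel bearings fix the position.
print(localize([(3.0, 2.0, 0.0), (1.0, 5.0, math.pi / 2)]))
```

Because the node only listens and never transmits during localization, this kind of estimator matches the abstract's claims of low energy cost and resistance to sniffer attacks: there is simply no node traffic to intercept.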

  • Source
    • "persons) or by detecting common features among overlapping fields of view [34]. In addition, mobile nodes [35] and even robots [36] [37] [38] can be used to locate the position of the cameras, and then the camera network can localise the robot accurately. For instance, in [37] Rekleitis et al. were able to calibrate a network of 7 cameras in a 450 m² environment."
    ABSTRACT: In this paper, we propose a multi-sensor fusion algorithm based on particle filters for mobile robot localisation in crowded environments. Our system is able to fuse the information provided by sensors placed on-board and sensors external to the robot (off-board). We also propose a methodology for fast system deployment, map construction, and sensor calibration with a limited number of training samples. We validated our proposal experimentally with a laser range-finder, a WiFi card, a magnetic compass, and an external multi-camera network. We have carried out experiments that validate our deployment and calibration methodology. Moreover, we performed localisation experiments in controlled situations and real robot operation in social events. We obtained the best results from the fusion of all the sensors available: the precision and stability were sufficient for mobile robot localisation. No single sensor is reliable in every situation, but nevertheless our algorithm works with any subset of sensors: if a sensor is not available, the performance just degrades gracefully.
    Information Fusion, vol. 27, April 2015. DOI: 10.1016/j.inffus.2015.03.006
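The cited abstract's key property — fusing whatever sensors happen to be available, with graceful degradation when some are missing — can be sketched with a minimal particle-filter update. The sensor models and function names below are illustrative assumptions, not the cited paper's implementation: each particle is a candidate robot position, each available sensor multiplies in a Gaussian likelihood weight, and unavailable sensors are simply skipped.

```python
import math
import random

def gaussian_likelihood(measured, predicted, sigma):
    """Unnormalized Gaussian likelihood of a measurement given a prediction."""
    err = measured - predicted
    return math.exp(-0.5 * (err / sigma) ** 2)

def pf_update(particles, measurements, sensor_models):
    """One particle-filter measurement update with sensor fusion.

    particles:     list of (x, y) position hypotheses.
    measurements:  dict sensor_name -> measured value; an absent key means
                   the sensor is unavailable this cycle.
    sensor_models: dict sensor_name -> (predict_fn, sigma), where predict_fn
                   maps a particle to the value that sensor would report.
    """
    weights = []
    for p in particles:
        w = 1.0
        for name, (predict, sigma) in sensor_models.items():
            if name in measurements:  # fuse only the sensors that reported
                w *= gaussian_likelihood(measurements[name], predict(p), sigma)
        weights.append(w)
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportionally to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

# Example: a hypothetical range sensor measuring distance to the origin.
models = {"range": (lambda p: math.hypot(p[0], p[1]), 0.5)}
particles = [(0.0, 0.0), (5.0, 5.0)]
# A range reading of 0 concentrates the particle set near the origin.
print(pf_update(particles, {"range": 0.0}, models))
```

If the `measurements` dict is empty, every particle keeps weight 1 and resampling is uniform, which mirrors the paper's claim that the filter works with any subset of sensors and merely degrades gracefully as sensors drop out.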