Conference Paper

LISTEN: Non-interactive Localization in Wireless Camera Sensor Networks

DOI: 10.1109/RTSS.2010.15 Conference: Proceedings of the 31st IEEE Real-Time Systems Symposium, RTSS 2010, San Diego, California, USA, November 30 - December 3, 2010
Source: DBLP

ABSTRACT Recent advances in application domains increasingly demand the use of wireless camera sensor networks (WCSNs), for which localization is a crucial task enabling various location-based services. Most existing localization approaches for WCSNs are essentially interactive, i.e., they require interaction among the nodes throughout the localization process. As a result, they are costly to realize in practice, vulnerable to sniffer attacks, and inefficient in both energy consumption and computation. In this paper we propose LISTEN, a non-interactive localization approach. Using LISTEN, each camera sensor node only needs to silently listen to the beacon signals from a mobile beacon node and capture a few images until it determines its own location. We design the movement trajectory of the mobile beacon node so that all nodes are guaranteed to be localized successfully. We have implemented LISTEN and evaluated it through extensive experiments. The experimental results demonstrate that it is accurate, efficient, and suitable for WCSNs that consist of low-end camera sensors.

    ABSTRACT: A collaborative vision-based technique is proposed for localizing the nodes of a surveillance network based on observations of a non-cooperative moving target. The proposed method employs lightweight in-node image processing and limited data exchange between the nodes to determine the positions and orientations of the nodes participating in synchronized observations of the target. A node with an opportunistic observation of a passing target broadcasts a synchronizing packet and triggers image capture by its neighbors. In the cluster of participating nodes, the triggering node and a helper node define a relative coordinate system. Once a small number of joint observations of the target are made by the nodes, the model allows for a decentralized or a cluster-based solution for the localization problem. No images are transferred between the network nodes for the localization task, making the proposed method efficient and scalable. Simulation and experimental results are provided to verify the performance of the proposed technique.
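The relative-coordinate-system idea above can be sketched as follows. This is a hypothetical 2D illustration of the geometry, not the paper's implementation: the triggering node is placed at the origin and the helper node on the x-axis (their baseline defines the cluster frame), and two synchronized bearing observations of the target triangulate its position in that frame. Function and parameter names are ours.

```python
import math

def target_in_relative_frame(baseline, bearing_a, bearing_b):
    """Triangulate a jointly observed target in the cluster frame.

    Node A (the triggering node) sits at the origin; node B (the helper)
    sits at (baseline, 0), which defines the relative coordinate system.
    bearing_a and bearing_b are the angles (radians, CCW from the +x
    axis) at which A and B see the same target in a synchronized capture.
    Returns the target position (x, y) in the relative frame.
    """
    # Ray from A: t*(cos a, sin a); ray from B: (baseline, 0) + s*(cos b, sin b).
    # Solving the 2x2 system for t gives t = baseline*sin(b)/sin(b - a).
    denom = math.sin(bearing_b - bearing_a)
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; rays do not intersect")
    t = baseline * math.sin(bearing_b) / denom
    return (t * math.cos(bearing_a), t * math.sin(bearing_a))
```

With the target triangulated from each joint observation, nodes that also saw the target can in turn solve for their own pose in the same frame, which is the decentralized solution the abstract refers to.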
    ABSTRACT: We present the design, implementation, and evaluation of BeepBeep, a high-accuracy acoustic-based ranging system. It operates in a spontaneous, ad-hoc, and device-to-device context without leveraging any pre-planned infrastructure. It is a pure software-based solution and uses only the most basic set of commodity hardware - a speaker, a microphone, and some form of device-to-device communication - so that it is readily applicable to many low-cost sensor platforms and to most commercial-off-the-shelf mobile devices like cell phones and PDAs. It achieves high accuracy through a combination of three techniques: two-way sensing, self-recording, and sample counting. The basic idea is the following. To estimate the range between two devices, each will emit a specially-designed sound signal ("Beep") and collect a simultaneous recording from its microphone. Each recording should contain two such beeps, one from its own speaker and the other from its peer. By counting the number of samples between these two beeps and exchanging the time duration information with its peer, each device can derive the two-way time of flight of the beeps at the granularity of the sound sampling rate. This technique cleverly avoids many sources of inaccuracy found in other typical time-of-arrival schemes, such as clock synchronization, non-real-time handling, software delays, etc. Our experiments on two common cell phone models have shown that we can achieve around one or two centimeters accuracy within a range of more than ten meters, despite a series of technical challenges in implementing the idea.
    Proceedings of the 5th International Conference on Embedded Networked Sensor Systems, SenSys 2007, Sydney, NSW, Australia, November 6-9, 2007
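The sample-counting arithmetic described in this abstract reduces to a short formula. The sketch below is our own minimal reconstruction, not BeepBeep's code: it assumes each device has already located the two beeps in its own recording and reports the elapsed sample count between them, and it folds both speaker-to-microphone offsets into one optional correction term.

```python
# BeepBeep-style two-way sample-counting ranging (a sketch, not the
# authors' implementation).  Each device records both beeps and counts
# the samples between hearing its own beep and hearing the peer's beep.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def beepbeep_distance(n_a, n_b, fs=44100, self_offsets=0.0):
    """Estimate the device-to-device distance D in metres.

    n_a: samples between A hearing its own beep and hearing B's beep.
    n_b: the same quantity measured independently at B.
    fs:  common sampling rate in Hz.
    self_offsets: (d_AA + d_BB) / 2, half the summed speaker-to-mic
        distances on the two devices (metres), typically centimetres.

    The unknown emission times cancel when the two elapsed times are
    summed: t_A + t_B = 2*D/c - (d_AA + d_BB)/c, hence:
    """
    return SPEED_OF_SOUND * (n_a + n_b) / (2.0 * fs) + self_offsets
```

Because only sample counts are exchanged, no clock synchronization between the devices is needed, which is the key point of the two-way sensing technique.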
    ABSTRACT: This work proposes the novel use of spinning beacons for precise indoor localization. The proposed "SpinLoc" (Spinning Indoor Localization) system uses "spinning" (i.e., rotating) beacons to create and detect predictable and highly distinguishable Doppler signals for sub-meter localization accuracy. The system analyzes Doppler frequency shifts of signals from spinning beacons, which are then used to calculate orientation angles to a target. By obtaining orientation angles from two or more beacons, SpinLoc can precisely locate stationary or slow-moving targets. After designing and implementing the system using MICA2 motes, its performance was tested in an indoor garage environment. The experimental results revealed a median error of 40-50 centimeters and a 90% error of 70-90 centimeters.
    Proceedings of the 6th International Conference on Embedded Networked Sensor Systems, SenSys 2008, Raleigh, NC, USA, November 5-7, 2008
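The orientation-angle calculation sketched in this abstract can be illustrated as follows. This is a hypothetical far-field reconstruction under idealized assumptions, not the SpinLoc implementation: a beacon spinning on a circle produces a sinusoidal Doppler shift f_d(theta) = A*sin(phi - theta), where theta is the beacon's rotation angle and phi the bearing to the target, so a least-squares fit of the two quadrature components recovers phi.

```python
import math

def bearing_from_doppler(doppler_hz, thetas):
    """Estimate the bearing phi to a target from Doppler-shift samples.

    doppler_hz: measured Doppler shifts f_d(theta_i) of the spinning
        beacon's carrier, in Hz.
    thetas: the beacon's rotation angles theta_i in radians, assumed
        uniformly spaced over whole revolutions.

    Far-field model:
        f_d(theta) = A*sin(phi - theta)
                   = (A sin phi)*cos(theta) - (A cos phi)*sin(theta)
    so fitting coefficients a = A sin phi and b = -A cos phi by
    least squares yields phi = atan2(a, -b).
    """
    n = len(thetas)
    a = 2.0 / n * sum(f * math.cos(t) for f, t in zip(doppler_hz, thetas))
    b = 2.0 / n * sum(f * math.sin(t) for f, t in zip(doppler_hz, thetas))
    return math.atan2(a, -b) % (2.0 * math.pi)
```

Bearings obtained this way from two or more beacons at known positions can then be intersected to localize the target, as the abstract describes.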