Conference Paper

Data fusion improves the coverage of wireless sensor networks

DOI: 10.1145/1614320.1614338 Conference: Proceedings of the 15th Annual International Conference on Mobile Computing and Networking, MOBICOM 2009, Beijing, China, September 20-25, 2009
Source: DBLP


Wireless sensor networks (WSNs) are increasingly deployed for critical applications such as security surveillance and environmental monitoring. An important performance measure of such applications is sensing coverage, which characterizes how well a sensing field is monitored by a network. Although advanced collaborative signal processing algorithms have been adopted by many existing WSNs, most previous analytical studies on sensing coverage are conducted based on overly simplistic sensing models (e.g., the disc model) that do not capture the stochastic nature of sensing. In this paper, we attempt to bridge this gap by exploring the fundamental limits of coverage based on stochastic data fusion models that fuse noisy measurements of multiple sensors. We derive the scaling laws between coverage, network density, and signal-to-noise ratio (SNR). We show that data fusion can significantly improve sensing coverage by exploiting the collaboration among sensors. In particular, for a signal path-loss exponent k (typically between 2.0 and 5.0), rho_f = O(rho_d^(1-1/k)), where rho_f and rho_d are the densities of uniformly deployed sensors that achieve full coverage under the fusion and disc models, respectively. Our results help explain the limitations of the previous analytical results based on the disc model and provide key insights into the design of WSNs that adopt data fusion algorithms. Our analyses are verified through extensive simulations based on both synthetic data sets and data traces collected in a real deployment for vehicle detection.
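The fusion model described above sums noisy signal measurements from multiple sensors, where the signal decays with distance according to the path-loss exponent k, and declares an event when the fused sum exceeds a threshold. A minimal sketch of such a value-fusion rule; the parameter values (source power, noise level, threshold) are illustrative assumptions, not the paper's:

```python
import math
import random

def value_fusion_detect(sensor_positions, event, k=3.0, source_power=100.0,
                        noise_sigma=1.0, threshold=None, rng=None):
    """Value fusion: sum every sensor's noisy signal measurement and
    declare a detection if the sum exceeds a threshold. The signal
    attenuates with distance d as source_power / d**k."""
    rng = rng or random.Random(0)  # fixed seed keeps the sketch reproducible
    total = 0.0
    for (x, y) in sensor_positions:
        d = max(math.hypot(x - event[0], y - event[1]), 0.1)  # avoid d == 0
        total += source_power / d**k + rng.gauss(0.0, noise_sigma)
    if threshold is None:
        # naive choice: 3 standard deviations of the noise-only sum
        threshold = 3.0 * noise_sigma * math.sqrt(len(sensor_positions))
    return total > threshold

sensors = [(0.0, 1.0), (1.0, 0.0), (0.0, -1.0), (-1.0, 0.0)]
value_fusion_detect(sensors, (0.0, 0.0))                      # event amid the sensors
value_fusion_detect(sensors, (100.0, 100.0), threshold=50.0)  # event far away
```

Because noisy contributions from distant sensors also enter the sum, the detection region of the fused network is larger than the union of individual sensing discs, which is the intuition behind the rho_f = O(rho_d^(1-1/k)) density saving.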



Available from: Chih-Wei Yi, Feb 09, 2015
  • Source
    • "For every pair of points, there is a probability of a sensor located at one point detecting an event at the other, and coverage is defined by the probability that at least one sensor detects an event. Value fusion is a different way to detect events [13]–[15]. A decision that an event occurred is made if the sum of all signal strengths is above some threshold. "
    ABSTRACT: An important problem in the study of sensor networks is how to select a set of sensors that maximizes coverage of other sensors. Given pairwise coverage values, three commonly found functions give some estimate of the aggregate coverage possible by a set of sensors: maximum coverage by any selected sensor (MAX), total coverage by all selected sensors (SUM), and the probability of correct prediction by at least one sensor (PROB). MAX and SUM are two extremes of possible coverage, while PROB, based on an independence assumption, is in the middle. This paper addresses the following question: what guarantees can be made of coverage that is evaluated by an unknown sub-modular function of coverage when sensors are selected according to MAX, SUM, or PROB? We prove that the guarantees are very bad: In the worst case, coverage differs by a factor of sqrt(n), where n is the number of sensors. We show in simulations on synthetic and real data that the differences can be quite high as well. We show how to potentially address this problem using a hybrid of the coverage functions.
    DCOSS; 06/2015
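The three aggregate coverage functions this abstract compares (MAX, SUM, PROB) can be written out directly; the PROB form relies on the independence assumption the abstract mentions:

```python
def coverage_max(probs):
    """MAX: coverage achieved by the best single selected sensor."""
    return max(probs)

def coverage_sum(probs):
    """SUM: total coverage over all selected sensors (may exceed 1)."""
    return sum(probs)

def coverage_prob(probs):
    """PROB: probability that at least one sensor detects the event,
    assuming independent per-sensor detection probabilities."""
    miss = 1.0
    for p in probs:
        miss *= 1.0 - p
    return 1.0 - miss

# Two sensors, each detecting with probability 0.5:
coverage_max([0.5, 0.5])   # 0.5
coverage_sum([0.5, 0.5])   # 1.0
coverage_prob([0.5, 0.5])  # 0.75
```

The example shows why PROB sits between the two extremes: MAX ignores all but one sensor, SUM double-counts overlapping coverage, and PROB discounts overlap under independence.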
  • Source
    • "measurements) or decisions (e.g. estimations) [30]. In the former case, measurements or features are fused to obtain the global estimate. "
    ABSTRACT: The choice of the most suitable fusion scheme for smart camera networks depends on the application as well as on the available computational and communication resources. In this paper we discuss and compare the resource requirements of five fusion schemes, namely centralised fusion, flooding, consensus, token passing and dynamic clustering. The Extended Information Filter is applied to each fusion scheme to perform target tracking. Token passing and dynamic clustering involve negotiation among viewing nodes (cameras observing the same target) to decide which node should perform the fusion process, whereas flooding and consensus do not include this negotiation. Negotiation helps limit the number of participating cameras and reduces the resources required for the fusion process itself, but requires additional communication. Consensus has the highest communication and computation costs, but it is the only scheme that can be applied when not all viewing nodes are connected directly and routing tables are not available.
    8th ACM / IEEE International Conference on Distributed Smart Cameras (ICDSC 2014); 11/2014
  • Source
    • "Sensor networks are ubiquitous across many different domains, including wireless communications, temperature and process control, area surveillance, object tracking and numerous other fields [2] [6]. Large performance gains can be achieved in such networks by performing data fusion between the sensors, or combining information from the individual sensors to reach system-level decisions [9] [16] [24] [26]. The sensors are typically connected by wireless links to either a separate information collector (centralized fusion) or to each other (distributed fusion). "
    ABSTRACT: This paper develops a mathematical and computational framework for analyzing the expected performance of Bayesian data fusion, or joint statistical inference, within a sensor network. We use variational techniques to obtain the posterior expectation as the optimal fusion rule under a deterministic constraint and a quadratic cost, and study the smoothness and other properties of its classification performance. For a certain class of fusion problems, we prove that this fusion rule is also optimal in a much wider sense and satisfies strong asymptotic convergence results. We show how these results apply to a variety of examples with Gaussian, exponential and other statistics, and discuss computational methods for determining the fusion system's performance in more general, large-scale problems. These results are motivated by studying the performance of fusing multi-modal radar and acoustic sensors for detecting explosive substances, but have broad applicability to other Bayesian decision problems.
    11/2013; 3(4). DOI:10.1093/imaiai/iau009
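For the Gaussian case this abstract mentions, the posterior expectation has a closed form: with a Gaussian prior on the unknown x and independent Gaussian sensor noise, the fused estimate is the precision-weighted average of the prior mean and the measurements. A small illustrative sketch (the function and variable names are ours, not the paper's):

```python
def gaussian_fusion(measurements, variances, prior_mean=0.0, prior_var=1.0):
    """Fuse measurements y_i = x + n_i, with n_i ~ N(0, variances[i])
    and prior x ~ N(prior_mean, prior_var). Returns the posterior
    (mean, variance); the posterior mean is the fusion rule that is
    optimal under a quadratic cost."""
    precision = 1.0 / prior_var          # accumulate inverse variances
    weighted = prior_mean / prior_var    # accumulate precision-weighted values
    for y, v in zip(measurements, variances):
        precision += 1.0 / v
        weighted += y / v
    post_var = 1.0 / precision
    return weighted * post_var, post_var

# One unit-variance measurement of 1.0 against a standard-normal prior
# pulls the estimate halfway: posterior mean 0.5, posterior variance 0.5.
mean, var = gaussian_fusion([1.0], [1.0])
```

Each additional sensor adds its precision (1/variance) to the posterior precision, so the posterior variance shrinks monotonically as sensors are fused, which is the sense in which fusing more sensors always helps in this model.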