Thesis (PDF available)

Ensuring the resilience of wireless sensor networks to malicious data injections through measurements inspection


Abstract

Malicious data injections pose a severe threat to systems based on Wireless Sensor Networks (WSNs), since they give the attacker control over the measurements and, in turn, over the system's status and response. Malicious measurements are particularly threatening when used to spoof or mask events of interest, thus eliciting or preventing desirable responses. Spoofing and masking attacks are particularly difficult to detect since they depict plausible behaviours, especially if multiple sensors have been compromised and collude to inject a coherent set of malicious measurements. Previous work has tackled the problem through measurements inspection, which analyses the inter-measurement correlations induced by the physical phenomena. However, these techniques consider simplistic attacks and are not robust to collusion. Moreover, they assume highly predictable patterns in the distribution of measurements, which are invalidated by the unpredictability of events. We design a set of techniques that effectively detect malicious data injections in the presence of sophisticated collusion strategies, when one or more events manifest. Moreover, we build a methodology to characterise the likely compromised sensors. We also design diagnosis criteria that allow us to distinguish anomalies arising from malicious interference from those arising from faults. In contrast with previous work, we test the robustness of our methodology with automated and sophisticated attacks, in which the attacker aims to evade detection, and conclude that our approach outperforms state-of-the-art approaches. Moreover, we quantitatively estimate the WSN's degree of resilience and provide a methodology that gives a WSN owner an assured degree of resilience by automatically designing the WSN deployment. Finally, to deal with the extreme scenario in which the attacker has compromised most of the WSN, we propose a combination with software attestation techniques, which are more reliable when malicious data originates from compromised software but are also more expensive; the combination achieves an excellent trade-off between cost and resilience.
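As a rough illustration of the measurements-inspection idea the abstract refers to, and not the thesis's actual detector, the sketch below flags sensors whose measurements stop agreeing with those of their neighbours; the window size, the threshold, and the use of Pearson correlation are assumptions made purely for illustration.

    # Illustrative sketch only: flag sensors whose measurements stop agreeing
    # with their neighbours. Window size, threshold and the use of Pearson
    # correlation are assumptions, not the thesis's estimator.
    import numpy as np

    def suspicious_sensors(readings, window=50, threshold=0.3):
        """readings: array of shape (n_samples, n_sensors)."""
        recent = readings[-window:]                  # latest window of samples
        corr = np.corrcoef(recent, rowvar=False)     # sensor-by-sensor correlation matrix
        np.fill_diagonal(corr, np.nan)
        agreement = np.nanmean(corr, axis=1)         # mean correlation with the other sensors
        return np.where(agreement < threshold)[0]

    rng = np.random.default_rng(0)
    phenomenon = np.cumsum(rng.normal(size=200))                     # shared physical signal
    readings = phenomenon[:, None] + rng.normal(scale=0.1, size=(200, 5))
    readings[150:, 2] = rng.normal(scale=2.0, size=50)               # sensor 2 injects unrelated values
    print(suspicious_sensors(readings))                              # expected to report [2]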
... For a reliable sensor network, there is then a need to discern between genuine measurements and measurements with systemic errors. To detect false data through measurements inspection, many anomaly detection techniques exist, such as outlier detection and statistical tests [12]. These techniques are unable to detect false data in the event of collusion; however, this could be achieved by exploiting relationships between observable sensor measurements. ...
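As a purely illustrative example of the kind of statistical test mentioned in the snippet above (and not a collusion-resistant one), a modified z-score check against the per-instant consensus of the other sensors could look as follows; the threshold is an assumption.

    # Illustrative baseline only: a per-sample statistical test that compares each
    # sensor against the robust consensus of its peers. Threshold is an assumption.
    import numpy as np

    def zscore_outliers(sample, threshold=3.0):
        """sample: 1-D array with one reading per sensor at a given instant."""
        median = np.median(sample)
        mad = np.median(np.abs(sample - median)) or 1e-9   # robust spread estimate
        z = 0.6745 * (sample - median) / mad               # modified z-score
        return np.where(np.abs(z) > threshold)[0]

    print(zscore_outliers(np.array([20.1, 19.8, 20.3, 35.0, 20.0])))   # -> [3]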
... Our assumptions about the attacker's resources and knowledge closely follow those described by Illiano [12]. The assumptions made are the following: ...
Conference Paper
We propose a novel framework to detect false data injections in a low-density sensor environment with heterogeneous sensor data. The proposed detection algorithm learns how each sensor's data correlates within the sensor network, and false data is identified by exploiting the anomalies in these correlations. When a large number of sensors measuring homogeneous data are deployed, data correlations in space at a fixed snapshot in time can be used as a basis to detect anomalies. Exploiting the disruptions in correlations caused by injected false data has proven effective in high-density sensor settings. With the increasing adoption of sensor deployments in low-density settings, there is a need to develop detection techniques for these applications. Given the constraints on the number of sensors and the different data types, we propose the use of temporal correlations across the heterogeneous data to determine the authenticity of the reported data. We also provide an adversarial model that utilizes a graphical method to devise complex attack strategies in which an attacker injects coherent false data into multiple sensors to provide a false representation of the physical state of the system, with the aim of subverting detection. This allows us to test the detection algorithm and assess its performance in improving the resilience of the sensor network against data integrity attacks.
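A minimal sketch of the temporal-correlation idea follows, under assumed parameters (lag, window length, drop threshold) and hypothetical stream names, rather than the paper's actual algorithm: learn how two heterogeneous streams co-vary over time, then flag windows in which that learned relationship breaks down.

    # Minimal sketch of the idea only. Sensor names, lag, window size and
    # threshold are all assumptions.
    import numpy as np

    def lagged_corr(x, y, lag):
        if lag > 0:
            x, y = x[:-lag], y[lag:]
        return np.corrcoef(x, y)[0, 1]

    def detect_breaks(temperature, power_draw, lag=3, window=60, drop=0.4):
        """Flag windows whose lagged correlation falls well below the historical baseline."""
        baseline = lagged_corr(temperature, power_draw, lag)
        flagged = []
        for start in range(0, len(temperature) - window, window):
            c = lagged_corr(temperature[start:start + window],
                            power_draw[start:start + window], lag)
            if c < baseline - drop:
                flagged.append(start)
        return flagged

Here, temperature and power_draw are hypothetical stream names; the detector relies only on the learned baseline breaking down when one stream is forged without the other.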
Conference Paper
Full-text available
Attestation and measurements inspection are different but complementary approaches towards the same goal: ascertaining the integrity of sensor nodes in wireless sensor networks. In this paper we compare the benefits and drawbacks of both techniques and seek to determine how to best combine them. However, our study shows that no single solution exists, as each choice introduces changes in the measurements collection process, affects the attestation protocol, and gives a different balance between the high detection rate of attestation and the low power overhead of measurements inspection. Therefore, we propose three strategies that combine measurements inspection and attestation in different ways, and a way to choose between them based on the requirements of different applications. We analyse their performance both analytically and in a simulator. The results show that the combined strategies can achieve a detection rate close to attestation, in the range 96–99%, whilst keeping a power overhead close to measurements inspection, in the range 1–10%.
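One way such a combination could be wired together is sketched below, purely hypothetically: cheap measurements inspection runs every round, and the expensive attestation challenge is issued only to nodes that inspection flags. The cost figures and the suspicion threshold are placeholders, not the strategies or numbers evaluated in the paper.

    # Hypothetical combination sketch; cost figures and threshold are placeholders.
    INSPECTION_COST_MJ = 0.2     # assumed per-round energy cost (millijoules)
    ATTESTATION_COST_MJ = 20.0   # assumed per-challenge energy cost

    def nodes_to_attest(inspection_scores, threshold=0.8):
        """inspection_scores: dict node_id -> suspicion score in [0, 1]."""
        return [node for node, score in inspection_scores.items() if score >= threshold]

    def round_cost(n_nodes, flagged_nodes):
        return n_nodes * INSPECTION_COST_MJ + len(flagged_nodes) * ATTESTATION_COST_MJ

    scores = {"n1": 0.1, "n2": 0.95, "n3": 0.4}
    flagged = nodes_to_attest(scores)
    print(flagged, round_cost(len(scores), flagged))   # only n2 pays the attestation cost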
Article
Full-text available
Wireless Sensor Networks carry a high risk of being compromised, since their deployments are often unattended and physically accessible, and the wireless medium is difficult to secure. Malicious data injections take place when the sensed measurements are maliciously altered to trigger wrong and potentially dangerous responses. When many sensors are compromised, they can collude with each other to alter the measurements, making such changes difficult to detect. Distinguishing between genuine and malicious measurements is even more difficult when significant variations may be introduced by events, especially if multiple events occur simultaneously. We propose a novel methodology based on the wavelet transform to detect malicious data injections, to characterise the responsible sensors, and to distinguish malicious interference from faulty behaviours. The results, with both simulated and real measurements, show that our approach is able to counteract sophisticated attacks, achieving a significant improvement over state-of-the-art approaches.
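The paper's detector is not reproduced here, but a minimal PyWavelets-based sketch illustrates how wavelet detail coefficients expose abrupt, localised changes that smooth event dynamics do not explain; the wavelet family, decomposition level and scoring rule are assumptions.

    # Illustrative sketch only, not the paper's detector.
    import numpy as np
    import pywt

    def abrupt_change_score(signal, wavelet="db4", level=3):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        details = coeffs[-1]                        # finest-scale detail coefficients
        return np.abs(details) / (np.median(np.abs(details)) + 1e-9)

    rng = np.random.default_rng(1)
    measurements = np.sin(np.linspace(0, 6, 256)) + rng.normal(scale=0.05, size=256)
    measurements[180] += 3.0                        # injected spike
    print(np.argmax(abrupt_change_score(measurements)))   # largest score, roughly at half the spike's index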
Article
Full-text available
Attestation is a mechanism used by a trusted entity to validate the software integrity of an untrusted platform. Over the past few years, several attestation techniques have been proposed. While they all use variants of a challenge-response protocol, they make different assumptions about what an attacker can and cannot do. Thus, they propose intrinsically divergent validation approaches. We survey in this article the different approaches to attestation, focusing in particular on those aimed at Wireless Sensor Networks. We discuss the motivations, challenges, assumptions, and attacks of each approach. We then organise them into a taxonomy and discuss the state of the art, carefully analysing the advantages and disadvantages of each proposal. We also point towards the open research problems and give directions on how to address them.
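The surveyed schemes differ widely, but most build on a challenge-response exchange; the toy sketch below illustrates only that common pattern, with an assumed pre-shared key and the firmware image passed in as bytes, and does not correspond to any specific protocol in the survey.

    # Toy illustration of the generic challenge-response pattern, not a real scheme.
    import hmac, hashlib, os

    SHARED_KEY = b"pre-shared attestation key"       # assumed key provisioning

    def prover_response(firmware: bytes, nonce: bytes) -> bytes:
        return hmac.new(SHARED_KEY, nonce + firmware, hashlib.sha256).digest()

    def verifier_check(reported: bytes, expected_firmware: bytes, nonce: bytes) -> bool:
        expected = hmac.new(SHARED_KEY, nonce + expected_firmware, hashlib.sha256).digest()
        return hmac.compare_digest(reported, expected)

    nonce = os.urandom(16)                           # fresh challenge prevents replay
    good = prover_response(b"benign firmware image", nonce)
    print(verifier_check(good, b"benign firmware image", nonce))   # True
    print(verifier_check(good, b"expected image v2", nonce))       # False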
Book
This third edition of "Low-Rate Wireless Personal Area Networks: Enabling Wireless Sensors with IEEE 802.15.4" is the newest handbook in the IEEE Standards Wireless Networks Series. This updated book now includes detailed information from the revised IEEE Std 802.15.4-2006, which includes the amendment IEEE 802.15.4b. IEEE Std 802.15.4 was developed to address low-cost and low-power design to enable applications in the fields of industrial, agricultural, vehicular, residential, and medical sensors and actuators. This book offers the reader an insider's view of the standard. Features include an overview of the standard, the motivation and vision behind it, background on the technology, technical features and components, application scenarios, and material not covered in the standard related to the network layer functionality for applications. The book also focuses on implementation and system design considerations, including an analysis of system-level, real-world issues that will be important for prospective implementers to consider. Presented in a concise and easy-to-read format by experts intimately involved in the development and writing of the standard, this guide is an invaluable resource to the standard for those interested in the field of "simple" wireless connectivity. Low-Rate Wireless Personal Area Networks is a must-read for anyone who wants to fully understand the inner workings and possibilities of the IEEE 802.15.4 standard. © 2010 by The Institute of Electrical and Electronics Engineers, Inc. All rights reserved.
Article
Our ability to synthesize sensory data that preserves specific statistical properties of the real data has had tremendous implications for data privacy and big data analytics. The synthetic data can be used as a substitute for selected real data segments that are sensitive to the user, thus protecting privacy and resulting in improved analytics. However, increasingly adversarial roles taken by data recipients, such as mobile apps or other cloud-based analytics services, mandate that the synthetic data, in addition to preserving statistical properties, should also be difficult to distinguish from the real data. Typically, visual inspection has been used as a test to distinguish between datasets. But more recently, sophisticated classifier models (discriminators), corresponding to a set of events, have also been employed to distinguish between synthesized and real data. The model operates on both datasets and the respective event outputs are compared for consistency. In this paper, we take a step towards generating sensory data that can pass a deep-learning-based discriminator model test, and make two specific contributions. First, we present a deep-learning-based architecture for synthesizing sensory data; this architecture comprises a generator model, which is a stack of multiple Long Short-Term Memory (LSTM) networks and a Mixture Density Network. Second, we use another LSTM-network-based discriminator model for distinguishing between the true and the synthesized data. Using a dataset of accelerometer traces, collected using the smartphones of users doing their daily activities, we show that the deep-learning-based discriminator model can only distinguish between the real and synthesized traces with an accuracy in the neighborhood of 50%.
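As an illustration of the discriminator side only, a minimal PyTorch LSTM classifier over sensory traces might look as follows; the layer sizes, the single-layer choice and the feature count are assumptions, not the paper's model.

    # Minimal sketch of an LSTM-based trace discriminator; sizes are assumptions.
    import torch
    import torch.nn as nn

    class TraceDiscriminator(nn.Module):
        def __init__(self, n_features=3, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                       # x: (batch, time, n_features)
            _, (h_n, _) = self.lstm(x)              # final hidden state summarises the trace
            return torch.sigmoid(self.head(h_n[-1]))   # probability the trace is real

    disc = TraceDiscriminator()
    batch = torch.randn(8, 100, 3)                  # 8 accelerometer-like traces, 100 steps
    print(disc(batch).shape)                        # torch.Size([8, 1])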
Conference Paper
Remote attestation is a crucial security service particularly relevant to increasingly popular IoT (and other embedded) devices. It allows a trusted party (verifier) to learn the state of a remote, and potentially malware-infected, device (prover). Most existing approaches are static in nature and only check whether benign software is initially loaded on the prover. However, they are vulnerable to runtime attacks that hijack the application's control or data flow, e.g., via return-oriented programming or data-oriented exploits. As a concrete step towards more comprehensive runtime remote attestation, we present the design and implementation of Control-FLow ATtestation (C-FLAT) that enables remote attestation of an application's control-flow path, without requiring the source code. We describe a full prototype implementation of C-FLAT on Raspberry Pi using its ARM TrustZone hardware security extensions. We evaluate C-FLAT's performance using a real-world embedded (cyber-physical) application, and demonstrate its efficacy against control-flow hijacking attacks.
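A toy sketch of the cumulative path-hashing idea behind control-flow attestation follows; the real C-FLAT system instruments the binary and computes the hash inside ARM TrustZone, which this sketch does not attempt to model, and the addresses are made up.

    # Toy illustration of cumulative hashing over the executed control-flow path.
    import hashlib

    def path_digest(branch_targets, nonce: bytes) -> str:
        """Fold each taken branch target into a running hash, bound to a fresh nonce."""
        h = nonce
        for target in branch_targets:
            h = hashlib.sha256(h + target.to_bytes(4, "little")).digest()
        return h.hex()

    benign_path = [0x1000, 0x1024, 0x1080, 0x10c4]
    hijacked_path = [0x1000, 0x1024, 0x2f00, 0x10c4]   # gadget address injected mid-path
    nonce = b"\x01" * 16
    print(path_digest(benign_path, nonce) == path_digest(hijacked_path, nonce))   # False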