Conference Paper

Getting More from PCA: First Results of Using Principal Component Analysis for Extensive Power Analysis

DOI: 10.1007/978-3-642-27954-6_24 Conference: Topics in Cryptology - CT-RSA 2012 - The Cryptographers' Track at the RSA Conference 2012, San Francisco, CA, USA, February 27 - March 2, 2012. Proceedings
Source: DBLP


Differential Power Analysis (DPA) is commonly used to extract information about the secret key used in cryptographic devices. Countermeasures against DPA can cause power traces to be misaligned, which reduces its effectiveness. Principal Component Analysis (PCA) is a powerful tool, used across many research areas, for identifying trends in a data set: principal components describe the relationships within the data, and the largest principal components capture the directions of largest variance. These components can be used to reduce the noise in a data set, or to re-express the data set in terms of the components themselves. We propose using PCA to improve the correlation for the correct key guess in DPA attacks on software DES traces, and show that the technique also applies to other algorithms. We further introduce a new way of determining key candidates: computing the absolute average value of the correlation traces after a DPA attack on a PCA-transformed trace set. We conclude that PCA can successfully serve as a preprocessing technique that reduces the noise in a trace set and improves the correlation for the correct key guess in DPA attacks.
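As a rough illustration of the approach described in the abstract, the sketch below projects a set of power traces onto their leading principal components and then ranks key guesses by the absolute average of the resulting correlation traces. This is our own minimal NumPy reconstruction, not code from the paper; the function names and the shape conventions are assumptions.

```python
import numpy as np

def pca_transform(traces, n_components):
    """Project power traces onto their leading principal components.

    traces: (n_traces, n_samples) matrix of power measurements.
    Returns the traces expressed in the PCA basis (component scores).
    """
    centered = traces - traces.mean(axis=0)
    # SVD of the centered trace matrix yields the principal directions
    # (rows of vt), ordered by decreasing singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

def cpa_rank(transformed, hyp_power):
    """Rank key guesses by the absolute average of their correlation traces.

    hyp_power: (n_keys, n_traces) hypothetical power values per key guess.
    Returns key-guess indices sorted from most to least likely.
    """
    scores = []
    for hyp in hyp_power:
        # Correlate the hypothesis with every PCA component score.
        corr = np.array([np.corrcoef(hyp, transformed[:, j])[0, 1]
                         for j in range(transformed.shape[1])])
        # The paper's proposed statistic: absolute average of the
        # correlation trace.
        scores.append(np.mean(np.abs(corr)))
    return np.argsort(scores)[::-1]
```

In a real attack, `hyp_power` would come from a leakage model (e.g. the Hamming weight of an S-box output under each key guess) rather than simulated data.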



Available from: Lejla Batina, Feb 12, 2014
  • Source
    ABSTRACT: Spectral methods, ranging from traditional Principal Components Analysis to modern Laplacian matrix factorization, have proven to be a valuable tool for a wide range of diverse data mining applications. Commonly, these methods are stated as optimization problems and employ the extremal (maximal or minimal) eigenvectors of a certain input matrix for deriving the appropriate statistical inferences. Interestingly, recent studies have questioned this "modus operandi" and revealed that useful information may also be present within low-order eigenvectors whose mass is concentrated (localized) in a small part of their indices. An application context where localized low-order eigenvectors have been successfully employed is "Differential Power Analysis" (DPA). DPA is a well-studied side-channel attack on cryptographic hardware devices (such as smart cards) that employs statistical analysis of the device's power consumption in order to retrieve the secret key of the cryptographic algorithm. In this work we propose a data mining (clustering) formulation of the DPA process and also provide a theoretical model that justifies and explains the utility of low-order eigenvectors. In our data mining formulation, we consider that the key-relevant information is modelled as a "low-signal" pattern that is embedded in a "high-noise" dataset. In this respect, our results generalize beyond DPA and are applicable to analogous low-signal, hidden-pattern problems. The experimental results using power trace measurements from a programmable smart card verify our approach empirically.
    Full-text · Conference Paper · Sep 2012
  • Source
    ABSTRACT: Pre-processing techniques are widely used to increase the success rate of side-channel analysis when attacking (protected) implementations of cryptographic algorithms. However, as of today, the corresponding steps are usually chosen heuristically. In this paper, we present an analytical expression for the correlation coefficient after applying a linear transform to the side-channel traces. Doing so, we are able to precisely quantify the influence of a linear filter on the result of a correlation power analysis. On this basis, we demonstrate the use of optimisation algorithms to efficiently and methodically derive "optimal" filter coefficients, in the sense that they maximise a given definition of the distinguishability of the correct key candidate. We verify the effectiveness of our methods by analysing both simulated and real-world traces for a hardware implementation of the AES.
    Full-text · Conference Paper · Nov 2012
  • Source
    ABSTRACT: Since the introduction of side channel attacks in the nineties, a large amount of work has been devoted to improving their effectiveness and efficiency. On the one hand, general results and conclusions are drawn in theoretical frameworks, but these frameworks are often too idealized to capture the full complexity of an attack performed in real conditions. On the other hand, practical improvements are proposed for specific contexts, but the big picture is often put aside, which makes them difficult to adapt to different contexts. This paper tries to bridge the gap between both worlds. We specifically investigate which kinds of issues a security evaluator faces when performing a state-of-the-art attack. This analysis leads us to focus on the very common situation where the exact time of the sensitive processing is drowned in a large number of leakage points. In this context, we propose new ideas to improve the effectiveness and/or efficiency of the three considered attacks. In the particular case of stochastic attacks, we show that the existing literature, essentially developed under the assumption that the exact sensitive time is known, cannot be directly applied when this assumption is relaxed. To deal with this issue, we propose an improvement which makes the stochastic attack a real alternative to the classical correlation power analysis. Our study is illustrated by various attack experiments performed on several copies of three micro-controllers with different CMOS technologies (respectively 350, 130 and 90 nanometers).
    Full-text · Article · Jan 2013
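The first related abstract above observes that key-relevant information can live in low-order eigenvectors whose mass is localized in a few indices. One common way to quantify such localization (our choice of metric, not a construction taken from that paper) is the inverse participation ratio:

```python
import numpy as np

def inverse_participation_ratio(v):
    """IPR of a vector: near 1/n when its mass is spread over all n
    indices, near 1 when it is concentrated on a few indices."""
    p = v**2 / np.sum(v**2)
    return np.sum(p**2)

def localized_eigenvectors(traces, threshold):
    """Return positions (in descending-eigenvalue order) of eigenvectors
    of the trace covariance whose mass is localized (IPR > threshold)."""
    cov = np.cov(traces, rowvar=False)
    _, vecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
    vecs = vecs[:, ::-1]            # reorder columns to descending
    return [i for i in range(vecs.shape[1])
            if inverse_participation_ratio(vecs[:, i]) > threshold]
```

A trace set whose leakage is confined to a few time samples would yield eigenvectors with a high IPR at those positions; the threshold is a tuning parameter.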
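The second related abstract derives an analytical expression for the correlation coefficient after a linear transform of the traces. A minimal numerical sketch of the quantity being optimized, assuming a simple FIR filter applied identically to every trace (the function name and setup are ours, not that paper's):

```python
import numpy as np

def filtered_correlation(traces, hyp, h):
    """Per-sample Pearson correlation between a power hypothesis and
    linearly filtered traces.

    traces: (n_traces, n_samples); hyp: (n_traces,); h: FIR coefficients.
    """
    # Apply the same linear filter to every trace.
    filtered = np.apply_along_axis(
        lambda t: np.convolve(t, h, mode="same"), 1, traces)
    filtered = filtered - filtered.mean(axis=0)
    hyp_c = hyp - hyp.mean()
    # Pearson correlation per time sample: <x, y> / (||x|| * ||y||).
    num = filtered.T @ hyp_c
    den = np.linalg.norm(filtered, axis=0) * np.linalg.norm(hyp_c)
    return num / den
```

Handing `max(abs(filtered_correlation(...)))` as an objective to an optimizer over `h` would mirror, in spirit, that paper's search for filter coefficients that maximise the distinguishability of the correct key candidate.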