Article

Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

Philips Research Laboratories, Eindhoven, The Netherlands
IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans (Impact Factor: 2.18). 06/2010; DOI: 10.1109/TSMCA.2010.2041657

ABSTRACT: In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and operate on a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), i.e., the number of bit errors, between the binary vectors obtained in the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). One way to improve the FRR is to use multiple biometric samples in either the enrollment or the verification phase: averaging suppresses the noise, reducing the number of bit errors and hence the HD. In practice, the number of samples is chosen empirically, without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated on the Face Recognition Grand Challenge v2 (FRGC v2) and Fingerprint Verification Competition 2000 (FVC2000) biometric databases.
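As a toy illustration of the thresholding mechanism described above (not the paper's full Gaussian framework), the sketch below computes FRR and FAR analytically under an i.i.d. binomial bit-error model; the bit-error probabilities and code parameters are hypothetical.

```python
from math import comb

def error_rates(n_bits: int, t: int, p_gen: float, p_imp: float = 0.5):
    """FRR/FAR for Hamming-distance thresholding under an i.i.d.
    bit-error model (a simplified binomial stand-in for the paper's
    Gaussian framework; all parameters here are hypothetical).
    A genuine comparison flips each bit with probability p_gen, an
    impostor comparison with p_imp (~0.5 for independent templates);
    the subject is accepted iff HD <= t."""
    def binom_range(p: float, lo: int, hi: int) -> float:
        # P(lo <= HD <= hi) with HD ~ Binomial(n_bits, p)
        return sum(comb(n_bits, k) * p**k * (1 - p)**(n_bits - k)
                   for k in range(lo, hi + 1))

    frr = binom_range(p_gen, t + 1, n_bits)  # genuine rejected: HD > t
    far = binom_range(p_imp, 0, t)           # impostor accepted: HD <= t
    return frr, far

# Sweeping the ECC correction capacity t traces out a DET-style curve;
# averaging multiple enrollment samples would show up as a lower p_gen.
for t in (10, 20, 30):
    frr, far = error_rates(n_bits=255, t=t, p_gen=0.15)
    print(f"t={t}: FRR={frr:.3e}  FAR={far:.3e}")
```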

Related publications:
  • Article: Statistical attack against fuzzy commitment schemes
    ABSTRACT: In this study, a statistical attack against fuzzy commitment schemes is presented. Comparisons of different pairs of binary biometric feature vectors yield binomial distributions whose standard deviations are bounded by the entropy of the biometric templates. When error correction is applied to a series of chunks, as in the vast majority of approaches, the helper data become vulnerable to statistical attacks: error-correction codewords are bound to separate parts of a binary template, among which the biometric entropy is dispersed. As a consequence, individual chunks of the helper data are prone to statistically significant false acceptance (a numerical sketch of this chunk-wise effect appears after this list). In experimental evaluations, the proposed attack is applied to different iris-biometric fuzzy commitment schemes, retrieving cryptographic keys at alarmingly low effort.
    IET Biometrics 01/2012; 1(2):94-104. DOI: 10.1049/iet-bmt.2011.0001
  • Conference Paper: A Real Helper Data Scheme
    ABSTRACT: The helper data scheme uses a secret key to protect biometric templates. The current helper data scheme requires binary feature representations, which introduce quantization error and thus reduce the capacity of the biometric channel. For spectral-minutiae-based fingerprint recognition systems, a Shannon-theoretic bound shows that the current helper data scheme cannot extract more than 6 secret bits. A 6-bit secret key is far too short to secure the storage of biometric templates. We therefore propose a new helper data scheme without quantization; a basic realization converts the real-valued feature vector into a phase vector (a rough sketch of this idea appears after this list). Applying the spectral minutiae method to the FVC2000-DB2 fingerprint database, the new helper data scheme combined with repetition codes and BCH codes yields at least 76 secret bits.
    2013 2nd IAPR Asian Conference on Pattern Recognition (ACPR); 11/2013
  • ABSTRACT: Biometric discretization derives a binary string for each user from an ordered set of biometric features. This representative string ought to be discriminative, informative, and privacy-protective when, after error correction, it is employed as a cryptographic key in security applications. However, it is commonly believed that satisfying the first two criteria simultaneously is infeasible and that a tradeoff between them is unavoidable. In this article, we propose an effective fixed-bit-allocation discretization approach that combines discriminative feature extraction, discriminative feature selection, unsupervised quantization (quantization that does not use class information), and linearly separable subcode (LSSC)-based encoding (an LSSC sketch appears after this list) to fulfill all the ideal properties of a binary representation extracted for cryptographic applications. In addition, we examine a number of discriminative feature-selection measures for discretization and identify the proper way of setting an important feature-selection parameter. Encouraging experimental results vindicate the feasibility of our approach.
    EURASIP Journal on Advances in Signal Processing 01/2011; 2011(1). DOI: 10.1186/1687-6180-2011-107 · 0.81 Impact Factor
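
On the first listed item (the statistical attack on chunked fuzzy commitment schemes): the back-of-the-envelope sketch below contrasts attack effort against a chunked code versus a monolithic one, under the simplifying assumption that each chunk can be tested independently; this is not the paper's exact attack, and all sizes are hypothetical.

```python
from math import comb

def far_chunk(bits: int, t: int) -> float:
    """Probability that a uniformly random guess falls within the ECC
    correction radius t of a 'bits'-bit codeword-bound chunk."""
    return sum(comb(bits, k) for k in range(t + 1)) / 2**bits

# Hypothetical layout: a 128-bit template split into 8 chunks of 16 bits,
# each bound to a codeword correcting up to 3 bit errors.
n_chunks, chunk_bits, t = 8, 16, 3

p = far_chunk(chunk_bits, t)
effort_chunked = n_chunks / p        # attack each chunk separately
effort_monolithic = 1 / far_chunk(n_chunks * chunk_bits, n_chunks * t)

print(f"per-chunk false-acceptance probability: {p:.3e}")
print(f"expected trials, chunk-wise attack:  ~{effort_chunked:.1e}")
print(f"expected trials, monolithic code:    ~{effort_monolithic:.1e}")
```

The gap between the two effort figures is the point of the attack: chunking turns one astronomically unlikely event into several quite likely ones.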
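
On "A Real Helper Data Scheme": a rough sketch of the phase-domain binding idea, assuming the scheme hides key-derived phases in a stored phase difference; the actual spectral-minutiae feature mapping and the repetition/BCH decoding are not reproduced here, and the wrapping and decoding choices below are illustrative guesses.

```python
import numpy as np

rng = np.random.default_rng(0)

def wrap(phase):
    """Wrap angles to (-pi, pi]."""
    return np.angle(np.exp(1j * phase))

# Enrollment: bind a secret bit string to the features in the phase
# domain; only the phase difference ('helper data') is stored.
secret_bits = rng.integers(0, 2, size=32)   # hypothetical key bits
key_phase = np.pi * secret_bits             # bit 0 -> 0, bit 1 -> pi
x_enroll = rng.normal(size=32)              # real-valued features
helper = wrap(x_enroll - key_phase)         # stored; reveals no key alone

# Verification: a noisy re-measurement of the same features.
x_verify = x_enroll + rng.normal(scale=0.2, size=32)
recovered = wrap(x_verify - helper)         # ~ key_phase + noise
bits_hat = (np.abs(recovered) > np.pi / 2).astype(int)

# Residual bit errors would be cleaned up by the repetition/BCH codes
# mentioned in the abstract (omitted here).
print("bit errors before ECC:", int(np.sum(bits_hat != secret_bits)))
```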
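
On the discretization item: the linearly separable subcode (LSSC) can be sketched as a thermometer-style encoding whose Hamming distances equal the distances between quantization indices, which is the property the abstract relies on; this assumes the standard LSSC construction matches the authors' definition.

```python
def lssc_encode(index: int, n_intervals: int) -> list[int]:
    """Linearly separable subcode: quantization index i among L
    intervals -> (L-1)-bit codeword of i ones followed by zeros,
    so the Hamming distance between codewords equals |i - j|."""
    assert 0 <= index < n_intervals
    return [1] * index + [0] * (n_intervals - 1 - index)

def hamming(a: list[int], b: list[int]) -> int:
    return sum(x != y for x, y in zip(a, b))

# Code-space distances mirror index distances, a property that direct
# binary or Gray encodings do not guarantee.
L = 4
for i in range(L):
    cw = lssc_encode(i, L)
    print(i, cw, "HD to index 2:", hamming(cw, lssc_encode(2, L)))
```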
