Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

Philips Res. Labs., Eindhoven, Netherlands
IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans (Impact Factor: 2.18). 06/2010; DOI: 10.1109/TSMCA.2010.2041657
Source: DBLP

ABSTRACT In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and require a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). One way to improve the FRR is to use multiple biometric samples in either the enrollment or verification phase: the noise is suppressed, reducing the number of bit errors and hence the HD. In practice, the number of samples is chosen empirically without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
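The FRR/FAR tradeoff described above can be illustrated with a simplified binomial model: if each of the N bits flips independently with probability p_err (an i.i.d. simplification of the paper's Gaussian framework, with illustrative parameter values, not figures from the paper), both error rates follow directly from binomial tails:

```python
from math import comb

def p_hd_at_most(n_bits, t, p_err):
    """P(Hamming distance <= t) when each bit flips independently
    with probability p_err (binomial tail probability)."""
    return sum(comb(n_bits, k) * p_err**k * (1 - p_err)**(n_bits - k)
               for k in range(t + 1))

def frr_far(n_bits, t, p_genuine, p_impostor=0.5):
    """FRR = P(HD > t) for genuine comparisons; FAR = P(HD <= t) for
    impostor comparisons, whose bits agree only by chance (p_err ~ 0.5)."""
    frr = 1.0 - p_hd_at_most(n_bits, t, p_genuine)
    far = p_hd_at_most(n_bits, t, p_impostor)
    return frr, far

# Sweeping t over the ECC's correctable range traces out a DET curve.
frr, far = frr_far(n_bits=64, t=10, p_genuine=0.05)
```

Averaging multiple enrollment or verification samples lowers the effective p_genuine, which is exactly the mechanism the paper quantifies analytically.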

  • ABSTRACT: Biometric discretization derives a binary string for each user based on an ordered set of real-valued biometric features. The false acceptance rate (FAR) and the false rejection rate (FRR) of a binary biometric-based system rely significantly on a Hamming distance threshold, which decides whether the errors in the query bit string will be rectified with reference to the template bit string. Kelkboom et al. have recently modeled a basic framework to estimate the FAR and the FRR of one-bit biometric discretization. However, as the demand for bit strings with higher entropy (informative length) rises, single-bit discretization is becoming less useful because it cannot produce a bit string longer than the total number of feature dimensions extracted, restricting the applicability of Kelkboom's model. In this paper, we extend the analytical framework to multibit discretization for estimating the performance and the decision threshold needed to achieve a specified FAR/FRR, based on equal-probable quantization and linearly separable subcode (LSSC) encoding. Promising estimation results on a synthetic data set with independent feature components and Gaussian measurements vindicate the analytical expressions of our framework. However, for experiments on two popular face data sets, deviations in the estimation results were obtained, mainly due to the mismatch of the independence assumption of our framework. We hence fit the analytical probability mass functions (pmfs) to the experimental pmfs by estimating the mean and variance parameters from the difference between the corresponding analytical and experimental curves, alleviating such estimation inaccuracies on these data sets.
    IEEE Transactions on Information Forensics and Security 01/2012; 7(4):1242-1254. · 1.90 Impact Factor
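The two components named in the abstract above can be sketched minimally, assuming the common unary-style LSSC construction in which the Hamming distance between any two codewords equals the difference of their interval indices (function names and the background-sample setup are illustrative, not the authors' implementation):

```python
import numpy as np

def equal_probable_cuts(samples, n_intervals):
    """Interval boundaries placed at quantiles so that each of the
    n_intervals captures approximately equal probability mass."""
    qs = np.linspace(0.0, 1.0, n_intervals + 1)[1:-1]
    return np.quantile(samples, qs)

def lssc_encode(index, n_intervals):
    """Unary-style LSSC codeword of length n_intervals - 1: the Hamming
    distance between codewords equals the interval-index difference."""
    code = np.zeros(n_intervals - 1, dtype=np.uint8)
    code[:index] = 1
    return code

rng = np.random.default_rng(1)
background = rng.normal(size=10_000)        # population feature samples
cuts = equal_probable_cuts(background, 4)   # 4 intervals per feature
idx = int(np.searchsorted(cuts, 0.1))       # quantize one feature value
bits = lssc_encode(idx, 4)
```

The linear-separability property is what makes the Hamming distance between multibit strings behave like an interval-index distance, which the extended framework exploits.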
  • ABSTRACT: Biometric discretization is a key component in biometric cryptographic key generation. It converts an extracted biometric feature vector into a binary string via typical steps such as segmentation of each feature element into a number of labeled intervals, mapping of each interval-captured feature element onto a binary space, and concatenation of the resulting binary outputs of all feature elements into a binary string. Currently, the detection rate optimized bit allocation (DROBA) scheme is one of the most effective biometric discretization schemes in terms of its capability to assign binary bits dynamically to user-specific features with respect to their discriminability. However, we find that DROBA suffers from potential discriminative-feature misdetection and underdiscretization in its bit allocation process. This paper highlights these drawbacks and improves upon DROBA with a novel two-stage algorithm: 1) a dynamic search method to efficiently recapture misdetected features and to optimize the bit allocation of underdiscretized features and 2) a genuine interval concealment technique to alleviate the crucial information leakage resulting from the dynamic search. Improvements in classification accuracy on two popular face data sets vindicate the feasibility of our approach compared with DROBA.
    IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 10/2012; · 3.01 Impact Factor
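The discriminability-driven bit allocation that DROBA performs can be caricatured with a greedy sketch; DROBA itself uses dynamic programming over per-feature detection rates, and the rate table here is purely illustrative:

```python
import math

def allocate_bits(det_rate, total_bits):
    """Greedy sketch of discriminability-driven bit allocation.
    det_rate[i][b] is the estimated detection rate of feature i when
    assigned b bits (b = 0..max depth); rates fall as depth grows.
    Each step gives one more bit to the feature whose overall
    (product of) detection rates degrades the least."""
    alloc = [0] * len(det_rate)
    for _ in range(total_bits):
        best, best_cost = None, math.inf
        for i, rates in enumerate(det_rate):
            b = alloc[i]
            if b + 1 < len(rates) and rates[b + 1] > 0:
                # log-ratio cost of deepening feature i by one bit
                cost = math.log(rates[b]) - math.log(rates[b + 1])
                if cost < best_cost:
                    best, best_cost = i, cost
        if best is None:          # every feature at maximum depth
            break
        alloc[best] += 1
    return alloc

# Feature 0 is discriminative (rates decay slowly), feature 1 is not.
alloc = allocate_bits([[1.0, 0.9, 0.8], [1.0, 0.5, 0.2]], total_bits=2)
```

Under this table, both bits go to the discriminative feature, which mirrors the user-specific allocation behavior the abstract describes (and whose failure modes the cited paper corrects).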
  • ABSTRACT: This paper focuses on the implementation and evaluation of a biometric cryptosystem based on a number of fingerprint texture descriptors. The texture descriptors, namely, the Gabor filter-based FingerCode, the local binary pattern (LBP), and the local direction pattern (LDP), and their various combinations are considered. These fingerprint texture descriptors are binarized using a biometric discretization method and used in a fuzzy commitment scheme (FCS). We constructed biometric cryptosystems that achieve good performance by fusing discretized fingerprint texture descriptors and using effective error-correcting codes. We tested the proposed system on the FVC2000 DB2a fingerprint database, and the results demonstrate that the new system significantly improves the performance of the FCS for texture-based fingerprints.
    Expert Systems with Applications 04/2013; 40(5):1888–1901. · 1.85 Impact Factor
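The fuzzy commitment scheme referenced above can be sketched with a toy repetition code standing in for the effective ECCs the authors use; key size, code rate, and variable names are illustrative only:

```python
import hashlib

R = 3  # repetition factor: corrects 1 bit error per 3-bit block

def ecc_encode(key_bits):
    """Toy ECC: repeat each key bit R times."""
    return [b for b in key_bits for _ in range(R)]

def ecc_decode(noisy_bits):
    """Majority vote within each R-bit block."""
    return [1 if sum(noisy_bits[i:i + R]) > R // 2 else 0
            for i in range(0, len(noisy_bits), R)]

def commit(biometric_bits, key_bits):
    """Bind a secret key to the enrollment template: publish the XOR
    offset (helper data) plus a hash of the key for verification."""
    codeword = ecc_encode(key_bits)
    helper = [w ^ c for w, c in zip(biometric_bits, codeword)]
    tag = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, tag

def decommit(query_bits, helper, tag):
    """Recover the key only if the query is close enough (in Hamming
    distance) to the enrolled template for the ECC to correct."""
    noisy_codeword = [q ^ h for q, h in zip(query_bits, helper)]
    key = ecc_decode(noisy_codeword)
    return key if hashlib.sha256(bytes(key)).hexdigest() == tag else None

template = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]  # 12-bit binarized descriptor
key = [1, 0, 1, 1]                               # 4-bit secret key
helper, tag = commit(template, key)

query = template.copy()
query[0] ^= 1  # one bit error in the first block: still correctable
```

Neither the helper data nor the hash reveals the template directly, which is what makes the construction a template-protection scheme; real deployments use stronger codes (e.g. BCH) than this repetition toy.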
