Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

Philips Research Laboratories, Eindhoven, The Netherlands
IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 06/2010; 40(3). DOI: 10.1109/TSMCA.2010.2041657
Source: DBLP


In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and rely on a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), i.e., the number of bit errors between the binary vectors obtained in the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). One way to improve the FRR is to use multiple biometric samples in either the enrollment or verification phase: averaging suppresses the noise, which reduces the number of bit errors and hence the HD. In practice, the number of samples is chosen empirically, without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
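For intuition about the quantities the framework estimates, the sketch below (a simplification, not the paper's actual derivation, which also models the Gaussian feature statistics and the effect of sample averaging) treats each of the n extracted bits as flipping independently, with an assumed probability p_genuine for a genuine comparison and p_impostor for an impostor comparison. The HD is then binomially distributed, and FRR and FAR follow from binomial tails at the ECC-limited threshold t; all numeric values are illustrative assumptions.

```python
from math import comb

def binom_cdf(n, p, k):
    """P[X <= k] for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(0, min(k, n) + 1))

def analytic_rates(n_bits, t, p_genuine, p_impostor):
    """FRR and FAR when the Hamming distance over n_bits independent bits is
    compared to a threshold t (accept iff HD <= t, e.g. t = ECC correcting capacity)."""
    frr = 1.0 - binom_cdf(n_bits, p_genuine, t)   # genuine HD exceeds t -> false reject
    far = binom_cdf(n_bits, p_impostor, t)        # impostor HD within t -> false accept
    return frr, far

# Illustrative numbers only: a 127-bit template protected by a code correcting
# up to t errors, with assumed per-bit error probabilities for the two cases.
for t in (10, 15, 20):
    frr, far = analytic_rates(n_bits=127, t=t, p_genuine=0.08, p_impostor=0.45)
    print(f"t={t:2d}  FRR={frr:.3e}  FAR={far:.3e}")
```

Sweeping t traces out a DET curve; the error-correcting capacity of the code caps t and therefore restricts which operating points on that curve are reachable.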

  • "In keyless systems, the procedure is similar, with K, J, and L removed from the above description. To keep the development simple, we considered only a single enrollment A and a single probe D above; in practice, using multiple biometric measurements during the enrollment or decision phase can improve the authentication accuracy [19]."
    ABSTRACT: Biometrics are an important and widely used class of methods for identity verification and access control. They are attractive because they are inherent properties of an individual: they need not be remembered like passwords, and they are not easily lost or forged like identifying documents. At the same time, biometrics are fundamentally noisy and irreplaceable. There are always slight variations among the measurements of a given biometric, and, unlike passwords or identification numbers, biometrics are derived from physical characteristics that cannot easily be changed. The proliferation of biometric usage raises critical privacy and security concerns that, due to the noisy nature of biometrics, cannot be addressed using standard cryptographic methods. In this article we present an overview of "secure biometrics", also referred to as "biometric template protection", an emerging class of methods that address these concerns.
    IEEE Signal Processing Magazine 05/2013; 30(5). DOI: 10.1109/MSP.2013.2261691
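The bracketed remark in the quote above, that multiple biometric measurements during enrollment can improve accuracy, refers to the averaging effect analyzed in the cited paper: averaging N independent noisy samples reduces the within-class noise variance by a factor of N, which lowers the per-bit error probability after binarization. The Monte-Carlo toy below only illustrates this trend; the Gaussian feature model, the sign-based binarization (thresholding at the population mean), and all parameter values are assumptions for illustration, not the feature extraction used in any of the cited systems.

```python
import random

def bit_error_rate(n_features, sigma_within, n_enroll, n_trials=2000, seed=1):
    """Monte-Carlo estimate of the per-bit error probability when the enrollment
    template is the average of n_enroll noisy samples and each feature is
    binarized by its sign (threshold at the population mean, here 0)."""
    rng = random.Random(seed)
    errors = total = 0
    for _ in range(n_trials):
        for _ in range(n_features):
            mu = rng.gauss(0.0, 1.0)                      # subject's true feature value
            enroll = sum(rng.gauss(mu, sigma_within) for _ in range(n_enroll)) / n_enroll
            probe = rng.gauss(mu, sigma_within)           # single verification sample
            errors += (enroll > 0.0) != (probe > 0.0)     # bit disagreement
            total += 1
    return errors / total

for n_e in (1, 2, 4, 8):
    print(f"enrollment samples = {n_e}:  bit error rate ~ {bit_error_rate(64, 0.8, n_e):.3f}")
```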
  • "To model these variabilities, we use the Parallel Gaussian Channels (PGC) as portrayed in Fig. 4(a). This approach has been successfully used for estimating the performance of two biometric databases in Kelkboom et al. (2010) [31], in which the validity of the PGC approach is shown. We assume an ideal Acquisition and Feature-Extraction module which always produces the same feature vector µi for subject i."
    ABSTRACT: Template protection techniques are used within biometric systems in order to protect the stored biometric template against privacy and security threats. A great portion of template protection techniques are based on extracting a key from, or binding a key to, the binary vector derived from the biometric sample. The size of the key plays an important role, as the achieved privacy and security mainly depend on the entropy of the key. In the literature, it can be observed that there is a large variation in the reported key lengths at similar classification performance of the same template protection system, even when based on the same biometric modality and database. In this work, we determine the analytical relationship between the classification performance of the fuzzy commitment scheme and the theoretical maximum key size, given a Gaussian biometric source as input. We show the effect of system parameters such as the biometric source capacity, the number of feature components, the number of enrolment and verification samples, and the target performance on the maximum key size. Furthermore, we provide an analysis of the effect of feature interdependencies on the estimated maximum key size and classification performance. Both the theoretical analysis and an experimental evaluation using the MCYT fingerprint database show that feature interdependencies have a large impact on performance and key size estimates. This property can explain the large deviation in reported key sizes in the literature.
    IEEE Transactions on Information Forensics and Security 08/2012; 7(4):1225-1241. DOI: 10.1109/TIFS.2012.2191961
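For readers unfamiliar with the fuzzy commitment scheme whose key size is analyzed in the entry above, here is a minimal sketch of its enroll/verify flow, with a toy repetition code standing in for a realistic ECC; the key length, repetition factor, and bit patterns are arbitrary illustrative choices, and repetition codes are generally considered a weak choice in practice because they leak information about the template. The point of the sketch is only to show where the key size and the error-correcting capacity enter.

```python
import hashlib

def rep_encode(key_bits, rep):
    """Toy repetition code: each key bit is repeated `rep` times."""
    return [b for b in key_bits for _ in range(rep)]

def rep_decode(code_bits, rep):
    """Majority vote per group of `rep` bits."""
    return [int(sum(code_bits[i:i + rep]) > rep // 2)
            for i in range(0, len(code_bits), rep)]

def enroll(key_bits, bio_bits, rep):
    """Bind the key to the binary biometric: store w = c XOR b and a key hash."""
    c = rep_encode(key_bits, rep)
    w = [ci ^ bi for ci, bi in zip(c, bio_bits)]
    h = hashlib.sha256(bytes(key_bits)).hexdigest()
    return w, h

def verify(w, h, bio_bits, rep):
    """Recover c' = w XOR b', decode, and compare key hashes."""
    c_prime = [wi ^ bi for wi, bi in zip(w, bio_bits)]
    key_prime = rep_decode(c_prime, rep)
    return hashlib.sha256(bytes(key_prime)).hexdigest() == h

# Illustration: a 4-bit key bound to a 20-bit template (repetition factor 5).
key = [1, 0, 1, 1]
enrolled_bits = [1, 0, 1, 1, 0,  0, 1, 0, 0, 1,  1, 1, 0, 1, 0,  0, 0, 1, 1, 0]
probe_bits = enrolled_bits[:]              # same subject...
probe_bits[3] ^= 1; probe_bits[12] ^= 1    # ...with two bit errors
w, h = enroll(key, enrolled_bits, rep=5)
print(verify(w, h, probe_bits, rep=5))     # True: errors fall within the code's capacity
```

Only w and the key hash are stored. For a fixed template length, a longer key requires a higher-rate code, which in turn corrects fewer bit errors; that is the performance-versus-key-size tradeoff the cited work quantifies.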
  • "A potential performance loss after the bit extraction process may represent the penalty for the requirement to extract a binary representation from the biometric sample. In [11], the performance of a single-bit extraction process with a Hamming distance classifier has been theoretically determined under the assumption that the biometric data is Gaussian distributed. In this work we first discuss the theoretical performance of the optimal likelihood-ratio continuous classifier, under the assumption that the biometric data is Gaussian distributed."
    ABSTRACT: Template protection techniques are privacy- and security-enhancing techniques for biometric reference data within a biometric system. Several of the template protection schemes known in the literature require the extraction of a binary representation from the real-valued biometric sample, which raises the question whether the bit extraction method reduces the classification performance. In this work we provide the theoretical performance of the optimal log-likelihood-ratio continuous classifier and compare it with the theoretical performance of a binary Hamming distance classifier with a single-bit extraction scheme as known from the literature. We assume biometric data modeled by a Gaussian between-class and within-class probability density with independent feature components, and we also include the effect of averaging multiple enrolment and verification samples.
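To make the comparison in the entry above concrete, the sketch below scores one genuine and one impostor pair with (i) a per-feature log-likelihood ratio under a zero-mean Gaussian between-class/within-class model with independent features and (ii) a Hamming distance after quantizing each feature to a single bit at the population mean. The variances, the number of features, and the quantizer are assumptions chosen only to illustrate the two classifiers, not the exact setup of the cited work.

```python
import math, random

SIGMA_B2, SIGMA_W2 = 1.0, 0.25   # assumed between- and within-class variances

def llr_feature(x, y):
    """Per-feature log-likelihood ratio log p(x,y|same) - log p(x,y|different)
    for zero-mean Gaussian features with the variances above."""
    a = SIGMA_B2 + SIGMA_W2                 # marginal variance of each observation
    def log_gauss2(c):                      # 2-D zero-mean Gaussian, covariance [[a, c], [c, a]]
        det = a * a - c * c
        quad = (a * x * x - 2 * c * x * y + a * y * y) / det
        return -math.log(2 * math.pi) - 0.5 * math.log(det) - 0.5 * quad
    return log_gauss2(SIGMA_B2) - log_gauss2(0.0)

def llr_score(xs, ys):
    return sum(llr_feature(x, y) for x, y in zip(xs, ys))

def hamming_score(xs, ys):
    """Hamming distance after single-bit quantization at the population mean (0)."""
    return sum((x > 0) != (y > 0) for x, y in zip(xs, ys))

# Toy comparison on one genuine and one impostor pair (assumed model, 32 features).
rng = random.Random(0)
mu_a = [rng.gauss(0, math.sqrt(SIGMA_B2)) for _ in range(32)]   # subject A's true features
mu_b = [rng.gauss(0, math.sqrt(SIGMA_B2)) for _ in range(32)]   # subject B's true features
enrolled = [m + rng.gauss(0, math.sqrt(SIGMA_W2)) for m in mu_a]
genuine  = [m + rng.gauss(0, math.sqrt(SIGMA_W2)) for m in mu_a]
impostor = [m + rng.gauss(0, math.sqrt(SIGMA_W2)) for m in mu_b]
print("LLR genuine vs impostor:", round(llr_score(enrolled, genuine), 1),
      round(llr_score(enrolled, impostor), 1))
print("HD  genuine vs impostor:", hamming_score(enrolled, genuine),
      hamming_score(enrolled, impostor))
```

For a genuine pair the two observations of each feature are correlated (correlation sigma_b^2 / (sigma_b^2 + sigma_w^2)), so the LLR score is typically large and the HD small; for an impostor pair the observations are independent and the ordering reverses. The separation between the two score distributions is what may shrink when the single-bit quantizer is inserted, which is the question the cited work examines analytically.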