Fingerprint individuality refers to the extent to which fingerprints are unique and is the main criterion for deciding between a match and a nonmatch in forensic testimony. Prints are often subject to varying levels of noise; for example, image quality may be low when a print is lifted from a crime scene. Poor image quality causes human experts as well as automatic systems to make more errors in feature detection, either by missing true features or by detecting spurious ones. These errors lower the extent to which one can claim individualization of the fingerprints being matched. The aim of this paper is to quantify the decrease in individualization as image quality degrades, based on fingerprint images in real databases. This quantification, in turn, can be used by forensic experts alongside their testimony in a court of law. An important practical concern is that the databases used typically consist of a large number of fingerprint images, so computational algorithms such as the Gibbs sampler can be extremely slow. We develop algorithms based on the Laplace approximation of the likelihood and infer the unknown parameters from this approximate likelihood. Two publicly available databases, namely, FVC2002 and FVC2006, are analyzed, and estimates of individuality are obtained from each. From a statistical perspective, the contribution can be treated as an innovative application of Generalized Linear Mixed Models (GLMMs) to the field of fingerprint-based authentication.
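The core computational idea in the abstract, replacing the intractable GLMM marginal-likelihood integral with a Laplace approximation at the random-effect mode, can be sketched on a toy Poisson random-intercept model. This is a minimal illustration only; the data, model, and all names below are assumptions for the sketch and are not the paper's actual model or databases.

```python
import numpy as np
from scipy.optimize import minimize, minimize_scalar

# Toy data (illustrative, not from the paper): Poisson counts y[i, j]
# with log-rate eta = beta + b_i, random intercepts b_i ~ N(0, sigma^2).
rng = np.random.default_rng(0)
n_groups, n_per = 10, 20
beta_true, sigma_true = 1.0, 0.5
b = rng.normal(0.0, sigma_true, n_groups)
y = rng.poisson(np.exp(beta_true + b)[:, None], size=(n_groups, n_per))

def neg_joint(b_i, y_i, beta, sigma):
    """-log f(y_i | b_i) - log N(b_i; 0, sigma^2), dropping constants in y_i."""
    eta = beta + b_i
    return -(np.sum(y_i * eta - np.exp(eta)) - 0.5 * b_i ** 2 / sigma ** 2)

def laplace_loglik(params):
    """Laplace approximation to the marginal log-likelihood log L(beta, sigma)."""
    beta, log_sigma = params
    sigma = np.exp(log_sigma)
    total = 0.0
    for i in range(n_groups):
        # Mode of the joint density in b_i (the "inner" optimization).
        res = minimize_scalar(neg_joint, args=(y[i], beta, sigma))
        b_hat = res.x
        # Curvature of neg_joint at the mode: sum_j exp(eta) + 1/sigma^2.
        curv = n_per * np.exp(beta + b_hat) + 1.0 / sigma ** 2
        # Laplace: log int exp(-g(b)) db ~ -g(b_hat) + 0.5 log(2*pi/curv)
        total += -res.fun + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(curv)
        total += -0.5 * np.log(2 * np.pi * sigma ** 2)  # prior normalizing const
    return total

# Maximize the approximate likelihood directly instead of running a
# (potentially very slow) Gibbs sampler over the random effects.
fit = minimize(lambda p: -laplace_loglik(p), x0=np.array([0.0, 0.0]),
               method="Nelder-Mead")
beta_hat, sigma_hat = fit.x[0], float(np.exp(fit.x[1]))
```

The integral over each random effect is collapsed to one inner optimization plus a curvature term, which is what makes the approach scale to large databases where per-iteration MCMC updates would be prohibitive.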
"A generalized mixed model framework for assessing fingerprint individuality in presence of varying image quality." Ann. Appl. Stat. 8(3), 1314–1340, September 2014. https://doi.org/10.1214/14-AOAS734