January 2004 · 25 Reads · 35 Citations
Submitted to obtain the degree of DOCTOR of the UNIVERSITÉ PARIS VI
January 2003 · 7 Reads
In this report we address the problem of non-frontal face verification when only a frontal training image is available (e.g. a passport photograph) by augmenting a client's frontal face model with artificially synthesized models for non-frontal views. In the framework of a Gaussian Mixture Model (GMM) based classifier, two techniques are proposed for the synthesis: UBMdiff and LinReg. Both techniques rely on a priori information and learn how face models for the frontal view are related to face models at a non-frontal view. The synthesis and augmentation approach is evaluated by applying it to two face verification systems: Principal Component Analysis (PCA) based and DCTmod2 [31] based; the two systems represent holistic and non-holistic approaches, respectively. Results from experiments on the FERET database suggest that in almost all cases, frontal model augmentation has beneficial effects for both systems; they also suggest that the LinReg technique (which is based on multivariate regression of classifier parameters) is better suited to the PCA based system and that the UBMdiff technique (which is based on differences between two general face models) is better suited to the DCTmod2 based system. The results also support the view that the standard DCTmod2/GMM system (trained on frontal faces) is less affected by out-of-plane rotations than the corresponding PCA/GMM system; moreover, the DCTmod2/GMM system using augmented models is, in almost all cases, more robust than the corresponding PCA/GMM system. NOTE: This report has been superseded by IDIAP-RR 04-04.
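The core of the LinReg idea described above is a multivariate regression from frontal model parameters to non-frontal model parameters, learned from prior subjects for which both views are available. The following is a minimal illustrative sketch of that idea only; the array shapes, the synthetic data, and the restriction to GMM mean vectors are assumptions for the example, not the report's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_prior = 50   # prior subjects with models at both the frontal and non-frontal view
d = 8          # flattened model-parameter dimensionality (illustrative)

# Synthetic prior data: stacked frontal and non-frontal model parameters.
# In the report these would come from training models on real face images.
frontal = rng.normal(size=(n_prior, d))
true_W = np.eye(d) + 0.1 * rng.normal(size=(d, d))
non_frontal = frontal @ true_W + rng.normal(scale=0.01, size=(n_prior, d))

# Learn the frontal -> non-frontal mapping by least-squares regression.
W, *_ = np.linalg.lstsq(frontal, non_frontal, rcond=None)

# Synthesize a non-frontal model for a new client from its frontal model only.
client_frontal = rng.normal(size=d)
client_synth = client_frontal @ W
```

The synthesized parameters would then be used to augment the client's frontal model, so that verification can score probe images from non-frontal views.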
[Figure: (a) exact Hessian of a Transformer [57]; (b) exact Hessian of an MLP [7]; (c) estimated Hessian of an MLP [42]; (d) estimated Hessian of an MLP [33].]

The findings in [57] suggest that it is necessary to use a different learning rate for each block. Adam, however, goes much further than that: it assigns an individual learning rate not just to each block, but to each parameter.
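The per-parameter adaptation can be seen directly in the standard Adam update: the first and second gradient moments are tracked elementwise, so the effective step size differs for every parameter. A minimal sketch (the tensor shapes and gradient values are illustrative):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; all optimizer state has the same shape as theta."""
    m = beta1 * m + (1 - beta1) * grad          # elementwise first moment
    v = beta2 * v + (1 - beta2) * grad ** 2     # elementwise second moment
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta, m, v = np.zeros(3), np.zeros(3), np.zeros(3)
grad = np.array([1.0, 0.1, 0.01])  # gradient scales differing by 100x
theta, m, v = adam_step(theta, grad, m, v, t=1)
```

On the first step the bias-corrected moments give a step of roughly lr * sign(grad) for every parameter, so despite the gradient magnitudes differing by two orders, each parameter moves by about the same amount: the learning rate is effectively individual per parameter, which is exactly the block-versus-parameter distinction made above.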
January 2004