Generalized Discriminant Analysis: A Matrix Exponential Approach

Department of Computer Science, Chongqing University, Chongqing 400030, China.
IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 08/2009; 40(1):186-197. DOI: 10.1109/TSMCB.2009.2024759
Source: PubMed


Linear discriminant analysis (LDA) is well known as a powerful tool for discriminant analysis. In the case of a small training data set, however, it cannot be applied directly to high-dimensional data. This is the so-called small-sample-size or undersampled problem. In this paper, we propose an exponential discriminant analysis (EDA) technique to overcome the undersampled problem. The advantages of EDA are twofold: compared with principal component analysis (PCA) + LDA, EDA can extract the most discriminant information contained in the null space of the within-class scatter matrix, and compared with another LDA extension, null-space LDA (NLDA), the discriminant information contained in the non-null space of the within-class scatter matrix is not discarded. Furthermore, EDA is equivalent to transforming the original data into a new space by distance diffusion mapping and then applying LDA in that new space. As a result of the diffusion mapping, the margin between different classes is enlarged, which helps improve classification accuracy. Experimental comparisons on different data sets against existing LDA extensions, including PCA + LDA, LDA via generalized singular value decomposition, regularized LDA, NLDA, and LDA via QR decomposition, demonstrate the effectiveness of the proposed EDA method.
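The core idea can be sketched in a few lines (a hedged illustration, not the authors' code: the toy data, variable names, and class sizes below are assumptions). EDA replaces the scatter matrices in the LDA criterion with their matrix exponentials; since the exponential of a symmetric matrix is always symmetric positive definite, the generalized eigenproblem exp(Sb) w = λ exp(Sw) w remains well posed even when Sw is singular, which is exactly the undersampled case:

```python
import numpy as np
from scipy.linalg import expm, eigh

# Toy data: 3 classes in 10 dimensions, only 6 samples total,
# so the within-class scatter matrix Sw is singular (the SSS case).
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 10))
y = np.array([0, 0, 1, 1, 2, 2])

mean = X.mean(axis=0)
Sw = np.zeros((10, 10))
Sb = np.zeros((10, 10))
for c in np.unique(y):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)                  # within-class scatter
    Sb += len(Xc) * np.outer(mc - mean, mc - mean)  # between-class scatter

# Classical LDA needs Sw^{-1} Sb, which fails here: Sw is rank-deficient.
assert np.linalg.matrix_rank(Sw) < 10

# EDA: exp(Sw) is symmetric positive definite, so the generalized
# symmetric-definite eigenproblem exp(Sb) w = lam * exp(Sw) w is solvable.
evals, W = eigh(expm(Sb), expm(Sw))
W_eda = W[:, ::-1][:, :2]  # top-2 discriminant directions (largest eigenvalues)
```

The projection matrix `W_eda` then plays the same role as the LDA transform, but it is obtained without discarding either the null space or the non-null space of Sw.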

    • "In this paper, we propose the application of exponential discriminant analysis (EDA) for fault detection and isolation. EDA has been studied for image classification in [39]; however, to the best of the authors' knowledge, the technique has not been utilized for FDI. The proposed technique overcomes the small-sample-size problem by taking the exponential transformation."
    ABSTRACT: Fisher discriminant analysis (FDA), as a dimensionality reduction technique, is widely used for fault diagnosis. It suffers, however, from the undersampled or small-sample-size (SSS) problem; that is, it cannot be applied directly when the training data set is small relative to the process dimensionality. Many modified approaches have been proposed to address this problem, but a comprehensive solution is still missing. In this paper, we propose the application of exponential discriminant analysis (EDA) for fault detection and isolation. The proposed technique not only overcomes the small-sample-size problem but also has increased discriminant power. Compared with FDA, EDA is equivalent to applying an exponential transformation to the distances between samples. Thus, the between-class distance is enlarged, whereas the within-class distance is shortened. Therefore, the margin between different classes is enlarged, thereby improving fault-isolation capability. The Tennessee Eastman process is used as a benchmark to compare FDA and EDA. Furthermore, EDA is applied to the monitoring of a Coupled Liquid Tanks System.
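    The margin-enlargement claim above has a simple numeric intuition (the scatter values below are illustrative assumptions, not figures from the paper): along a direction with scatter eigenvalue s, the exponential map rescales distances by exp(s), so the ratio between a large between-class eigenvalue and a small within-class eigenvalue grows after the transformation:

    ```python
    import numpy as np

    # Assumed illustrative eigenvalues: between-class scatter 4.0 along one
    # direction, within-class scatter 0.5 along another.
    s_between, s_within = 4.0, 0.5

    ratio_lda = s_between / s_within                   # discriminant ratio before
    ratio_eda = np.exp(s_between) / np.exp(s_within)   # after the exponential map

    print(ratio_lda)  # 8.0
    print(ratio_eda)  # e^{3.5}, roughly 33.1
    ```

    Because exp(a)/exp(b) = e^(a-b), any gap between the two eigenvalues is amplified, which is one way to read the "between-class distance enlarged, within-class distance shortened" statement.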
    Full-text · Article · Jul 2015 · Neurocomputing
  • Source
    • "In this section, we first give a brief description of Linear Discriminant Analysis (LDA); then we review the Side-Information based Linear Discriminant Analysis (SILD) method. After that, we present some mathematical properties of the matrix exponential used as a basis for the derivation of Exponential Discriminant Analysis (EDA) [12]. Finally, we present Side-Information based Exponential Discriminant Analysis (SIEDA)."
    ABSTRACT: Recently, extensive research effort has been devoted to the challenging problem of face verification in unconstrained settings with weakly labeled data, where the task is to determine whether pairs of images are of the same person or not. In this paper, we propose a novel discriminative dimensionality reduction technique called Side-Information Exponential Discriminant Analysis (SIEDA), which inherits the advantages of both Side-Information Linear Discriminant Analysis (SILD) and Exponential Discriminant Analysis (EDA). SIEDA transforms the problem of face verification under weakly labeled data into a generalized eigenvalue problem while alleviating the preprocessing step of PCA dimensionality reduction. To further boost performance, a multi-scale variant of binarized statistical image feature histograms is adopted for efficient and rich facial texture representation. Extensive experimental evaluation on the challenging Labeled Faces in the Wild (LFW) benchmark database demonstrates the superiority of SIEDA over SILD. Moreover, the obtained verification accuracy is impressive and compares favorably against the state of the art.
    Full-text · Conference Paper · May 2015
  • Source
    • "It removes the null space of the between-class scatter matrix and extracts the discriminant information that corresponds to the smallest eigenvalues of the within-class scatter matrix. Zhang et al. [9] proposed an exponential discriminant analysis (EDA) method to extract the most …"
    ABSTRACT: Local Fisher discriminant analysis (LFDA) was proposed for dealing with the multimodal problem. It combines the idea of locality preserving projections (LPP), for preserving the local structure of high-dimensional data, with the idea of Fisher discriminant analysis (FDA), for obtaining discriminant power. However, LFDA suffers from the undersampled problem, as do many dimensionality reduction methods, and its projection matrix is not sparse. In this paper, we propose double sparse local Fisher discriminant analysis (DSLFDA) for face recognition. The proposed method first constructs a sparse, data-adaptive graph with a nonnegativity constraint. Then, DSLFDA reformulates the objective function as a regression-type optimization problem. The undersampled problem is avoided naturally, and a sparse solution is obtained by adding a penalty to the regression-type problem. Experiments on the Yale, ORL, and CMU PIE face databases demonstrate the effectiveness of the proposed method.
    Full-text · Article · Mar 2015 · Mathematical Problems in Engineering