Conference Paper

An Efficient Incremental Kernel Principal Component Analysis for Online Feature Selection

Kobe Univ., Kobe
DOI: 10.1109/IJCNN.2007.4371325 Conference: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2007, Celebrating 20 years of neural networks, Orlando, Florida, USA, August 12-17, 2007
Source: DBLP

ABSTRACT In this paper, a feature extraction method for online classification problems is proposed by extending kernel principal component analysis (KPCA). In our previous work, we proposed an incremental KPCA (IKPCA) algorithm that can learn a new input incrementally without keeping all the past training data. In this algorithm, eigenvectors are represented by a linear sum of linearly independent data selected from the given training data. A serious drawback of the previous IKPCA is that many independent data tend to be selected during learning, which incurs large computation and memory costs. To address this problem, we propose a novel approach to the selection of independent data: they are selected not in the high-dimensional feature space but in the low-dimensional eigenspace spanned by the current eigenvectors. With this method, the number of independent data is restricted to the number of eigenvectors. This restriction makes the learning of the modified IKPCA (M-IKPCA) very fast without losing approximation accuracy against the true eigenvectors. To verify the effectiveness of M-IKPCA, the learning time and the accuracy of the learned eigenspaces are evaluated on two UCI benchmark datasets. As a result, we confirm that the learning of M-IKPCA is at least five times faster than the previous version of IKPCA.
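The independence test that drives the basis growth can be illustrated with a short sketch. The following Python fragment is a minimal illustration, not the authors' implementation: it uses a standard approximate-linear-dependence (ALD) style residual test to decide whether a new sample's feature-space image lies outside the span of the currently stored independent data. The RBF kernel, `gamma`, and the tolerance are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian kernel: K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def feature_space_residual(x_new, basis, gamma=1.0):
    """Squared distance from phi(x_new) to span{phi(b) : b in basis}.

    A large residual means x_new is (approximately) linearly independent
    of the stored data in feature space; growing the basis on this kind
    of test is what lets the number of stored data become large.
    """
    K_bb = rbf_kernel(basis, basis, gamma)
    k_vec = rbf_kernel(basis, x_new[None, :], gamma).ravel()
    coef = np.linalg.solve(K_bb + 1e-10 * np.eye(len(basis)), k_vec)
    return 1.0 - k_vec @ coef  # k(x, x) = 1 for the RBF kernel

# Example: grow a basis only when the residual exceeds a tolerance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
basis = X[:1].copy()
for x in X[1:]:
    if feature_space_residual(x, basis) > 1e-3:
        basis = np.vstack([basis, x])
print(len(basis), "independent data kept")
```

M-IKPCA's key change, per the abstract, is to perform this selection in the low-dimensional eigenspace spanned by the current eigenvectors rather than in feature space, which caps the number of stored independent data at the number of eigenvectors.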

  • ABSTRACT: In this paper, a new incremental learning algorithm of Kernel Principal Component Analysis (KPCA) is proposed for online feature extraction in pattern recognition problems. The proposed algorithm is derived by extending Takeuchi et al.'s Incremental KPCA (T-IKPCA), which can learn new data incrementally without keeping past training data. However, even when multiple data are given in a chunk, T-IKPCA must learn them individually; that is, to update the eigen-feature space, the eigenvalue decomposition must be performed for every datum in the chunk. To alleviate this problem, we extend T-IKPCA such that the eigen-feature space is learned by performing the eigenvalue decomposition only once per chunk of given data. In the proposed IKPCA, whenever a new chunk of training data is given, linearly independent data are first selected based on the cumulative proportion. Then, the eigenspace is augmented by calculating the coefficients for the selected linearly independent data, and the eigen-feature space is rotated using the rotation matrix obtained by solving a kernel eigenvalue problem. To verify the effectiveness of the proposed IKPCA, the learning time and the accuracy of eigenvectors are evaluated on three UCI benchmark data sets. From the experimental results, we confirm that the proposed IKPCA can learn an eigen-feature space very fast without sacrificing the recognition accuracy. (A linear-analogue sketch of this one-decomposition-per-chunk update appears after this list.)
    01/2011;
  • ABSTRACT: In this paper, we propose an incremental 2-directional 2-dimensional linear discriminant analysis (I-(2D)²LDA) for multitask pattern recognition (MTPR) problems, in which chunks of training data for a particular task are given sequentially and the task is switched to another related task one after another. In I-(2D)²LDA, the discriminant space of the current task, spanned by two types of discriminant vectors, is augmented with effective discriminant vectors selected from other tasks based on class separability. We call this selective augmentation of discriminant vectors the knowledge transfer of feature spaces. In the experiments, the proposed I-(2D)²LDA is evaluated on three tasks using the ORL face data set: person identification (Task 1), gender recognition (Task 2), and young-senior discrimination (Task 3). The results show that the knowledge transfer works well for Tasks 2 and 3; that is, the test performance of gender recognition and that of young-senior discrimination are enhanced. (An illustrative separability score is sketched after this list.)
    01/2011;
  • ABSTRACT: An incremental learning algorithm of Kernel Principal Component Analysis (KPCA) called Chunk Incremental KPCA (CIKPCA) has been proposed for online feature extraction in pattern recognition. CIKPCA can reduce the number of times the eigenvalue problem must be solved, compared with the conventional incremental KPCA, when small numbers of data are given simultaneously as a stream of data chunks. However, our previous work suggests that the computational cost of the independent data selection in CIKPCA could dominate that of the eigenvalue decomposition when a large chunk of data is given. To verify this, we investigate the influence of chunk size on the learning time of CIKPCA. As a result, CIKPCA requires more learning time than IKPCA unless a large chunk of data is divided into small chunks (e.g., fewer than 50 data). (A small chunk-splitting helper is sketched after this list.)
    01/2012;
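For the chunk-based IKPCA in the first item above, the benefit of one decomposition per chunk can be illustrated with a linear analogue. The sketch below is not the paper's kernel algorithm: it updates an ordinary (linear) eigenspace with a whole chunk of new data using a single small SVD, in the spirit of standard incremental SVD updates; the zero-mean assumption and all names are ours.

```python
import numpy as np

def chunk_eigenspace_update(U, s, X_new):
    """Update eigenvectors U (d x k) and singular values s (k,) with a
    whole chunk X_new (m x d) using one small SVD, instead of m
    rank-one updates (data assumed zero-mean for brevity)."""
    proj = X_new @ U                       # chunk coordinates in the eigenspace
    resid = X_new - proj @ U.T             # component outside the current span
    Q, _ = np.linalg.qr(resid.T)           # orthonormal directions augmenting U
    k, m_q = len(s), Q.shape[1]
    # Small matrix whose SVD rotates the augmented basis [U, Q].
    B = np.block([[np.diag(s), proj.T],
                  [np.zeros((m_q, k)), Q.T @ X_new.T]])
    Ub, sb, _ = np.linalg.svd(B, full_matrices=False)
    U_aug = np.hstack([U, Q]) @ Ub
    return U_aug[:, :k], sb[:k]            # keep the leading k components

# Example: initialize from a first chunk, then fold in a second chunk.
rng = np.random.default_rng(0)
X0, X1 = rng.normal(size=(60, 10)), rng.normal(size=(40, 10))
U, s, _ = np.linalg.svd(X0.T, full_matrices=False)
U, s = U[:, :3], s[:3]                     # keep a 3-dimensional eigenspace
U, s = chunk_eigenspace_update(U, s, X1)   # one SVD for the whole chunk
```

One SVD of the small matrix `B` replaces m separate decompositions; the chunk IKPCA obtains the analogous saving in kernel form by solving the kernel eigenvalue problem once per chunk.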
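For the multitask LDA in the second item, the class-separability criterion used to decide which discriminant vectors to transfer can be sketched with a Fisher-style score. This is an illustrative scoring function under our own assumptions (the threshold and function names are hypothetical), not the paper's exact criterion.

```python
import numpy as np

def separability(w, X, y):
    """Fisher-style separability of direction w: between-class scatter
    over within-class scatter of the data projected onto w."""
    z = X @ w
    mean_all = z.mean()
    s_between = s_within = 0.0
    for c in np.unique(y):
        zc = z[y == c]
        s_between += len(zc) * (zc.mean() - mean_all) ** 2
        s_within += ((zc - zc.mean()) ** 2).sum()
    return s_between / max(s_within, 1e-12)

def select_transferable(candidates, X, y, threshold=1.0):
    # Keep only those discriminant vectors from other tasks that
    # separate the current task's classes well (hypothetical rule).
    return [w for w in candidates if separability(w, X, y) > threshold]
```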
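Finally, the practical recommendation in the last item amounts to pre-splitting large chunks before learning; a minimal helper is sketched below, with `max_size=50` reflecting the figure quoted in the abstract.

```python
def split_chunk(data, max_size=50):
    # Divide a large chunk into sub-chunks small enough that CIKPCA's
    # independent-data selection stays cheaper than the eigendecomposition.
    return [data[i:i + max_size] for i in range(0, len(data), max_size)]
```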