Conference Paper

An Efficient Incremental Kernel Principal Component Analysis for Online Feature Selection

Kobe University, Kobe, Japan
DOI: 10.1109/IJCNN.2007.4371325 Conference: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2007, Celebrating 20 years of neural networks, Orlando, Florida, USA, August 12-17, 2007
Source: DBLP

ABSTRACT In this paper, a feature extraction method for online classification problems is proposed by extending kernel principal component analysis (KPCA). In our previous work, we proposed an incremental KPCA (IKPCA) algorithm that can learn a new input incrementally without retaining all of the past training data. In this algorithm, eigenvectors are represented by a linear combination of linearly independent data selected from the given training data. A serious drawback of the previous IKPCA is that many independent data tend to be selected during learning, which incurs large computation and memory costs. To address this problem, we propose a novel approach to the selection of independent data: they are selected not in the high-dimensional feature space but in the low-dimensional eigenspace spanned by the current eigenvectors. With this method, the number of independent data is bounded by the number of eigenvectors. This restriction makes the learning of the modified IKPCA (M-IKPCA) very fast without losing approximation accuracy with respect to the true eigenvectors. To verify the effectiveness of M-IKPCA, the learning time and the accuracy of the resulting eigenspaces are evaluated on two UCI benchmark datasets. The results confirm that the learning of M-IKPCA is at least five times faster than the previous version of IKPCA.
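
As a rough illustration of the core idea in the abstract — judging whether a new sample is linearly independent by measuring its residual against the current eigenspace, using kernel evaluations only — consider the minimal NumPy sketch below. The RBF kernel and its width, the residual threshold, and the omission of kernel centering and of the incremental eigenvector update are all simplifying assumptions made here for brevity; this is a sketch of the idea, not the authors' M-IKPCA algorithm.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca(X, n_components, gamma=1.0):
    """Batch KPCA (kernel centering omitted for brevity). Returns dual
    coefficients A so that eigenvector i is v_i = sum_j A[j, i] * phi(x_j)."""
    K = rbf_kernel(X, X, gamma)
    lam, U = np.linalg.eigh(K)                    # eigenvalues in ascending order
    lam, U = lam[::-1][:n_components], U[:, ::-1][:, :n_components]
    return U / np.sqrt(np.maximum(lam, 1e-12))    # scales each v_i to unit norm

def eigenspace_residual(x_new, X_basis, A, gamma=1.0):
    """Squared feature-space distance from phi(x_new) to its projection onto
    the span of the current eigenvectors, computed from kernel values only."""
    k_xx = 1.0                                    # RBF kernel: k(x, x) = 1
    k_bx = rbf_kernel(X_basis, x_new[None, :], gamma)[:, 0]
    proj = A.T @ k_bx                             # <v_i, phi(x_new)> for each i
    return k_xx - proj @ proj

# Hypothetical usage: admit x_new as a new independent datum only if the
# current eigenspace cannot already represent it well (threshold assumed).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
A = kpca(X, n_components=5)
x_new = rng.normal(size=4)
if eigenspace_residual(x_new, X, A) > 1e-3:
    print("x_new is nearly independent of the current eigenspace")
```

Because the test needs only the kernel values between the current basis and the new sample, its cost scales with the number of retained eigenvectors rather than with all past data, which is the restriction the abstract describes.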

    • "Finally, PCA and LDA transform inputs into linear features, and these features are not always effective for classification purposes. Recently kernel PCA and kernel LDA are widely noticed as high-performance feature extraction methods (Scholkopf 1996); hence, the extension of incremental learning approach to kernel PCA (Takeuchi 2007 "
    Evolving Intelligent Systems: Methodology and Applications, 04/2010, pages 151-171. ISBN: 9780470569962
  • ABSTRACT: Least squares support vector machine (LSSVM) has been used in soft-sensor modeling in recent years. In developing a successful model based on LSSVM, the first important step is feature extraction. Principal component analysis (PCA) is a common method for linear feature extraction, and kernel PCA (KPCA) is a nonlinear PCA developed using the kernel method. KPCA can efficiently extract nonlinear relationships among the original inputs. This paper proposes combining KPCA with LSSVM to forecast the Mooney viscosity of styrene butadiene rubber (SBR). KPCA is first applied for feature extraction, and LSSVM is then applied for regression modeling. The experimental results show that KPCA-LSSVM offers high learning speed and good approximation and generalization ability. The root mean square errors of the Mooney viscosity for KPCA-LSSVM, PCA-LSSVM, and LSSVM are 0.0145, 0.0377, and 0.1775 respectively; LSSVM with KPCA feature extraction performs best and may be used to efficiently guide production. (A generic sketch of this KPCA-plus-LS-SVM pipeline appears after the list.)
  • ABSTRACT: Hydrogen gas concentration forecasting and evaluation is very important for bio-ethanol steam-reforming hydrogen production. Many methods have been applied to gas concentration forecasting, including principal component analysis (PCA) and artificial neural networks (ANN). This paper uses kernel principal component analysis (KPCA) as a preprocessor for the least squares support vector machine (LS-SVM) to extract the principal features of the original data, and employs particle swarm optimization (PSO) to optimize the free parameters of the LS-SVM. The LS-SVM is then applied to hydrogen gas concentration regression modeling. The experimental results show that KPCA-LSSVM offers high learning speed and good approximation and generalization ability compared with SVM and PCA-SVM.
    12/2009: pages 190-197.
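
Both citing works above combine KPCA feature extraction with LS-SVM regression. Their data, kernels, and (in the second case) PSO-tuned hyperparameters are not reproduced here; the following is a minimal, self-contained sketch of the generic pipeline with assumed RBF kernels and hand-picked regularization and width parameters, an illustration of the idea rather than either paper's implementation.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca_features(X_train, X_other, n_components=4, gamma=0.5):
    """Project X_other onto the leading kernel principal components of
    X_train (kernel centering omitted for brevity)."""
    K = rbf(X_train, X_train, gamma)
    lam, U = np.linalg.eigh(K)
    lam, U = lam[::-1][:n_components], U[:, ::-1][:, :n_components]
    A = U / np.sqrt(np.maximum(lam, 1e-12))
    return rbf(X_other, X_train, gamma) @ A

def lssvm_fit(Z, y, C=10.0, gamma=0.5):
    """LS-SVM regression: solve the standard KKT linear system
    [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]."""
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = M[1:, 0] = 1.0
    M[1:, 1:] = rbf(Z, Z, gamma) + np.eye(n) / C
    sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                     # bias b, dual weights alpha

def lssvm_predict(Z_train, b, alpha, Z_new, gamma=0.5):
    return rbf(Z_new, Z_train, gamma) @ alpha + b

# Toy regression: KPCA projections as features, LS-SVM as the model.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 6))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
Z = kpca_features(X, X)                        # training-set features
b, alpha = lssvm_fit(Z, y)
rmse = np.sqrt(np.mean((lssvm_predict(Z, b, alpha, Z) - y) ** 2))
print("train RMSE:", rmse)
```

The PSO tuning described in the second paper would wrap lssvm_fit in a search over C and gamma; it is omitted here for brevity.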