Article

Combined SVM-Based Feature Selection and Classification.

Machine Learning (Impact Factor: 1.69). 01/2005; 61:129-150. DOI: 10.1007/s10994-005-1505-9
Source: DBLP

ABSTRACT: Feature selection is an important combinatorial optimisation problem in the context of supervised pattern classification. This paper presents four novel continuous feature selection approaches directly minimising classifier performance. In particular, we include linear and nonlinear Support Vector Machine classifiers. The key ideas of our approaches are additional regularisation and embedded nonlinear feature selection. To solve our optimisation problems, we apply difference of convex functions programming, which is a general framework for non-convex continuous optimisation. Experiments with artificial data and with various real-world problems, including organ classification in computed tomography scans, demonstrate that our methods accomplish the desired feature selection and classification performance simultaneously.
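The abstract names DC (difference of convex functions) programming as the solver but does not reproduce a formulation. As a rough, hypothetical illustration of how a DC approach can embed feature selection in a linear SVM, the sketch below replaces the zero-norm of the weight vector with the standard concave surrogate Σᵢ(1 − exp(−α|wᵢ|)); linearising the concave part at each DCA step reduces the subproblem to a reweighted ℓ1 SVM. The surrogate, the parameters lam and alpha, binary labels in {−1, +1}, and scikit-learn's LinearSVC (squared hinge rather than hinge loss) are all assumptions here, not the authors' method.

```python
import numpy as np
from sklearn.svm import LinearSVC

def dca_l0_svm(X, y, lam=1.0, alpha=5.0, n_iter=10, tol=1e-4):
    """DCA-style sketch: approximate ||w||_0 by sum_i (1 - exp(-alpha*|w_i|))
    and solve the resulting DC program by iterating reweighted-l1 SVMs.
    Assumes binary labels y in {-1, +1}."""
    n_samples, n_features = X.shape
    w = np.ones(n_features)                     # current weight iterate
    for _ in range(n_iter):
        # Linearising the concave penalty at w gives per-feature l1 weights.
        c = alpha * np.exp(-alpha * np.abs(w))
        # A uniform l1 penalty on the rescaled data X / c is equivalent to
        # the weighted penalty sum_i c_i |w_i| on the original data.
        clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                        C=1.0 / lam, max_iter=5000)
        clf.fit(X / c, y)
        w_new = clf.coef_.ravel() / c           # map back to original scale
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    selected = np.flatnonzero(np.abs(w) > 1e-6)
    return w, selected
```

The feature-rescaling trick is what lets an off-the-shelf uniform-ℓ1 solver play the role of the weighted-ℓ1 subproblem that each DCA step requires.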

Related research:

  • ABSTRACT: In our previous work, we developed methods for selecting input variables for function approximation based on block addition and block deletion. In this paper, we extend these methods to feature selection. To avoid random tie-breaking in small-sample problems with large numbers of features, we introduce the weighted sum of the recognition error rate and the average of margin errors as the feature selection and feature ranking criterion. Starting from the empty set of features, our methods add several features at a time until a stopping condition is satisfied, and then search for deletable features by block deletion. To further speed up feature selection, we use a linear programming support vector machine (LP SVM) as a preselector. Computer experiments on benchmark data sets show that adding the average of margin errors is effective for small-sample problems with large numbers of features in realizing high generalization ability.
    Proceedings of the 5th INNS IAPR TC 3 GIRPR conference on Artificial Neural Networks in Pattern Recognition; 09/2012. (A sketch of the subset criterion and LP-SVM preselection appears after this list.)
  • ABSTRACT: In this paper, we consider the problem of feature selection and classification under uncertain data, a form of uncertainty inherent in almost all datasets. Using principles of Robust Optimization, we propose a robust scheme to handle data with ellipsoidal model uncertainty. The difficulty of treating the zero-norm ℓ0 in the feature selection problem is overcome by an appropriate approximation combined with DC (Difference of Convex functions) programming and DCA (the DC Algorithm). The computational results show that the proposed robust optimization approach outperforms a traditional approach in immunizing the classifier against perturbations of the data.
    Proceedings of the 13th international conference on Advances in Data Mining: applications and theoretical aspects; 07/2013. (A sketch of the ellipsoidal robust SVM appears after this list.)
  • ABSTRACT: Feature selection is the problem of choosing a minimal subset of the full feature set that is necessary and sufficient for the classifier. UTA [1] is a simple algorithm performed on a trained artificial neural network: it evaluates features by removing them one at a time and measuring the effect on accuracy, classifying each feature as relevant, irrelevant, or redundant. UTA can guarantee that all of the features it keeps are useful, but its disadvantage is that correlated features are judged irrelevant or redundant because they are evaluated one at a time, even though some of them may in fact be relevant. Ant colony optimization (ACO) is widely used for feature selection and performs very well, but it requires considerable running time. In this paper, a UTA pass is performed first, and an ACO search is then used to recover useful features that UTA could not find. The proposed algorithm (UTAACO) improves the performance of UTA and reduces the computational time of ACO; the obtained results indicate the robustness of UTAACO.
    2011 3rd International Conference on Machine Learning and Computing (ICMLC 2011), Singapore; 02/2011. (A sketch of the UTA-style relevance test appears after this list.)
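The first related item combines a subset-quality criterion (a weighted sum of the recognition error rate and the average of margin errors) with an LP SVM preselector. A minimal sketch of both pieces follows; the weight gamma, the resubstitution (training-set) evaluation, and the use of scikit-learn's ℓ1-penalised LinearSVC as a stand-in for a true linear-programming SVM are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import LinearSVC, SVC

def subset_criterion(X, y, features, gamma=0.1):
    """Weighted sum of the recognition error rate and the average of margin
    errors for a candidate feature subset (resubstitution version; the
    weight gamma is a hypothetical choice). Assumes y in {-1, +1}."""
    clf = SVC(kernel="linear")
    Xs = X[:, features]
    clf.fit(Xs, y)
    margins = y * clf.decision_function(Xs)          # signed margins
    error_rate = np.mean(margins <= 0.0)             # misclassified points
    margin_error = np.mean(np.maximum(0.0, 1.0 - margins))  # hinge-style slack
    return error_rate + gamma * margin_error

def lp_svm_preselect(X, y, C=1.0):
    """LP-SVM-style preselector: keep only features with non-zero weight in
    an l1-penalised linear SVM, shrinking the pool before block search."""
    clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                    C=C, max_iter=5000)
    clf.fit(X, y)
    return np.flatnonzero(np.abs(clf.coef_).max(axis=0) > 1e-6)
```

A block-addition loop would then rank candidate blocks of preselected features by subset_criterion, keeping lower values.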
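The second related item handles ellipsoidal data uncertainty with robust optimisation. Independently of its zero-norm/DCA part (omitted here), the robust counterpart of a soft-margin linear SVM has a clean conic form: if every input may move within {xᵢ + Pu : ‖u‖₂ ≤ 1}, the worst case tightens each margin constraint by ‖Pᵀw‖₂. The sketch below (using cvxpy, with a single shared shape matrix P, both assumptions) is a generic robust SVM, not the paper's full method.

```python
import cvxpy as cp

def robust_linear_svm(X, y, P, C=1.0):
    """Soft-margin linear SVM in which each input lives in the ellipsoid
    { x_i + P u : ||u||_2 <= 1 }.  Taking the worst case over the ellipsoid
    tightens every margin constraint by ||P^T w||_2, which yields a
    second-order cone program.  Assumes y in {-1, +1}."""
    n, d = X.shape
    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(n, nonneg=True)                 # slack variables
    margins = cp.multiply(y, X @ w + b)
    constraints = [margins >= 1 + cp.norm(P.T @ w, 2) - xi]
    objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
    cp.Problem(objective, constraints).solve()
    return w.value, b.value
```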
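The last related item builds on UTA's one-at-a-time relevance test. Below is a minimal sketch of such a test under stated assumptions: the network is scikit-learn's MLPClassifier, a feature is "removed" by mean substitution, and the decision threshold is arbitrary; the ACO repair stage of UTAACO is not shown.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def uta_relevance(X, y, threshold=0.01):
    """UTA-style test: train a network once, then neutralise each feature in
    turn (here by replacing its column with the column mean) and measure the
    drop in training accuracy.  Features whose removal barely hurts are
    flagged irrelevant/redundant; correlated relevant features can be missed,
    which is the weakness the ACO stage is meant to repair."""
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    net.fit(X, y)
    base = net.score(X, y)
    drops = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xj = X.copy()
        Xj[:, j] = X[:, j].mean()        # neutralise feature j
        drops[j] = base - net.score(Xj, y)
    relevant = np.flatnonzero(drops > threshold)
    return relevant, drops
```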
