Combined SVM-Based Feature Selection and Classification

Universität Mannheim, Mannheim, Baden-Württemberg, Germany
Machine Learning (Impact Factor: 1.89). 11/2005; 61(1-3):129-150. DOI: 10.1007/s10994-005-1505-9
Source: DBLP

ABSTRACT: Feature selection is an important combinatorial optimisation problem in the context of supervised pattern classification. This paper presents four novel continuous feature selection approaches directly minimising the classifier performance. In particular, we include linear and nonlinear Support Vector Machine classifiers. The key ideas of our approaches are additional regularisation and embedded nonlinear feature selection. To solve our optimisation problems, we apply difference of convex functions programming which is a general framework for non-convex continuous optimisation. Experiments with artificial data and with various real-world problems including organ classification in computed tomography scans demonstrate that our methods accomplish the desired feature selection and classification performance simultaneously.
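The abstract compresses the paper's recipe: embed a sparsity-inducing, non-convex penalty into the SVM objective and handle the resulting non-convexity with difference of convex functions (DC) programming. The sketch below is a minimal illustration in that spirit, not the paper's exact formulation: it approximates the l0 penalty with the concave surrogate sum_j (1 - exp(-alpha*|w_j|)) and applies a DCA-style successive linearisation in which every subproblem is an l1-penalised linear SVM, emulated here via scikit-learn's LinearSVC on rescaled features. All parameter names and default values are illustrative assumptions.

```python
# Minimal sketch (NOT the authors' algorithm): DCA-style successive
# linearisation of a concave l0 surrogate for embedded SVM feature selection.
import numpy as np
from sklearn.svm import LinearSVC

def dca_feature_selection(X, y, alpha=5.0, C=1.0, n_iter=10, tol=1e-4):
    # Concave l0 surrogate: p(w) = sum_j (1 - exp(-alpha * |w_j|)).
    # DCA step: linearise p at the current w, giving per-feature weights
    # c_j = alpha * exp(-alpha * |w_j|), then solve the reweighted l1-SVM
    #   min_w ||c . w||_1 + C * loss(w)
    # via the change of variables z = c * w, i.e. an ordinary l1-SVM
    # on the column-rescaled data X / c.
    d = X.shape[1]
    w = np.zeros(d)
    c = np.full(d, alpha)                        # linearisation at w = 0
    for _ in range(n_iter):
        svm = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=C)
        svm.fit(X / c, y)                        # column j scaled by 1 / c_j
        w_new = svm.coef_.ravel() / c            # map back: w = z / c
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
        c = alpha * np.exp(-alpha * np.abs(w)) + 1e-8   # re-linearise p
    selected = np.flatnonzero(np.abs(w) > 1e-6)
    return w, selected

# Toy check: two informative features among twenty noise features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
w, selected = dca_feature_selection(X, y)
print(selected)   # typically recovers features 0 and 1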

  • "With respect to search strategies, complete, heuristic [5], and random [6], [7] strategies have been proposed. With respect to evaluation measures, these methods can be roughly divided into two classes: classifier-specific [8], [9], [10] and classifier-independent [11], [12], [13]. The former employ a learning algorithm to evaluate the quality of selected features based on classification accuracy or contribution to the classification boundary, such as the so-called wrapper method [8] and weight-based algorithms [14], [15]."
    ABSTRACT: This paper presents a new feature selection technique based on rough sets and the bat algorithm (BA). BA is attractive for feature selection in that bats discover good feature combinations as they fly through the feature-subset space. Compared with GAs, BA needs no complex operators such as crossover and mutation; it requires only primitive mathematical operators and is computationally inexpensive in both memory and runtime. A rough-set-based fitness function is designed as the optimisation target; it incorporates both the classification accuracy and the number of selected features, and hence balances classification performance against reduction size (a minimal sketch of such a fitness and the bat update follows this entry). The paper uses four initialisation strategies for starting the optimisation and studies their effect on bat performance; the strategies reflect forward selection, backward selection, and combinations of both. Experiments on UCI data sets compare the proposed algorithm with GA-based and PSO-based approaches for rough-set feature reduction. The results on different data sets show that the bat algorithm is efficient for rough-set-based feature selection: the fitness function yields better classification results while also keeping the selected feature subset small.
    9th International Conference on Computer Engineering & Systems (ICCES), Cairo, Egypt; 12/2014
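A minimal sketch of the kind of scheme this abstract describes, under loudly stated assumptions: the rough-set dependency degree is replaced by a cross-validated classifier score as the accuracy term, the trade-off weight `beta` is illustrative rather than taken from the paper, and the binary bat update is a simplified sigmoid-transfer variant, not the authors' exact operator.

```python
# Minimal sketch (assumptions noted above): fitness trading off accuracy
# against subset size, optimised by a simplified binary bat algorithm.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, beta=0.9):
    """mask: boolean vector over features; higher fitness is better."""
    if not mask.any():
        return 0.0
    # Accuracy term: CV score stands in for the rough-set dependency degree.
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
    reduction = 1.0 - mask.sum() / mask.size   # reward small subsets
    return beta * acc + (1.0 - beta) * reduction

def binary_bat_fs(X, y, n_bats=10, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.random((n_bats, d)) < 0.5        # random initialisation
    vel = np.zeros((n_bats, d))
    fit = np.array([fitness(p, X, y) for p in pos])
    best, best_fit = pos[fit.argmax()].copy(), fit.max()
    for _ in range(n_iter):
        freq = rng.random((n_bats, 1))         # pulse frequencies in [0, 1]
        vel += freq * (pos.astype(float) - best.astype(float))
        prob = 1.0 / (1.0 + np.exp(-vel))      # sigmoid transfer function
        pos = rng.random((n_bats, d)) < prob   # resample bit masks
        fit = np.array([fitness(p, X, y) for p in pos])
        if fit.max() > best_fit:
            best, best_fit = pos[fit.argmax()].copy(), fit.max()
    return best
```

Forward/backward-style initialisations, as studied in the paper, would replace the uniform random start with nearly empty or nearly full masks.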
  • "As a result, it is difficult to evaluate heterogeneous features concurrently. However, most conventional FS algorithms focus on datasets with homogeneous features and can be roughly categorised into two types: numerical FS [16] [17] [18] [19] [20] [21] and non-numerical FS [22] [23] [24] [25]. Several methods have also been proposed to solve the problem of heterogeneous feature selection."
    ABSTRACT: Conventional mutual information (MI) based feature selection (FS) methods cannot handle heterogeneous feature-subset selection properly because of data-format differences and the way MI between a feature subset and the class label is estimated. One way to solve this problem is feature transformation (FT). In this study, a novel unsupervised feature transformation (UFT), which converts non-numerical features into numerical ones, is developed and tested. The UFT process is MI-based and independent of the class label. MI-based FS algorithms, such as the Parzen window feature selector (PWFS), minimum redundancy maximum relevance feature selection (mRMR), and normalised MI feature selection (NMIFS), can all adopt UFT for pre-processing of non-numerical features. Unlike traditional FT methods, the proposed UFT is unbiased, letting PWFS be used to full advantage. Simulations and analyses on large-scale datasets showed that the feature subset selected by the integrated method, UFT-PWFS, outperformed other FT-FS combinations in classification accuracy (a generic MI-based ranking sketch follows this entry).
    Neurocomputing 11/2014; 168. DOI: 10.1016/j.neucom.2015.05.053 · 2.08 Impact Factor
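The sketch below illustrates the MI-based filter family this abstract builds on (an mRMR-style greedy ranking), not the proposed UFT itself: features are selected to maximise relevance to the label minus average redundancy to already-chosen features. Non-numerical columns are assumed to have been made numerical beforehand (e.g. with scikit-learn's OrdinalEncoder as a crude stand-in for the paper's transformation); the estimator choices are assumptions.

```python
# Minimal mRMR-style sketch (not UFT): greedy MI-based feature ranking.
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_rank(X, y, k, seed=0):
    # Relevance: MI between each feature and the class label.
    relevance = mutual_info_classif(X, y, random_state=seed)
    chosen, rest = [], list(range(X.shape[1]))
    while len(chosen) < k and rest:
        scores = []
        for j in rest:
            # Redundancy: mean MI between candidate j and features chosen so far.
            red = (mutual_info_regression(X[:, chosen], X[:, j],
                                          random_state=seed).mean()
                   if chosen else 0.0)
            scores.append(relevance[j] - red)
        best = rest[int(np.argmax(scores))]
        chosen.append(best)
        rest.remove(best)
    return chosen

# Toy check: features 0 and 3 carry the signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
print(mrmr_rank(X, y, k=3))   # 0 and 3 should appear early in the ranking
```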
  • "Numerical experiments show that DCA is in many cases more robust and efficient than standard methods (see e.g. [3] [4] [6] [8] [9] [10] [11] [13] [14] [15] [19] and references therein)."
    ABSTRACT: We investigate a new application of Difference of Convex functions programming and DCA in solving the constrained two-dimensional non-guillotine cutting problem. This problem consists of cutting a number of rectangular pieces from a large rectangular object. The cuts are done under some constraints and the objective is to maximize the total value of the pieces cut. We reformulate this problem as a DC program and solve it by DCA. The performance of the approach is compared with the standard solver CPLEX.
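The generic DCA iteration this abstract relies on is easy to show on a toy problem (the cutting-problem reformulation itself is far more involved). Below, f(x) = (x^2 - 1)^2 is written as a DC decomposition g(x) - h(x) with g(x) = x^4 + 1 and h(x) = 2x^2, both convex; each DCA step linearises h at the current point and minimises the resulting convex surrogate, which here has the closed form x_{k+1} = cbrt(x_k). A sketch for illustration only, not the paper's formulation.

```python
# Minimal DCA sketch on f(x) = (x^2 - 1)^2 = g(x) - h(x),
# with g(x) = x^4 + 1 and h(x) = 2x^2 (both convex).
import numpy as np

def dca_toy(x0, n_iter=50, tol=1e-10):
    x = x0
    for _ in range(n_iter):
        y = 4.0 * x                      # subgradient of h at x: h'(x) = 4x
        x_new = np.cbrt(y / 4.0)         # argmin_x g(x) - y*x  =>  4x^3 = y
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(dca_toy(0.2))   # converges to the minimiser x = 1
```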