On General Definition of L1-norm Support Vector Machines for Feature Selection

International Journal of Machine Learning and Computing 08/2011; 1(2):279-283. DOI: 10.7763/IJMLC.2011.V1.41


In this paper, we introduce a new general definition of the L1-norm SVM (GL1-SVM) for feature selection and represent it as a polynomial mixed 0-1 programming problem. We prove that solving the proposed optimization problem reduces the error penalty and enlarges the margin between the two support-vector hyperplanes, which can yield better generalization capability than solving the traditional L1-norm SVM proposed by Bradley and Mangasarian. We also propose a new search method that guarantees obtaining the globally optimal feature subset by means of the new GL1-SVM. The proposed search method is based on solving a mixed 0-1 linear programming (M01LP) problem with the branch-and-bound algorithm; in this M01LP problem, the number of constraints and variables is linear in the number of features in the full set. Experimental results on the UCI, LIBSVM, UNM, and MIT Lincoln Lab data sets show that the new general L1-norm SVM gives better generalization capability, while in many cases selecting fewer features than the traditional L1-norm SVM.

Index Terms—branch and bound, feature selection, L1-norm support vector machine, mixed 0-1 linear programming problem.
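For context, the traditional L1-norm SVM of Bradley and Mangasarian (the baseline that GL1-SVM generalizes) selects features by penalizing the L1 norm of the weight vector, which drives many weights exactly to zero. A minimal sketch of that baseline using scikit-learn is below; this is not the paper's GL1-SVM or its branch-and-bound M01LP search, and the dataset (a UCI-derived set) and the regularization strength `C` are illustrative choices only.

```python
# Sketch of the *traditional* L1-norm linear SVM for feature selection
# (Bradley & Mangasarian style), NOT the GL1-SVM proposed in the paper.
# Dataset and C value are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# penalty='l1' makes the optimizer zero out many weights w_j;
# the features with nonzero weights are the selected subset.
clf = LinearSVC(penalty="l1", dual=False, C=0.05, max_iter=10000)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_)
print(f"selected {selected.size} of {X.shape[1]} features")
```

Sweeping `C` trades off sparsity against training error: smaller values shrink more weights to zero and select fewer features. The paper's contribution is to replace this heuristic L1 relaxation with an exact combinatorial search over feature subsets.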



Author: Hai Thanh Nguyen,