Conference Paper

Optimizing a Class of Feature Selection Measures

Conference: NIPS 2009 Workshop on Discrete Optimization in Machine Learning: Submodularity, Sparsity & Polyhedra (DISCML)

ABSTRACT: Feature selection is an important processing step in machine learning and the design of pattern-recognition systems. A major challenge is the selection of relevant features from high-dimensional data sets. To cope with the computational complexity, heuristic, sequential, or random search strategies are frequently applied. These methods, however, often yield only locally optimal feature sets that may be globally sub-optimal. The aim of our research is to derive a new, efficient approach that guarantees globally optimal feature sets. We focus on the so-called filter methods. We show that a number of feature-selection measures, e.g., the correlation-feature-selection (CFS) measure, the minimal-redundancy-maximal-relevance (mRMR) measure, and others, can be fused and generalized. We formulate the feature-selection problem as a polynomial mixed 0-1 fractional programming (PM01FP) problem. To solve it, we transform the PM01FP problem into a mixed 0-1 linear programming (M01LP) problem by applying an improved version of Chang's method of grouping additional variables. The globally optimal solution to the M01LP problem can then be obtained with the branch-and-bound algorithm. Experimental results on the UCI database show that our globally optimal method removes up to 10% more redundant or confusing features from the original data set than other heuristic search procedures, while keeping or even improving accuracy.
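For readers unfamiliar with the construction, the following is a minimal sketch, assuming a CFS-style measure, of how such a measure becomes a polynomial mixed 0-1 fractional program and how it can be linearized. It illustrates the general idea behind Chang's linearization rather than the improved variant described in the paper; the correlation coefficients and the big-M bound are placeholders.

    Let $x_i \in \{0,1\}$ indicate whether feature $i$ is selected. A CFS-style merit can be written as
    $$
    \max_{x \in \{0,1\}^n} \;
    \frac{\left(\sum_{i=1}^{n} r_{ci}\, x_i\right)^{2}}
         {\sum_{i=1}^{n} x_i + \sum_{i \neq j} 2\, r_{ij}\, x_i x_j},
    $$
    where $r_{ci}$ is the feature-class correlation and $r_{ij}$ the feature-feature correlation. Because $x_i^2 = x_i$, both numerator and denominator are polynomials in the binary variables, i.e. a PM01FP problem. Substituting a continuous variable $y = 1 / \big(\sum_i x_i + \sum_{i \neq j} 2 r_{ij} x_i x_j\big)$, enforced by the constraint $y \cdot \big(\sum_i x_i + \sum_{i \neq j} 2 r_{ij} x_i x_j\big) = 1$, turns the objective into a sum of terms of the form $c\, x_i x_j\, y$. A product of two binaries is linearized as $w = x_i x_j$ with $w \le x_i$, $w \le x_j$, $w \ge x_i + x_j - 1$, $w \ge 0$; a binary-continuous product $z = w\, y$ with $0 \le y \le M$ is linearized as
    $$
    z \le M w, \qquad z \le y, \qquad z \ge y - M(1 - w), \qquad z \ge 0.
    $$
    The result is a mixed 0-1 linear program whose globally optimal solution can be found by branch-and-bound, with LP relaxations supplying the bounds.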

  • Lecture Notes in Computer Science, 01/2010; DOI:10.1007/978-3-642-14706-7
  • ABSTRACT: Flooding-based DoS attacks are among the most dangerous attacks in computer networks, and maximizing the accuracy of their detection is a major concern for many researchers. Much of this work focuses on increasing detection effectiveness by reducing the number of features, but only limited research has investigated the correlation between features and its impact on DoS-attack classification accuracy. This paper therefore proposes an algorithm that removes correlated attributes in order to select the most effective features for network-traffic classification (a minimal sketch of such a correlation filter follows after this list). Since removing related features from a classification model reduces its error rate, the proposed model is likely to increase the flooding-attack classification accuracy. The study also describes the experimental methodology and validation method for the proposed model.
    2013 International Conference on Advanced Computer Science Applications and Technologies (ACSAT); 12/2013
  • ABSTRACT: Feature extraction is the heart of an object recognition system. In a recognition problem, features are used to distinguish one class of objects from another. The original data is usually of high dimensionality; the objective of feature extraction is to enable classification of the object and, further, to reduce the dimensionality of the measurement space to one suitable for the application of object-classification techniques. In the feature-extraction process, only the salient features necessary for recognition are retained, so that classification can be performed on a vastly reduced feature set. The paper discusses feature extraction as well as the classification techniques used in neural networks.
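To illustrate the correlation-based filtering idea from the ACSAT entry above, here is a minimal, hypothetical Python sketch (not code from that paper): it drops one feature from every pair whose absolute Pearson correlation exceeds a chosen threshold before the remaining features are passed to a traffic classifier. The 0.9 threshold and the toy data are assumptions for illustration.

    # Illustrative sketch: drop one feature from each highly correlated pair.
    import numpy as np
    import pandas as pd

    def drop_correlated_features(df: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
        """Remove one feature from every pair whose absolute Pearson
        correlation exceeds `threshold`."""
        corr = df.corr().abs()
        # Inspect each pair only once by masking the diagonal and lower triangle.
        upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
        to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
        return df.drop(columns=to_drop)

    # Toy example: column "b" nearly duplicates column "a" and gets removed.
    rng = np.random.default_rng(0)
    a = rng.normal(size=200)
    X = pd.DataFrame({
        "a": a,
        "b": 2.0 * a + 0.01 * rng.normal(size=200),
        "c": rng.normal(size=200),
    })
    print(drop_correlated_features(X).columns.tolist())  # -> ['a', 'c']

A filter step like this is a greedy heuristic; by contrast, the formulation in the main abstract above searches for a globally optimal feature subset.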