A new artificial neural network ensemble based on feature selection and class recoding

Neural Computing and Applications (Impact Factor: 1.57). 06/2012; 21(4):1-13. DOI: 10.1007/s00521-010-0458-5


Many studies in supervised learning have focused on solving multiclass problems. A standard technique for these problems is to decompose the original multiclass problem into multiple binary problems. In this paper, we propose a new learning model for multiclass domains in which the examples are described by a large number of features. The proposed model is an artificial neural network ensemble in which each base learner combines a binary classifier with a multiclass classifier. To analyze the viability and quality of this system, we validate it in two real domains: traffic sign recognition and handwritten digit recognition. Experimental results show that our model is at least as accurate as other methods reported in the literature, while offering a considerable advantage in size, computational complexity, and running time.
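The one-vs-rest decomposition that the abstract calls the standard technique can be illustrated with a minimal sketch (hypothetical helper names and toy data, not the paper's code):

```python
# Hypothetical sketch: the standard one-vs-rest decomposition of a
# multiclass problem into multiple binary problems.

def one_vs_rest_labels(y, classes):
    """For each class c, relabel the examples as 1 (class c) vs. 0 (rest)."""
    return {c: [1 if label == c else 0 for label in y] for c in classes}

# Toy traffic-sign labels (illustrative data only).
y = ["stop", "yield", "stop", "limit"]
binary = one_vs_rest_labels(y, ["stop", "yield", "limit"])
# binary["stop"] == [1, 0, 1, 0]: one binary training problem per class.
```

Each of the resulting binary problems can then be handed to one base learner; the proposed model pairs such a binary classifier with a multiclass classifier in each ensemble member.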

Keywords: Classifier ensemble · Multiclass learning · Neural networks · Feature selection · Class recoding

  • ABSTRACT: Dietterich (1998) reviews five statistical tests and proposes the 5 x 2 cv t test for determining whether there is a significant difference between the error rates of two classifiers. In our experiments, we noticed that the 5 x 2 cv t test result may vary depending on factors that should not affect the test, and we propose a variant, the combined 5 x 2 cv F test, that combines multiple statistics to get a more robust test. Simulation results show that this combined version of the test has lower type I error and higher power than the 5 x 2 cv t test proper.
    Preview · Article · Dec 1999 · Neural Computation
  • ABSTRACT: Negative Correlation Learning (NCL) is a popular combining method that employs a special error function for the simultaneous training of base neural network (NN) experts. In this article, we propose an improved version of the NCL method in which the capability of the gating network, the combining part of the Mixture of Experts method, is used to combine the base NNs in the NCL ensemble. The special error function of NCL encourages each NN expert to learn different parts or aspects of the training data, so the local competence of the experts should be considered in the combining approach. The gating network provides a way to support this needed functionality, so the proposed method is called Gated NCL. The improved ensemble method is compared with previous approaches used for combining NCL experts, including the winner-take-all (WTA) and average (AVG) combining techniques, on several classification problems from the UCI machine learning repository. The experimental results show that our proposed ensemble method significantly improves performance over the previous combining approaches.
    Full-text · Article · Jan 2011 · Neural Computing and Applications
  • ABSTRACT: Humans often seek a second or third opinion about an important matter; a final decision is then reached after weighing and combining these opinions. This idea is the basis of ensemble-based systems. Ensembles of classifiers are well established as a method for obtaining highly accurate classifiers by combining less accurate ones. Evolving classifiers, on the other hand, are inspired by the idea of evolving their structure in order to adapt to changes in the environment. In this paper, we present a proof-of-concept method for constructing an ensemble system based on Evolving Fuzzy Systems. The main contribution of this approach is that the base classifiers are self-developing (evolving) fuzzy-rule-based (FRB) classifiers. Thus, we present an ensemble system that is based on evolving classifiers and keeps the properties of the evolving approach for the classification of streaming data. It is important to clarify that the evolving classifiers develop gradually, but they are not genetic or evolutionary.
    No preview · Conference Paper · Jan 2013
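The combined 5 x 2 cv F statistic described in the first abstract above can be sketched as follows. Here p[i][j] is assumed to be the observed difference in error rates between the two classifiers on fold j of replication i of five iterations of 2-fold cross-validation, and the statistic is compared against an F(10, 5) distribution:

```python
# Hedged sketch of the combined 5x2cv F statistic (the variant of
# Dietterich's 5x2cv t test described in the abstract above).
# p[i][j]: difference in error rates on fold j (0 or 1) of
# replication i (0..4) of 2-fold cross-validation.

def combined_5x2cv_f(p):
    # Numerator: sum of squared per-fold differences over all 10 folds.
    num = sum(p[i][j] ** 2 for i in range(5) for j in range(2))
    # Denominator: twice the sum of per-replication variance estimates.
    den = 0.0
    for i in range(5):
        mean = (p[i][0] + p[i][1]) / 2.0
        den += (p[i][0] - mean) ** 2 + (p[i][1] - mean) ** 2
    return num / (2.0 * den)  # compare against the F(10, 5) distribution
```

A large value of the statistic indicates a significant difference between the two classifiers' error rates.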
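The special NCL error function mentioned in the second abstract above can be sketched as follows; the notation (f for the experts' outputs, lam for the penalty strength) is an assumption for illustration. Its penalty term rewards an expert for being negatively correlated with the rest of the ensemble, which pushes the experts toward different parts of the training data:

```python
# Hedged sketch of the Negative Correlation Learning error for one
# expert. f: list of expert outputs on one example; i: expert index;
# y: target; lam: penalty strength (lambda). Illustrative names only.

def ncl_error(f, i, y, lam):
    fbar = sum(f) / len(f)  # ensemble (average) output
    # Penalty: correlation of expert i's deviation with the deviations
    # of all other experts from the ensemble output.
    penalty = (f[i] - fbar) * sum(f[j] - fbar for j in range(len(f)) if j != i)
    return 0.5 * (f[i] - y) ** 2 + lam * penalty
```

With lam = 0 this reduces to independent training of each expert; increasing lam trades individual accuracy for diversity across the ensemble.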