Article

Multiobjective hybrid optimization and training of recurrent neural networks.

Department of Computer Science and Artificial Intelligence, University of Granada, Granada, Spain.
IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) (Impact Factor: 3.24). 05/2008; 38(2):381-403. DOI:10.1109/TSMCB.2007.912937
Source: PubMed

ABSTRACT The application of neural networks to solve a problem involves tasks with a high computational cost until a suitable network is found, and these tasks mainly involve the selection of the network topology and the training step. The network structure is usually selected by a trial-and-error procedure, and the network is then trained. In the case of recurrent neural networks (RNNs), the lack of suitable training algorithms sometimes hampers these procedures due to vanishing-gradient problems. This paper addresses the simultaneous training and topology optimization of RNNs using multiobjective hybrid procedures. The proposal builds hybrid methods from the SPEA2 and NSGA-II algorithms using the Baldwinian hybridization strategy. We also study how the choice of objectives, crossover, and mutation affects diversity during evolution. The proposals are tested in the experimental section by training and optimizing networks on the Competition on Artificial Time Series (CATS) benchmark.
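The Baldwinian hybridization described above can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: each genome carries candidate RNN weights, a short random hill climb stands in for gradient training, and fitness pairs the post-search error with a complexity objective. The Baldwinian point is that the refined weights are *not* written back to the genome. The genome layout and the functions `toy_error` and `local_search` are assumptions made for the sketch.

```python
# Hedged sketch of Baldwinian multiobjective evaluation (illustrative only).
import random

def toy_error(weights):
    """Placeholder for the real training/validation error of an RNN."""
    return sum(w * w for w in weights)

def local_search(genome, steps=20):
    """Toy hill climbing standing in for gradient training.
    Baldwinian: refined weights are discarded; only the improved error counts."""
    weights = list(genome["weights"])
    err = toy_error(weights)
    for _ in range(steps):
        cand = [w + random.gauss(0, 0.1) for w in weights]
        cand_err = toy_error(cand)
        if cand_err < err:
            weights, err = cand, cand_err
    return err

def evaluate(genome):
    # Two objectives: error after local search, and network complexity.
    return (local_search(genome), len(genome["weights"]))

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(pop):
    """Keep the non-dominated genomes (one evaluation per genome)."""
    fits = [evaluate(g) for g in pop]
    return [g for g, f in zip(pop, fits)
            if not any(dominates(h, f) for h in fits)]

random.seed(0)
pop = [{"weights": [random.gauss(0, 1) for _ in range(n)]}
       for n in (2, 4, 8) for _ in range(3)]
front = pareto_front(pop)
```

An NSGA-II- or SPEA2-style loop would add crossover, mutation, and density-based selection on top of this evaluation step; the sketch only shows where the Baldwinian choice enters.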

  •
    ABSTRACT: This paper concerns the design of a fuzzy radial basis function neural network (FRBFNN) by means of multiobjective optimization, and proposes a multiobjective algorithm to optimize the FRBFNN. In the FRBFNN, fuzzy c-means (FCM) clustering is exploited to form the premise part of the rules. As the consequent part of the fuzzy rules of the FRBFNN model, four types of polynomials are considered: constant, linear, quadratic, and modified quadratic. The least-squares method (LSM) is used to estimate the coefficients of the polynomials. In fuzzy modeling, complexity and interpretability (or simplicity), as well as accuracy of the obtained model, are essential design criteria. Since the performance of the FRBFNN model is directly affected by parameters such as the fuzzification coefficient used in the FCM, the number of rules, and the orders of the polynomials in the consequent parts of the rules, we carry out both structural and parametric optimization of the network. The proposed multiobjective algorithm optimizes the parameters of the model; the optimization is of multiobjective character, aimed at the simultaneous minimization of complexity and maximization of accuracy.
    Advances in Neural Networks - ISNN 2011 - 8th International Symposium on Neural Networks, ISNN 2011, Guilin, China, May 29-June 1, 2011, Proceedings, Part III; 01/2011
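The premise/consequent construction in the abstract above can be sketched as follows. This is a minimal illustration under assumptions, not the authors' code: standard FCM membership degrees act as rule activations, and a per-rule weighted least-squares fit estimates linear consequent coefficients (the linear case of the four polynomial types mentioned). The function names and the toy sine data are inventions for the sketch.

```python
# Hedged sketch: FCM-style rule premises + weighted LSM for linear consequents.
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Membership of each sample in each cluster (standard FCM formula);
    m is the fuzzification coefficient."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def fit_linear_consequents(X, y, U):
    """Per-rule weighted least squares: minimize sum_i u_ik (y_i - [1,x_i]@theta_k)^2."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # bias column + inputs
    thetas = []
    for k in range(U.shape[1]):
        A = Xb * U[:, k][:, None]              # rows weighted by rule activation
        thetas.append(np.linalg.lstsq(A.T @ Xb, A.T @ y, rcond=None)[0])
    return np.array(thetas)

def predict(X, centers, thetas, m=2.0):
    """Membership-weighted blend of the per-rule linear models."""
    U = fcm_memberships(X, centers, m)
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return np.einsum("nk,kd,nd->n", U, thetas, Xb)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0])
centers = np.array([[-0.5], [0.5]])            # two fixed rule centers
U = fcm_memberships(X, centers)
thetas = fit_linear_consequents(X, y, U)
yhat = predict(X, centers, thetas)
```

In the paper's setting, the fuzzification coefficient, the number of rules, and the polynomial orders would themselves be decision variables of the multiobjective search rather than fixed as here.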
  •
    Life Science Journal 12/2013; · 0.17 Impact Factor
  •
    ABSTRACT: This paper presents a new multiobjective evolutionary algorithm for radial basis function (RBF) network design, based on multiobjective particle swarm optimization augmented with local search features. The algorithm is named the memetic multiobjective particle swarm optimization RBF network (MPSON) because it jointly optimizes the accuracy and structure of an RBF network. The proposed algorithm is applied to two-class and multiclass pattern classification problems, including one complex real-world problem. The experimental results indicate that the proposed algorithm is viable and provides an effective means to design multiobjective RBF networks with good generalization capability and a compact network structure. The accuracy and complexity of the networks obtained by the proposed algorithm are compared with those of the memetic non-dominated sorting genetic algorithm based RBF network (MGAN) through statistical tests. This study shows that MPSON generates RBF networks with an appropriate balance between accuracy and simplicity, outperforming the other algorithms considered.
    Information Sciences 12/2013; 239:165–190. · 3.64 Impact Factor
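The archive-based selection that multiobjective PSO methods of this kind build on can be sketched as follows. This is an illustrative toy, not MPSON itself: a scalar position stands in for an encoded RBF network, a Schaffer-like bi-objective function stands in for the (accuracy, complexity) pair, and an external archive keeps the non-dominated solutions from which swarm leaders are drawn. All names and parameters are assumptions.

```python
# Hedged sketch: multiobjective PSO with an external non-dominated archive.
import random

def objectives(x):
    # Toy bi-objective trade-off standing in for (error, network size).
    return (x * x, (x - 2.0) ** 2)

def dominates(a, b):
    return all(p <= q for p, q in zip(a, b)) and a != b

def update_archive(archive, cand):
    """Insert cand if non-dominated; drop archive members it dominates."""
    f = objectives(cand)
    if any(dominates(objectives(a), f) for a in archive):
        return archive
    return [a for a in archive if not dominates(f, objectives(a))] + [cand]

random.seed(1)
n = 20
pos = [random.uniform(-4.0, 6.0) for _ in range(n)]
vel = [0.0] * n
archive = []
for p in pos:
    archive = update_archive(archive, p)

for _ in range(50):
    leader = random.choice(archive)        # global guide drawn from the archive
    for i in range(n):
        vel[i] = 0.5 * vel[i] + random.random() * (leader - pos[i])
        pos[i] += vel[i]
        archive = update_archive(archive, pos[i])
```

The memetic variants in the abstracts above would additionally apply a local search to selected particles before archiving them; leader selection in practice also uses a density measure rather than a uniform random choice.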