Conference Paper

Statistical Analysis of Parameter Setting in Real-Coded Evolutionary Algorithms.

DOI: 10.1007/978-3-642-15871-1_46 Conference: Parallel Problem Solving from Nature - PPSN XI, 11th International Conference, Kraków, Poland, September 11-15, 2010. Proceedings, Part II
Source: DBLP

ABSTRACT When evolutionary algorithm (EA) applications are being developed, it is very important to know which parameters have the greatest influence on the behavior and performance of the algorithm. This paper proposes using the ANOVA (ANalysis Of VAriance) method to carry out an exhaustive analysis of an EA method and the different parameters it requires, such as the number of generations, the population size, operator application and the selection type. When undertaking a detailed statistical analysis of the influence of each parameter, the designer should pay most attention to the parameters whose values are statistically most significant. Following this idea, the significance and relative importance of the parameters with respect to the obtained results, as well as suitable values for each of them, were obtained using ANOVA on four well-known function optimization problems.
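The paper itself gives no code, but a minimal sketch of the kind of analysis described in the abstract, under assumed details, could look like the following Python snippet: one EA parameter (population size, with hypothetical levels) is treated as a factor, the best fitness of repeated independent runs is the response, and a one-way ANOVA tests whether that factor has a statistically significant effect. The dummy_ea_run helper, the parameter levels and the 0.05 threshold are illustrative assumptions, not values taken from the paper.

    # Illustrative sketch only (not the paper's code): treat one EA parameter
    # (population size, hypothetical levels) as a factor, use the best fitness
    # of repeated runs as the response, and test it with a one-way ANOVA.
    import random
    from scipy.stats import f_oneway

    def dummy_ea_run(pop_size: int) -> float:
        """Stand-in for a real EA run on a benchmark function.

        Returns a synthetic 'best fitness found' value whose mean improves
        slightly with larger populations, plus run-to-run noise."""
        return 1.0 / pop_size + random.gauss(0.0, 0.005)

    # Hypothetical factor levels and number of independent runs per level.
    levels = [25, 50, 100, 200]
    runs_per_level = 30

    # One group of fitness values per parameter level.
    groups = [[dummy_ea_run(p) for _ in range(runs_per_level)] for p in levels]

    # One-way ANOVA: does population size significantly affect the fitness reached?
    f_stat, p_value = f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Population size has a statistically significant effect.")
    else:
        print("No significant effect detected at the 0.05 level.")

A full study along the lines of the abstract would use a multi-factor ANOVA covering all parameters at once (number of generations, population size, operator application, selection type) on each benchmark function, rather than the single factor sketched here.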

  • ABSTRACT: This paper introduces a new method for the global unconstrained minimization of a differentiable objective function. The method is based on search trajectories, which are defined by a differential equation and exhibit certain similarities to the trajectories of steepest descent. The trajectories depend explicitly on the value of the objective function and aim at attaining a given target level, while rejecting all larger local minima. Convergence to the global minimum can be proven for a certain class of functions and an appropriate setting of two parameters.
    Journal of Optimization Theory and Applications 04/1981; 34(1):11-39.
  • Proceedings of the 5th International Conference on Genetic Algorithms; 06/1993
  • ABSTRACT: Interest in hybrid methods that combine artificial neural networks and evolutionary algorithms has grown in the last few years, due to their robustness and their ability to design networks by setting initial weight values and by searching for the architecture, the learning rule and its parameters. This paper presents an exhaustive analysis of the G-Prop method, and the different parameters the method requires (population size, selection rate, initial weight range, number of training epochs, etc.) are determined. The paper also discusses the influence of the application of genetic operators on the precision (classification ability or error) and network size in classification problems. The significance and relative importance of the parameters with respect to the results obtained, as well as suitable values for each, were obtained using ANOVA (analysis of variance). Experiments show the significance of the parameters concerning the neural network and learning in the hybrid methods. The parameters found using this method were used to compare the G-Prop method both to itself with other parameter settings and to other published methods.
    IEEE Transactions on Neural Networks 02/2002; 13(6):1374-94.
