Component weighting functions for adaptive search with EDAs.
ABSTRACT This paper introduces the component weighting approach as a general optimization heuristic that increases the likelihood of escaping from local optima by dynamically modifying the fitness function. The approach is tested on the optimization of the simplified hydrophobic-polar (HP) protein problem using estimation of distribution algorithms (EDAs). We show that the use of component weighting, together with statistical information extracted from the set of selected solutions, considerably improves the results of EDAs for the HP problem. The paper also elaborates on the use of probabilistic modeling for the definition of dynamic fitness functions and on the use of combinations of models.
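To illustrate the general idea of component weighting (a minimal sketch only; the function names, the toy fitness decomposition, and the decay-based update rule below are illustrative assumptions, not the paper's rule for the HP problem): the fitness is treated as a weighted sum of per-component scores, and when the search stalls, the components already satisfied by the incumbent solution are down-weighted, reshaping the landscape so that other basins of attraction become competitive.

```python
import random

def component_scores(x):
    # Toy decomposition: each adjacent pair of equal bits contributes one component.
    return [1.0 if x[i] == x[i + 1] else 0.0 for i in range(len(x) - 1)]

def weighted_fitness(x, weights):
    # Dynamic fitness: a weighted sum of the component scores.
    return sum(w * s for w, s in zip(weights, component_scores(x)))

def update_weights(weights, best_x, decay=0.9):
    # Down-weight components the current best already satisfies,
    # making unexplored components relatively more attractive.
    return [w * decay if s > 0 else w
            for w, s in zip(weights, component_scores(best_x))]

random.seed(0)
n = 8
weights = [1.0] * (n - 1)
best = [random.randint(0, 1) for _ in range(n)]
stall = 0
for gen in range(50):
    cand = [b ^ (random.random() < 0.1) for b in best]  # bit-flip mutation
    if weighted_fitness(cand, weights) > weighted_fitness(best, weights):
        best, stall = cand, 0
    else:
        stall += 1
    if stall >= 5:  # stagnation detected: reshape the fitness function
        weights = update_weights(weights, best)
        stall = 0

print(len(weights))
```

The key point is that the search operator is unchanged; only the objective is dynamically reweighted, which is what lets the same heuristic be combined with EDAs or other metaheuristics.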
ABSTRACT: Estimation of distribution algorithms (EDAs) are evolutionary methods that use probabilistic models instead of genetic operators to lead the search. Most current proposals for EDAs do not incorporate adaptive techniques: usually the class of probabilistic model employed, as well as the learning and sampling methods, are static. In this paper, we present a general framework for introducing adaptation in EDAs. This framework allows the class of probabilistic models to change during the evolution. We present a number of measures and techniques that can be used to evaluate the effect of the EDA components in order to design adaptive EDAs. As a case study we present an adaptive EDA that combines different classes of probabilistic models and sampling methods. The algorithm is evaluated on the satisfiability problem. 06/2008: pages 177-197.
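The learn-model/sample loop that distinguishes EDAs from operator-based genetic algorithms can be sketched with a minimal univariate EDA (UMDA-style) on the standard OneMax benchmark. This is a generic illustration, not the adaptive framework of the paper; the function name and parameter values are assumptions.

```python
import random

def umda_onemax(n=20, pop=100, sel=50, gens=30, seed=1):
    rng = random.Random(seed)
    p = [0.5] * n  # univariate Bernoulli model: one probability per bit
    popn = []
    for _ in range(gens):
        # Sample a population from the current probabilistic model.
        popn = [[1 if rng.random() < p[i] else 0 for i in range(n)]
                for _ in range(pop)]
        # Select the best individuals by fitness (number of ones).
        popn.sort(key=sum, reverse=True)
        selected = popn[:sel]
        # Re-estimate the model from the selected set; no crossover or
        # mutation operators are used anywhere in the loop.
        p = [sum(ind[i] for ind in selected) / sel for i in range(n)]
    return max(popn, key=sum)

best = umda_onemax()
print(sum(best))
```

An adaptive EDA in the sense of the abstract would additionally monitor the run and switch the model class or the sampling method between generations, rather than keeping this univariate model fixed.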
ABSTRACT: David Goldberg's Genetic Algorithms in Search, Optimization and Machine Learning is by far the bestselling introduction to genetic algorithms. Goldberg is one of the preeminent researchers in the field--he has published over 100 research articles on genetic algorithms and is a student of John Holland, the father of genetic algorithms--and his deep understanding of the material shines through. The book contains a complete listing of a simple genetic algorithm in Pascal, which C programmers can easily understand. The book covers all of the important topics in the field, including crossover, mutation, classifier systems, and fitness scaling, giving a novice with a computer science background enough information to implement a genetic algorithm and describe genetic algorithms to a friend. Addison-Wesley, Reading, Massachusetts. 01/1989.
Conference Proceeding: The correlation-triggered adaptive variance scaling IDEA.
ABSTRACT: It has previously been shown analytically and experimentally that continuous Estimation of Distribution Algorithms (EDAs) based on the normal pdf can easily suffer from premature convergence. This paper takes a principled first step towards solving this problem. First, prerequisites for the successful use of search distributions in EDAs are presented. Then, an adaptive variance scaling scheme is introduced that aims at reducing the risk of premature convergence. Integrating the scheme into the iterated density-estimation evolutionary algorithm (IDEA) yields the correlation-triggered adaptive variance scaling IDEA (CT-AVS-IDEA). The CT-AVS-IDEA is compared to the original IDEA and the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) on a wide range of unimodal test problems by means of a scalability analysis. It is found that the average number of fitness evaluations grows subquadratically with the dimensionality, competitively with the CMA-ES. In addition, CT-AVS-IDEA is indeed found to enlarge the class of problems that continuous EDAs can solve reliably. Genetic and Evolutionary Computation Conference, GECCO 2006, Proceedings, Seattle, Washington, USA, July 8-12, 2006; 01/2006.
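The variance-scaling idea can be sketched for a one-dimensional normal-model EDA (an illustrative assumption-laden sketch: the function name, parameter values, and the simple improvement-based scaling rule are mine; CT-AVS-IDEA's correlation trigger and full multivariate machinery are not reproduced). The maximum-likelihood variance of the selected set shrinks too quickly, so sampling uses the model variance multiplied by a factor that is enlarged while improvements are found and shrunk back once they stop.

```python
import random
import statistics

def avs_eda(f, pop=50, sel=25, gens=60, seed=2):
    rng = random.Random(seed)
    mu, sigma, c = 0.0, 5.0, 1.0   # model mean/std and the scaling factor
    best = float("-inf")
    for _ in range(gens):
        # Sample from the normal model with the scaled standard deviation.
        xs = [rng.gauss(mu, sigma * c) for _ in range(pop)]
        xs.sort(key=f, reverse=True)
        elite = xs[:sel]
        mu = statistics.fmean(elite)
        sigma = max(statistics.pstdev(elite), 1e-12)  # ML estimate
        gen_best = f(elite[0])
        # Enlarge the scaling factor while the best fitness improves;
        # shrink it otherwise, countering premature variance collapse.
        if gen_best > best:
            best, c = gen_best, min(c * 1.1, 10.0)
        else:
            c = max(c * 0.9, 1.0)
    return mu, best

mu, best = avs_eda(lambda x: -(x - 3.0) ** 2)
print(round(mu, 2))
```

On this unimodal objective the model mean converges near the optimum at 3.0; the point of the scaling factor is that without it, the ML variance estimate can collapse before the mean reaches the optimum.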