Conference Paper

Threshold selecting: Best possible probability distribution for crossover selection in genetic algorithms

DOI: 10.1145/1388969.1389044 Conference: Genetic and Evolutionary Computation Conference, GECCO 2008, Proceedings, Atlanta, GA, USA, July 12-16, 2008, Companion Material
Source: DBLP


The paper considers the problem of selecting individuals from the current population of a Genetic Algorithm for crossover, with the goal of finding a high-fitness solution to a given combinatorial optimization problem. Many schemes have been proposed in the literature as selection strategies, such as Windowing, Exponential reduction, Linear transformation or normalization, and Binary Tournament selection. It is shown that if one wishes to maximize any linear function of the final state probabilities, e.g. the fitness of the best individual in the final population of the algorithm, then the best probability distribution for selecting individuals in each generation is a rectangular distribution over the individuals sorted by their fitness values. That is, uniform probabilities are assigned to a group of the best individuals of the population, while individuals whose fitness rank falls beyond a fixed cutoff in the sorted fitness vector receive probability zero. The considered strategy is called Threshold Selecting. The proof applies basic arguments from Markov chains and linear optimization and makes only a few assumptions about the underlying principles, and hence applies to a large class of Genetic Algorithms.
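The rectangular distribution described above can be sketched in a few lines: rank the population by fitness, then draw uniformly from the best individuals up to the cutoff rank. This is an illustrative sketch, not the paper's implementation; the `cutoff` value is a tuning parameter here, not one prescribed by the paper.

```python
import random

def threshold_select(population, fitnesses, cutoff, maximize=True):
    """Threshold Selecting: sort individuals by fitness rank and draw
    uniformly from the best `cutoff` of them; every worse-ranked
    individual gets selection probability zero."""
    ranked = sorted(range(len(population)),
                    key=lambda i: fitnesses[i],
                    reverse=maximize)
    winner = random.choice(ranked[:cutoff])
    return population[winner]

pop = ["a", "b", "c", "d", "e"]
fit = [3.0, 9.0, 5.0, 1.0, 7.0]
# With cutoff=2 only the two fittest individuals ("b" and "e") can
# ever be chosen as a crossover parent; all others have probability 0.
parent = threshold_select(pop, fit, cutoff=2)
```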



Available from: Joerg Laessig, Dec 26, 2013
  • Source
    • "A worst case scenario would be a far above average class with a few elements. Improved schemes like Rank Selection[12] "
    ABSTRACT: Genetic algorithms are a widely studied area of research. This paper proposes an exhaustive analysis of recombination and mutation schemes for genetic algorithms in a generic way. Beyond the often one-sided analysis of advantages and disadvantages of algorithms, this paper tries to highlight all relevant influences on the choice of a suitable algorithm. The intention is an independent inspection of influence characteristics. Entropy tests as well as convergence tests are performed, and several other kinds of influence, such as population pool size, are taken into account. To the best of my knowledge, no research papers deal with genetic algorithms at this abstract level. The main intention of this work is to provide theoretical background for further genetic design. A wide variety of analyses are performed to build an adequate basis for comparison.
    Preview · Article · Jan 2010
  • Source
    ABSTRACT: Regression test suite prioritization techniques reorder test cases so that, on average, more faults will be revealed earlier in the test suite's execution than would otherwise be possible. This paper presents a genetic algorithm-based test prioritization method that employs a wide variety of mutation, crossover, selection, and transformation operators to reorder a test suite. Leveraging statistical analysis techniques, such as tree model construction through binary recursive partitioning and kernel density estimation, the paper's empirical results highlight the unique role that the selection operators play in identifying an effective ordering of a test suite. The study also reveals that, while truncation selection consistently outperformed the tournament and roulette operators in terms of test suite effectiveness, increasing selection pressure consistently produces the best results within each class of operator. After further explicating the relationship between selection intensity, termination condition, fitness landscape, and the quality of the resulting test suite, this paper demonstrates that the genetic algorithm-based prioritizer is superior to random search and hill climbing and thus suitable for many regression testing environments.
    Preview · Conference Paper · Jan 2010
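    The truncation and tournament operators compared in the study above can be contrasted in a short sketch. This is a generic illustration of the two operator classes, not the paper's prioritizer; all names and parameters are illustrative.

    ```python
    import random

    def truncation_select(scores, k):
        """Truncation selection: draw uniformly from the k best-scoring
        individuals; everyone else is excluded entirely."""
        best = sorted(range(len(scores)), key=scores.__getitem__,
                      reverse=True)[:k]
        return random.choice(best)

    def tournament_select(scores, size):
        """Tournament selection: sample `size` random contestants and keep
        the fittest; a larger `size` means higher selection pressure."""
        contestants = random.sample(range(len(scores)), size)
        return max(contestants, key=scores.__getitem__)

    scores = [0.2, 0.9, 0.5, 0.7, 0.1]
    survivor = truncation_select(scores, k=2)   # only indices 1 or 3 possible
    winner = tournament_select(scores, size=2)  # any index can win its tournament
    ```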
  • Source
    ABSTRACT: Evolutionary algorithms are widespread heuristic methods inspired by natural evolution to solve difficult problems for which analytical approaches are not suitable. In many domains experimenters are not only interested in discovering optimal solutions, but also in finding the largest number of different solutions satisfying minimal requirements. However, the formulation of an effective performance measure describing these requirements, also known as fitness function, represents a major challenge. The difficulty of combining and weighting multiple problem objectives and constraints of possibly varying nature and scale into a single fitness function often leads to unsatisfactory solutions. Furthermore, selective reproduction of the fittest solutions, which is inspired by competition-based selection in nature, leads to loss of diversity within the evolving population and premature convergence of the algorithm, hindering the discovery of many different solutions. Here we present an alternative abstraction of artificial evolution, which does not require the formulation of a composite fitness function. Inspired by viability theory in dynamical systems, natural evolution and ethology, the proposed method puts emphasis on the elimination of individuals that do not meet a set of changing criteria, which are defined on the problem objectives and constraints. Experimental results show that the proposed method maintains higher diversity in the evolving population and generates more unique solutions when compared to classical competition-based evolutionary algorithms. Our findings suggest that incorporating viability principles into evolutionary algorithms can significantly improve the applicability and effectiveness of evolutionary methods to numerous complex problems of science and engineering, ranging from protein structure prediction to aircraft wing design.
    Full-text · Article · Jan 2014 · PLoS ONE
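    The elimination principle described in the entry above, discarding individuals that violate any current criterion instead of ranking by a composite fitness, can be sketched generically. This is an illustrative sketch only; the function and parameter names are hypothetical, not the authors' API.

    ```python
    def viability_step(population, objectives, thresholds):
        """One elimination step of a viability-style evolutionary method:
        keep only the individuals whose objective values meet every
        current criterion. `objectives` maps an individual to a tuple of
        objective values; `thresholds` gives the minimal acceptable value
        per objective and can be tightened from generation to generation."""
        return [ind for ind in population
                if all(v >= t for v, t in zip(objectives(ind), thresholds))]

    # Toy usage: individuals are numbers, the single objective is identity.
    survivors = viability_step([1, 5, 9], lambda x: (x,), thresholds=(4,))
    ```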