A New Interval-Genetic Algorithm
ABSTRACT Interval optimization algorithms usually use local search methods, based on point evaluations, to obtain a good upper bound on the global optimal value. This paper presents a new interval-genetic algorithm that combines interval arithmetic with a genetic algorithm. At each iteration, the proposed algorithm uses the improved upper bound on the global optimal value obtained by the genetic algorithm to delete from the work set those intervals that cannot contain the global optimal solution. Thanks to interval arithmetic, the new algorithm not only retains the advantages of traditional interval optimization algorithms, namely simplicity and little required knowledge about the problem, but also produces reliable domains within which the genetic algorithm searches. Moreover, with the search direction provided by the genetic algorithm, the chance of profitably dividing the reliable interval is increased. Convergence is proved, and numerical experiments show that the proposed algorithm is more efficient than traditional interval optimization algorithms.
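The pruning idea the abstract describes can be illustrated with a minimal sketch (not the authors' code). Here a simple random point search stands in for the genetic algorithm: it supplies an upper bound on the global minimum, and any subinterval whose interval-arithmetic lower bound exceeds that upper bound is deleted from the work set before bisection. The function names (`f_interval`, `minimize`) and the toy objective are our own assumptions.

```python
import random

def f(x):
    # Toy objective: global minimum 0 at x = 1.
    return (x - 1.0) ** 2

def f_interval(lo, hi):
    """Interval extension of f: bounds of (x-1)^2 for x in [lo, hi]."""
    a, b = lo - 1.0, hi - 1.0
    low = 0.0 if a <= 0.0 <= b else min(a * a, b * b)
    return low, max(a * a, b * b)

def minimize(lo, hi, tol=1e-4, samples=20):
    rng = random.Random(0)
    # Point search (GA stand-in) gives an initial upper bound on the minimum.
    upper = min(f(lo + rng.random() * (hi - lo)) for _ in range(samples))
    work = [(lo, hi)]
    while work:
        a, b = work.pop()
        flo, _ = f_interval(a, b)
        if flo > upper:
            continue  # interval cannot contain the global minimum: discard
        if b - a < tol:
            upper = min(upper, f((a + b) / 2.0))
            continue
        # Refine the upper bound by point sampling inside this reliable
        # interval, then bisect and push both halves onto the work set.
        upper = min(upper, min(f(a + rng.random() * (b - a))
                               for _ in range(samples)))
        m = (a + b) / 2.0
        work.extend([(a, m), (m, b)])
    return upper
```

Calling `minimize(-10.0, 10.0)` returns an upper bound very close to the true minimum 0; the interval containing the minimizer always has interval lower bound 0, so it is never pruned.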
Available from: Mario Giovanni C.A. Cimino
ABSTRACT: Granular data and granular models offer an interesting tool for representing data in problems where uncertainty, inaccuracy, variability and subjectivity have to be taken into account. In this paper, we deal with a particular type of information granules, namely interval-valued data. We propose a multilayer perceptron (MLP) to model interval-valued input–output mappings. The proposed MLP comes with interval-valued weights and biases, and is trained using a genetic algorithm designed to fit data with different levels of granularity. In the evolutionary optimization, two implementations of the objective function, based on a numeric-valued and an interval-valued network error, respectively, are discussed and compared. The modeling capabilities of the proposed MLP are illustrated by means of its application to both synthetic and real world datasets. Information Sciences 02/2014; 257:313–330.
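A minimal sketch of the kind of model this abstract describes, under our own simplifying assumptions: a single interval-valued neuron (rather than a full MLP), a numeric-valued network error summing endpoint deviations, and a simple elitist evolutionary loop standing in for the genetic algorithm. All names (`imul`, `iadd`, `forward`, `evolve`) and parameter choices are ours, not the authors'.

```python
import random

def imul(a, b):
    """Interval product [a]*[b] via the four endpoint products."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def iadd(a, b):
    """Interval sum [a]+[b]."""
    return (a[0] + b[0], a[1] + b[1])

def forward(weights, bias, x):
    """Interval-valued neuron: sum_i w_i * x_i + b (identity activation)."""
    acc = bias
    for w, xi in zip(weights, x):
        acc = iadd(acc, imul(w, xi))
    return acc

def error(params, data):
    """Numeric-valued error: summed deviation of interval endpoints."""
    weights, bias = params
    e = 0.0
    for x, t in data:
        y = forward(weights, bias, x)
        e += abs(y[0] - t[0]) + abs(y[1] - t[1])
    return e

def evolve(data, n_inputs, pop=30, gens=200, seed=0):
    """Elitist evolutionary loop over interval-valued weights and biases."""
    rng = random.Random(seed)
    def rand_iv():
        a, b = sorted(rng.uniform(-2.0, 2.0) for _ in range(2))
        return (a, b)
    def rand_ind():
        return ([rand_iv() for _ in range(n_inputs)], rand_iv())
    def perturb(iv):
        # Gaussian mutation; sorting keeps the lower endpoint first.
        return tuple(sorted((iv[0] + rng.gauss(0, 0.1),
                             iv[1] + rng.gauss(0, 0.1))))
    def mutate(ind):
        ws, b = ind
        return ([perturb(w) for w in ws], perturb(b))
    population = [rand_ind() for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda ind: error(ind, data))
        parents = population[: pop // 2]
        population = parents + [mutate(rng.choice(parents))
                                for _ in range(pop - len(parents))]
    return min(population, key=lambda ind: error(ind, data))
```

For a toy interval mapping such as y = 2x (targets `(2*lo, 2*hi)`), the exact solution is the degenerate weight interval (2, 2) with zero bias, and the loop drives the error close to it.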