Science topic

# Metaheuristic - Science topic

Explore the latest questions and answers in Metaheuristic, and find Metaheuristic experts.

Questions related to Metaheuristic

I am not able to find a suitable answer in any research paper as to why exactly we use metaheuristic algorithms together with MCDM techniques to solve decision-making problems. Are MCDM techniques alone enough, or is a metaheuristic by itself sometimes enough to solve the same problem?

Please suggest some papers or a survey paper with non-linear parameter tuning for metaheuristics.

Hi there,

I am trying to identify three gaps in mobile robot path planning using metaheuristic methods in an unknown environment. A lot of work has been done in this area, but I am wondering if anyone has come across some interesting papers that point out gaps, as this would help me write my proposal.

with regards!

I am interested in the use of Extreme Value Theory (EVT) to estimate global optima of optimization problems (using heuristic and metaheuristic algorithms), however, it is a bit difficult to find them since the use of EVT is not usually the main objective of the studies. Could you help me by sharing articles where this procedure is used? Thank you in advance.

I have simulated an industrial process in ASPEN and I want to optimize the operational parameters (decisions) to maximize the final yield (first objective) while minimizing the energy consumption (second objective).

Therefore, I would like to build an ML model to decide the best operational parameters.

My plan is to use metaheuristic models such as GA (Genetic Algorithm), but I have difficulty knowing what the steps are to build the model. How can this optimization algorithm be implemented as a supervised ML model?

I want to solve multi-objective problems using the lexicographic method.

Can we use a different algorithm for each step? For instance, when we have two objectives to be minimized, can we first use the Genetic Algorithm to optimize the first objective, then use Particle Swarm Optimization to solve the second objective while taking the result of the Genetic Algorithm as a constraint?
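For what it's worth, the two-stage scheme described in the question can be sketched in a few lines. This is a hypothetical illustration: a simple stochastic hill climber stands in for both the GA (stage 1) and PSO (stage 2), and the stage-1 optimum is turned into a constraint on f1 via a penalty term; the objectives and all parameters are made up for the example.

```python
import random

def hill_climb(f, x0, bounds, iters=2000, step=0.1, seed=0):
    """Simple stochastic hill climber; stands in for GA/PSO in this sketch."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        y = [min(max(xi + rng.gauss(0, step), lo), hi)
             for xi, (lo, hi) in zip(x, bounds)]
        fy = f(y)
        if fy <= fx:            # accept only non-worsening moves
            x, fx = y, fy
    return x, fx

# Two toy objectives on the same decision space.
f1 = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
f2 = lambda x: (x[0] + 1) ** 2 + (x[1] - 2) ** 2
bounds = [(-5, 5), (-5, 5)]

# Stage 1: optimize the primary objective (the GA's role).
x1, f1_star = hill_climb(f1, [0.0, 0.0], bounds)

# Stage 2: optimize the secondary objective (the PSO's role), with the
# stage-1 result as a constraint f1(x) <= f1* + tol, handled by a penalty.
tol = 1e-2
penalized = lambda x: f2(x) + 1e6 * max(0.0, f1(x) - (f1_star + tol))
x2, _ = hill_climb(penalized, x1, bounds)
```

Any two metaheuristics can be slotted into the two stages; the only coupling between them is the recorded optimum f1* and the tolerance tol.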

Hello dear professors,

I need to combine metaheuristics with deep learning for hyperparameter tuning.

I have experience in metaheuristics in tackling several problems.

But how can I use it in deep learning?

I am open to collaborate to produce high quality research papers in this area.

Dear Researchers,

I'm looking for the implementation of advanced metaheuristic algorithms in Python. I would be incredibly thankful if someone could assist me with the execution of evolutionary algorithms or can provide me with the necessary codes in Python.

Thank you very much.
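In the meantime, a minimal real-coded GA is easy to write from scratch. The sketch below is a teaching-level implementation (tournament selection, blend crossover, Gaussian mutation) with made-up parameter values, not a tuned library; for serious work, Python libraries such as DEAP or pymoo provide vetted implementations of evolutionary algorithms.

```python
import random

def genetic_algorithm(f, bounds, pop_size=40, gens=100,
                      crossover_rate=0.9, mutation_rate=0.1, seed=1):
    """Minimal real-coded GA minimizing f: tournament selection,
    blend crossover, Gaussian mutation, elitist best-so-far tracking."""
    rng = random.Random(seed)
    clip = lambda v, lo, hi: min(max(v, lo), hi)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(gens):
        new_pop = []
        while len(new_pop) < pop_size:
            # Tournament selection of two parents (tournament size 3).
            p1 = min(rng.sample(pop, 3), key=f)
            p2 = min(rng.sample(pop, 3), key=f)
            child = list(p1)
            if rng.random() < crossover_rate:      # blend crossover
                a = rng.random()
                child = [a * u + (1 - a) * v for u, v in zip(p1, p2)]
            for i, (lo, hi) in enumerate(bounds):  # Gaussian mutation
                if rng.random() < mutation_rate:
                    child[i] = clip(child[i] + rng.gauss(0, 0.05 * (hi - lo)),
                                    lo, hi)
            new_pop.append(child)
        pop = new_pop
        best = min(pop + [best], key=f)            # keep the best ever seen
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)
best, val = genetic_algorithm(sphere, [(-5, 5)] * 3)
```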

Especially hybridizing NSGA-II with other local search methods.

Having come to realize the limitations that metaheuristics have by dint of the NFL theorem, I came across this interesting field of hyper-heuristics (heuristics searching for heuristics) and read a couple of papers on the topic. I was wondering whether any of you can give me a list of recommended books for further learning. Online video courses will also be greatly helpful. Thanks in advance.

What is the fastest and most accurate way to solve an MINLP model with nonlinear constraints, including trigonometric and quadratic terms? Classic solvers, metaheuristics, or something else?

Any tips are appreciated.

Hi everyone, I'm working on SVM-WOA; I need to optimize the weights of the SVM. Thanks.
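If it helps, here is a compact sketch of the WOA core loop (encircling prey, spiral bubble-net attack, and random search), simplified with the spiral constant b = 1. The objective `cv_error` is a hypothetical placeholder quadratic; in an SVM-WOA setup it would be replaced by a cross-validation error computed over the SVM parameters (e.g. C and gamma).

```python
import math
import random

def whale_optimization(f, bounds, n_whales=20, iters=300, seed=2):
    """Whale Optimization Algorithm sketch: a teaching version,
    not Mirjalili's reference implementation."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda v, lo, hi: min(max(v, lo), hi)
    whales = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(n_whales)]
    best = min(whales, key=f)
    for t in range(iters):
        a = 2.0 * (1 - t / iters)            # linearly decreases from 2 to 0
        for i, x in enumerate(whales):
            p, r1, r2 = rng.random(), rng.random(), rng.random()
            A, C = 2 * a * r1 - a, 2 * r2
            if p < 0.5:
                # Encircle the best (|A| < 1) or explore a random whale.
                ref = best if abs(A) < 1 else rng.choice(whales)
                new = [ref[j] - A * abs(C * ref[j] - x[j]) for j in range(dim)]
            else:
                # Spiral bubble-net attack around the best (b = 1).
                l = rng.uniform(-1, 1)
                new = [abs(best[j] - x[j]) * math.exp(l) * math.cos(2 * math.pi * l)
                       + best[j] for j in range(dim)]
            whales[i] = [clip(v, lo, hi) for v, (lo, hi) in zip(new, bounds)]
        best = min(whales + [best], key=f)
    return best, f(best)

# Placeholder objective: in SVM tuning this would be a CV error over (C, gamma).
cv_error = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 0.5) ** 2
best, err = whale_optimization(cv_error, [(0.0, 10.0), (0.0, 10.0)])
```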

I am solving an NP-hard problem (say, the TSP), and later on I would like to tackle much larger-scale, real-life problems with more constraints. The ultimate problem is a combinatorial optimization NP-hard problem.

Now, considering the metaheuristic side, there are many algorithms: GA, PSO, etc.

How do I know which algorithm is best, since almost every algorithm is available in a discrete form? It would also be good to have some pointers on how to gauge, based on the problem definition and the algorithm's solution methodology, whether it is suitable for our problem.

Thanks

Can you advise me which journal is more suitable for submitting the article entitled "Novel metaheuristic optimization algorithm based on astronomy" which has about 40 pages?

I have a multi-objective optimization with the following properties:

Objective functions: three minimization objectives, two non-linear and one linear

Decision variables: two real variables (bounded)

Constraints: three linear constraints (two bounding constraints and one relationship constraint)

Problem type: non-convex

Solution required: global optimum

I have used two metaheuristic algorithms to solve the problem: NSGA-II and NSGA-III.

I have performed NSGA-II and NSGA-III for the following instances (population size, number of generations, maximum number of function evaluations, i.e. population size × number of generations): (100, 10, 1000), (100, 50, 5000), (100, 100, 10000), (500, 10, 5000), (500, 50, 25000), and (500, 100, 50000).

My observations:

The hypervolume increases with the number of function evaluations. However, for a given population size, as the number of generations increases the hypervolume decreases, whereas I would expect it to increase. Why am I getting such a result?

I am working on a deep learning-based analysis to detect Melanoma Cancer. I want to use all types of algorithms to get the best possible solution.

Hi everyone,

We have implemented four metaheuristic algorithms to solve an optimization problem. Each algorithm is repeated 30 times for an instance of the problem, and we have stored the best objective function values for 30 independent runs for each algorithm.

We want to compare these four algorithms. Apart from maximum, minimum, average, and standard deviation, is there any statistical measure for comparison?

Alternatively, we have four independent samples each of size 30, and we want to test the null hypothesis that the means (or, medians) of these four samples are equal against an alternative hypothesis that they are not. What kind of statistical test should we perform?

Regards,

Soumen Atta
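Since normality is doubtful for best-of-run values, a common choice for the question above is the Kruskal-Wallis test (a rank-based one-way ANOVA on medians) for k independent samples, followed by pairwise post-hoc tests (e.g. Wilcoxon rank-sum with a multiplicity correction) if the null is rejected; if the 30 runs were instead paired across algorithms (same instances/seeds), the Friedman test would be the usual choice. A self-contained sketch of the H statistic (scipy.stats.kruskal gives the same value plus a p-value):

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic with average ranks and tie correction.
    Compare H to a chi-squared distribution with k-1 degrees of freedom."""
    data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(data)
    ranks = [0.0] * n
    ties = []
    i = 0
    while i < n:
        j = i
        while j < n and data[j][0] == data[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2.0     # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg_rank
        ties.append(j - i)
        i = j
    rank_sum = [0.0] * len(groups)
    for (v, gi), r in zip(data, ranks):
        rank_sum[gi] += r
    h = (12.0 / (n * (n + 1))
         * sum(rs * rs / len(g) for rs, g in zip(rank_sum, groups))
         - 3.0 * (n + 1))
    correction = 1.0 - sum(t ** 3 - t for t in ties) / float(n ** 3 - n)
    return h / correction

# Tiny example: three algorithms' best objective values over a few runs.
h = kruskal_wallis_h([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
```

In practice the input would be the three or four lists of 30 stored best objective values, one list per algorithm.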

Dear friends,

We invite all researchers and practitioners who are developing algorithms, systems, and applications, to share their results, ideas, and experiences.

Topics of interest include, but are not limited to, the following:

Hybrid Metaheuristics

Theoretical aspects of hybridization

Automated parameter tuning

Parallelization

Evolutionary Computation Algorithms

Swarm Optimization

Multi-objective optimization

Multilevel segmentation

Object recognition

Computer vision

Image processing

Filtering and enhancement

Morphology

Edge detection and segmentation

Feature extraction

Quantum Image Processing

Image thresholding

Applications

more information: http://diegoliva.com/cfp/cbc_ahmip/

I want to compare metaheuristics on the optimization of Lennard-Jones clusters. There are many papers available that optimize Lennard-Jones clusters. Unfortunately, none of them provide the upper and lower bounds of the search space. In order to conduct a fair comparison, all metaheuristics should search within the same bounds of the search space. I found the global minima here: http://doye.chem.ox.ac.uk/jon/structures/LJ/tables.150.html but the search space is not defined.

Can anyone please tell me what are the recommended upper and lower bounds of the search space?

Suppose we compare a heuristic approach like local search, a metaheuristic approach like genetic algorithms, or a more advanced technique such as Graph Neural Networks. What is the best approach to adopt for a large-scale graph, for both mesh and radial kinds of networks?

Suppose that when we compare two metaheuristics X and Y on a given real problem, X returns a better solution than Y, while when we use the same metaheuristics to solve global optimization problems, Y returns a better solution than X. Does this make sense? What is the reason?

Is there any real advantage of using reinforcement learning instead of factorial experimental design, like Taguchi or D-optimal design, for parameterization of metaheuristics?

Hybrid composite objective functions are common tools to verify the performance of metaheuristics. Is there anyone who can help with the Matlab code related to those functions, mainly the functions used in well-known and recent articles such as Grey Wolf Optimizer, Ant Lion Optimizer, Harris Hawk Algorithm, Arithmetic Optimization Algorithm, etc. ?

I know about the CEC 2005, but it seems that the above-mentioned papers used modified versions of CEC 2005. Therefore, for the sake of comparison, I highly appreciate any help provided.

Thanks and regards

Hello scientific community

Have you noticed the following?

I note that when a new algorithm is proposed, most researchers rush to improve it and apply it to the same and other problems. So I ask: why keep the original algorithm if it suffers from weaknesses, and why do we need a new algorithm if an existing one already solves the same problems? If the new algorithm solves a previously unsolved problem, it is welcome; otherwise, why?

Therefore, I ask: does the scientific community need novel metaheuristic algorithms (MHs) rather than the existing ones?

I think we need to organize the existing metaheuristic algorithms and document the pros and cons of each one, along with the problems each one has solved.

Repeated algorithms should disappear, and so should overly complex ones.

Derivative algorithms should also disappear.

We need to benchmark the MHs, similar to a benchmark test suite.

Also, we need to determine the unsolved problems, and if you would like to propose a novel algorithm, please try to solve an unsolved problem; otherwise, please stop.

Thanks, and I look forward to a reputable discussion.

I am looking to develop an overview/survey of specific experimental techniques and papers in which exploration is defined, measured, and analyzed as part of heuristic search (preferably for continuous domains).

Suggestions and references very much appreciated.

I have a problem where the objective function takes a considerable amount of time to evaluate. The problem has a continuous search space; however, most of the variables don't need the level of accuracy provided by this continuous search space.

I was thinking of discretizing the search space so that the optimizer would pick from a specific set of values for each parameter. This would give a near-optimal solution that I am satisfied with, if the optimization time is reduced.

I want your opinion or personal experience with similar problems.

Thanks in advance.
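One practical bonus of the discretization idea above: once variables snap to a grid, expensive evaluations become cacheable, so repeated visits to the same grid point cost nothing. A small hypothetical sketch (the grids and objective are made up):

```python
from functools import lru_cache

def snap(value, grid):
    """Round a continuous value to the nearest allowed grid value."""
    return min(grid, key=lambda g: abs(g - value))

def make_discretized(expensive_f, grids):
    """Wrap an expensive objective so the optimizer can keep proposing
    continuous points while the expensive function only ever sees grid
    points, whose results are cached."""
    calls = {"n": 0}

    @lru_cache(maxsize=None)
    def eval_on_grid(point):
        calls["n"] += 1              # count true expensive evaluations
        return expensive_f(point)

    def wrapped(x):
        return eval_on_grid(tuple(snap(v, g) for v, g in zip(x, grids)))

    return wrapped, calls

# Hypothetical example: two parameters, each restricted to a coarse grid.
grids = [(0.0, 0.5, 1.0, 1.5, 2.0), (10, 20, 30)]
f, calls = make_discretized(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 20) ** 2,
                            grids)

f((0.9, 19.0))   # snaps to (1.0, 20): one expensive call
f((1.1, 21.0))   # snaps to (1.0, 20) again: served from the cache
```

Any continuous metaheuristic can then be pointed at `f` unchanged; only the distinct grid points it visits are ever evaluated expensively.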

Various metaheuristic optimization algorithms with different inspiration sources have been proposed in recent decades. Unlike mathematical methods, metaheuristics do not require any gradient information and do not depend on the starting point. Furthermore, they are suitable for complex, nonlinear, and non-convex search spaces, especially when near-global optimum solutions are sought with limited computational effort. However, some of these metaheuristics can become trapped in a local optimum and are unable to escape. For this reason, numerous researchers focus on adding efficient mechanisms for enhancing the performance of the standard versions of the metaheuristics. Some of these are addressed in the following references:

I will be grateful If anyone can help me to find other efficient mechanisms.

Thanks in advance.

Large amounts of research focus on the balance between Exploration and Exploitation. However, this omits the power of Selection which can turn any metaheuristic into a hill climber by rejecting all exploratory search solutions.

To engage the broader research community in this important discussion, we are organizing a Workshop on Selection to be held at CEC2021 in Krakow, Poland. Please consider participating in our event or on our project page.

Wondering if there is any way to bypass the ordinary material testing procedure and use a formulated physical theory to estimate these properties by knowing (a) the kind of atoms, (b) type of crystalline structure (BCC, FCC, etc), (c) and the atomic bonds between them. I ask because I'm currently contemplating the idea of creating new materials using Artificial Neural Networks and Metaheuristics. Hope the idea is feasible. Thanks.

In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.

I have some optimal solutions in a discrete space and I want to apply a heuristic search using those solutions as attractors. I started using distances as cost functions, but I don't know if this is a good approach.

Mathematical programming is the best optimization tool, with many years of strong theoretical background. It has been demonstrated that it can solve complex optimization problems on the scale of one million design variables efficiently, and the methods are very reliable. Besides, there is mathematical proof of the existence of the solution and the globality of the optimum.

However, in some cases in which there are discontinuities in the objective function, there would be problems due to non-differentiability. Some methods, such as subgradients, have been proposed to solve such problems. However, I cannot find many papers in the state of the art of engineering optimization that apply mathematical programming to discontinuous optimization. Engineers mostly use metaheuristics for such cases.

Can all problems with discontinuities be solved with mathematical programming? Is it easy to implement sub-gradients for large scale industrial problems? Do they work in non-convex problems?

A simple example of such a function is attached here.

We all know that mathematical programming is the best optimization tool, with many years of strong theoretical background, and that it presents reliable solutions with high efficiency. Also, a mathematical proof of global optimality may be available. However, in cases with a lack of knowledge, in which the analytical calculation of sensitivities is impossible, researchers prefer to use metaheuristics. However, these are inefficient and unreliable in large-scale problems.

The development of surrogate models such as the Kriging method, model-based methods such as Radial basis function interpolation, and novel machine learning tools helps us to approximate the objective function. So, model-based sensitivity can be used instead. Also, machine learning can help to predict sensitivity information!

So, the improvement of function or sensitivity approximation, coupled with mathematical optimization, will make metaheuristics disappear. In this way, I guess there would be no need for metaheuristics (at least in continuous optimization, as far as I know).

What do you think about it? Do you agree? Do you have any experience? I am interested in both mathematical programming and metaheuristics, but I prefer efficiency.

Hi,

Ok, so I'm new to computer science and metaheuristics, and I need to implement multi-objective optimization for an environmental economics problem with real-world data.

The problem goes as this :

I have a large set of possible locations for an infrastructure within a city. Each location provides two benefits A and B, as well as costs (more benefits may be added in the future, depending on further research). Data points are unrelated, cost and benefit functions are not linear and not continuous.

I need a method for selecting a handful of locations that maximizes simultaneously A and B, under the constraint that the total costs < a budget parameter.

I went towards a genetic algorithm (GA) for this problem, as it is highly combinatorial, but I am facing the fact that most "traditional" GAs I've looked at have fixed chromosome lengths, and ultimately only give out final individuals of n items. In my case, I am quite flexible on the number of best locations, as long as the method either minimizes total costs or handles them as a constraint. As a matter of fact, it would be quite interesting to have as a final result a Pareto front of individuals of different sizes (for example: in my problem, locations in the city center have more "value" than locations in periurban areas, so a few central locations could be as Pareto-optimal as more numerous periurban locations). So I see the problem as a knapsack problem, where costs would be the "weight"; however, there can't be any repetition of items here (the same location cannot be used twice).

Is there a way to handle the cost constraint so as to make a knapsack genetic algorithm that can provide a Pareto front of individuals of heterogeneous length? I have been trying with the DEAP library but can't really find details in its documentation.

Many thanks

Georges Farina
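A bit-string encoding sidesteps the fixed-length issue raised above: with one bit per candidate location, the number of selected locations is free, repetition is impossible, and the budget can be enforced by a repair that drops random items. The sketch below is a deliberately bare-bones illustration with made-up data; selection pressure is omitted for brevity (a real implementation would use NSGA-II-style non-dominated sorting, e.g. DEAP's `tools.selNSGA2`), and non-dominated solutions are simply collected into an archive.

```python
import random

def dominates(p, q):
    """p dominates q: at least as good in both benefits, better in one."""
    return p[0] >= q[0] and p[1] >= q[1] and (p[0] > q[0] or p[1] > q[1])

def knapsack_ga_front(benefit_a, benefit_b, costs, budget,
                      pop_size=60, gens=100, seed=7):
    """Bit-string GA sketch returning an archive of non-dominated
    ((A, B), selection) pairs, all within budget."""
    rng = random.Random(seed)
    n = len(costs)

    def repair(bits):                       # drop random items until feasible
        chosen = [i for i in range(n) if bits[i]]
        while sum(costs[i] for i in chosen) > budget:
            bits[chosen.pop(rng.randrange(len(chosen)))] = 0
        return bits

    def evaluate(bits):
        return (sum(benefit_a[i] for i in range(n) if bits[i]),
                sum(benefit_b[i] for i in range(n) if bits[i]))

    archive = []

    def add_to_archive(bits):
        obj = evaluate(bits)
        for a in archive:                   # skip if dominated or duplicate
            ao = evaluate(a)
            if ao == obj or dominates(ao, obj):
                return
        archive[:] = [a for a in archive if not dominates(obj, evaluate(a))]
        archive.append(list(bits))

    pop = [repair([rng.randint(0, 1) for _ in range(n)])
           for _ in range(pop_size)]
    for bits in pop:
        add_to_archive(bits)
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            p1, p2 = rng.sample(pop, 2)
            cut = rng.randrange(1, n)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            i = rng.randrange(n)            # single bit-flip mutation
            child[i] = 1 - child[i]
            child = repair(child)
            add_to_archive(child)
            new_pop.append(child)
        pop = new_pop
    return [(evaluate(b), b) for b in archive]

# Hypothetical toy instance: five locations, budget of 6.
costs = [3, 2, 4, 1, 5]
benefit_a = [4, 2, 5, 1, 6]
benefit_b = [1, 3, 2, 4, 1]
front = knapsack_ga_front(benefit_a, benefit_b, costs, budget=6)
```

The returned front naturally mixes selections of different sizes, which matches the "few central locations vs. many periurban locations" trade-off described in the question.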

I need MATLAB, C#, or R code for the above-mentioned models. How can I find those?

The choice of something to ruin can be an implicit choice as to what should be preserved. A heuristic for preservation can thus lead to a heuristic for ruin. I've had what I think is a very interesting result for what to preserve (common solution components) in the context of genetic crossover operators that use constructive (as opposed to iterative) heuristics. I tried to share it with the Ruin and Recreate community with no success.

I guess my real question is -- How should I Ruin and Recreate this research to make it more relevant to Ruin and Recreate researchers?

Conference Paper The GENIE is out! (Who needs fitness to evolve?)

Comparing swarm-based with evolution-based algorithms, the former have some advantages over the latter. Since evolutionary approaches have relatively more operators than swarm-based ones, they are more difficult to apply. Unlike evolution-based approaches, which immediately discard any information related to the old iteration once a new population is generated, swarm-based algorithms normally keep these valuable pieces of information over the subsequent iterations.

**Please come in and lower your bucket in this discussion thread.**

Bat-inspired algorithm is a metaheuristic optimization algorithm developed by Xin-She Yang in 2010. This bat algorithm is based on the echolocation behaviour of microbats with varying pulse rates of emission and loudness.

The idealization of the echolocation of microbats can be summarized as follows: each virtual bat flies randomly with a velocity v_i at position (solution) x_i with a varying frequency or wavelength and loudness A_i. As it searches and finds its prey, it changes frequency, loudness, and pulse emission rate r. Search is intensified by a local random walk. Selection of the best continues until certain stop criteria are met. This essentially uses a frequency-tuning technique to control the dynamic behaviour of a swarm of bats, and the balance between exploration and exploitation can be controlled by tuning algorithm-dependent parameters in the bat algorithm. (Wikipedia)

What are the applications of the bat algorithm? Any good optimization papers using the bat algorithm? Your views are welcome! - Sundar
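The update rules summarized above can be sketched in a few lines. This is a simplified teaching version with fixed loudness A and pulse rate r (the full method lets A_i decay and r_i grow over iterations), shown on a toy sphere function with made-up parameter values:

```python
import random

def bat_algorithm(f, bounds, n_bats=20, iters=200,
                  f_min=0.0, f_max=2.0, loudness=0.5, pulse_rate=0.5, seed=3):
    """Compact bat algorithm sketch: frequency-tuned velocities, a local
    random walk around the best bat, and loudness-gated acceptance."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda v, lo, hi: min(max(v, lo), hi)
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    best = min(x, key=f)
    for _ in range(iters):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()  # frequency tuning
            v[i] = [vj + (xj - bj) * freq
                    for vj, xj, bj in zip(v[i], x[i], best)]
            cand = [clip(xj + vj, lo, hi)
                    for xj, vj, (lo, hi) in zip(x[i], v[i], bounds)]
            if rng.random() > pulse_rate:  # local random walk around the best
                cand = [clip(bj + 0.01 * (hi - lo) * rng.gauss(0, 1), lo, hi)
                        for bj, (lo, hi) in zip(best, bounds)]
            if f(cand) <= f(x[i]) and rng.random() < loudness:
                x[i] = cand                # loudness-gated acceptance
            if f(cand) <= f(best):
                best = cand                # greedy update of the best bat
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)
best, val = bat_algorithm(sphere, [(-5.0, 5.0)] * 2)
```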

Hi All,

Are there any opinions on or experiences with the LocalSolver solver?

Comparing for example accuracy, speed, etc. to other solvers, etc.

Interesting to hear about them ...

/Lars

Are the current metaheuristic methods a representation of a supposedly real natural intelligence (swarms, genetics, etc.), or the embodiment of another human intelligence (drawing intelligence from everything that surrounds it)?

I’m trying to make a compendium of all the existing metaheuristics (bio-inspired or not). To consider a given algorithm (or set of guidelines) a metaheuristic I need at least one (serious) publication where the authors claim their algorithm to be a metaheuristic. Could you help me find all the existing metaheuristics?

I have applied metaheuristic algorithms such as PSO and GA in my research field, which is recommender systems, but what I have found is that these algorithms are very time-consuming and not really practical, even though the results are better than those of the existing algorithms. In recommender systems, we need fast algorithms. Thank you.

Is there really a significant difference between the performance of the different meta-heuristics other than "ϵ"? I mean, at the moment we have many different meta-heuristics and the set keeps expanding. Every once in a while you hear about a new meta-heuristic that outperforms the other methods, on a specific problem instance, by ϵ. Most of these algorithms share the same idea: randomness with memory, or selection, or whatever you want to call learning from previous steps. You see in MIC, CEC, and SIGEVO many repetitions of new meta-heuristics. Does it make sense to stay stuck here? Now the same thing repeats with hyper-heuristics.

There is much research on metaheuristic optimization, e.g. Particle Swarm Optimization, Genetic Algorithms, etc. Some studies show that they are good for clustering tasks, but I cannot find any comparison of them.

Which one is the best to be applied for optimizing the clustering process?

Some metaheuristics show superior performance on certain kinds of problems. Some of these are continuous optimization problems, and others are discrete or binary optimization problems.

What are the latest metaheuristic algorithms proposed in the last one to two years?

When two metaheuristic algorithms are combined and a hybrid metaheuristic is developed, how can we evaluate the hybrid metaheuristic in order to make sure that it is better than the two original ones? Is it possible to do that with a small number of iterations (<3000)? Should we test the algorithms with the same population size (number of feasible solutions)?

Can heuristic or meta-heuristic fuzzy clustering algorithms help me? Any suggestions generally? I want to create learner’s profiles based on computational intelligence methods. The number of the groups (profiles) is unknown.

In 2018,

What are the latest trends in studies of parameter tuning and control for metaheuristics?

- Comparisons,

- Frameworks,

- Mathematical models,

- Demonstrations,

- Tools,

- Etc

How can we measure the quality of an element (cell) of a candidate solution in evolutionary algorithms, for example, a gene in a chromosome? In other words, the objective function calculates the quality of each candidate solution, while I want a method to measure the quality of each element of a candidate solution.
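One simple estimator for this is ablation: replace the element with a neutral baseline value and measure the fitness drop. This is exact only for additive (separable) fitness functions; under epistasis it is a local estimate, and averaging the drop over several different background chromosomes (Shapley-style sampling) gives a more robust score. A minimal sketch, assuming maximization and a made-up additive fitness:

```python
def element_quality(fitness, candidate, index, baseline=0):
    """Quality of one element as the fitness drop when it is replaced
    with a neutral baseline value (ablation; assumes maximization)."""
    ablated = list(candidate)
    ablated[index] = baseline
    return fitness(candidate) - fitness(ablated)

# Example: an additive fitness where each gene's true worth is its weight.
weights = [3.0, 1.0, 2.0]
fitness = lambda x: sum(w * v for w, v in zip(weights, x))
candidate = [1, 1, 1]
quality = [element_quality(fitness, candidate, i) for i in range(3)]
# quality == [3.0, 1.0, 2.0]
```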

In 2018,

What are the latest trends in studies of the convergence of metaheuristics?

- Comparisons,

- Frameworks,

- Mathematical models,

- Demonstrations,

- Tools,

- Etc

Are the so-called "novel" metaheuristic methods threatening to lead the area of metaheuristics away from scientific rigor?

Why do we need other metaheuristic methods when all the new methods solve the same problems in the same context, the unsolved problems remain unsolved, and there is no existing standard rule to judge the mathematical model and pseudo-code?

In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.

Sources:

1- Del Ser, Javier, Eneko Osaba, Daniel Molina, Xin-She Yang, Sancho Salcedo-Sanz, David Camacho, Swagatam Das, Ponnuthurai N. Suganthan, Carlos A. Coello Coello, and Francisco Herrera. "Bio-inspired computation: Where we stand and what's next." Swarm and Evolutionary Computation 48 (2019): 220-250.

2- Sörensen, Kenneth. "Metaheuristics—the metaphor exposed." International Transactions in Operational Research 22, no. 1 (2015): 3-18.

Hello,

I am looking for a generic open access metaheuristic like Genetic Algorithm, Simulated Annealing, PSO, etc., implemented as a library/code in any of these platforms/languages: R, C/C++, AMPL, GAMS, Java. I am not interested in the efficiency of the algorithm, but I want to use it for teaching and to solve small instances.

I found some packages in R for PSO and Genetic Algorithms, but these cannot handle integer/binary variables correctly.

I appreciate in advance your suggestions.
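For teaching on small instances, one option is a plain simulated annealing loop: integer and binary variables are handled naturally, with no rounding tricks, if the neighborhood move is defined directly on the integer lattice (bit flips and unit steps). A self-contained sketch with a made-up mixed binary/integer objective:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, iters=5000, t0=1.0, alpha=0.999,
                        seed=4):
    """Plain simulated annealing with geometric cooling. Variables stay
    integral because the neighbor move only flips bits or steps by one."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = f(y)
        # Metropolis acceptance: always accept improvements, sometimes worse.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / max(t, 1e-12)):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = list(x), fx
        t *= alpha
    return best, fbest

def neighbor(x, rng):
    y = list(x)
    i = rng.randrange(len(y))
    if i < 2:                     # positions 0-1 are binary: flip the bit
        y[i] = 1 - y[i]
    else:                         # positions 2-3 are integers: unit step
        y[i] += rng.choice((-1, 1))
    return y

# Toy mixed objective with optimum value 0 at n0=3, n1=-2, b0+b1=1.
f = lambda x: (x[2] - 3) ** 2 + (x[3] + 2) ** 2 + (x[0] + x[1] - 1) ** 2
best, val = simulated_annealing(f, [0, 0, 0, 0], neighbor)
```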

I am looking for a research topic related to electric vehicles, something related to optimisation.

For example, the application of metaheuristics to them. I am quite confused about which one to choose for research: battery management systems, inverter topologies, or control mechanisms. Kindly suggest what other variations there can be, as I am relatively new to this.

Hello researchers, I am looking for the most efficient algorithm for solving multi-objective, and in particular bi-objective, shortest path problems in the field of road transport.

Is there a document that provides a comparison between label-setting methods, metaheuristic methods (ant colony, NSGA-II, PSO, ...), and two-phase methods?

Are there any open source codes for this problem?

Thank you for your collaboration

I am doing my PhD on scheduling in flexible manufacturing systems using metaheuristics. I want to publish a paper in an SCI or SCIE journal. I have the impact factor list, but I also want the acceptance rates or the time-to-first-decision list, so that it will be easy for me to shortlist journals and submit my paper. Thanks in advance to all.

I have to use an objective function with a high computational cost. Running genetic algorithms, solutions of high quality are obtained, but the search time is overwhelming because of the large number of objective function evaluations. I want to apply some other metaheuristic that gives me the same solution quality in less search time, i.e. with fewer objective function evaluations.

How can we decide between using a single-solution metaheuristic (hill climbing, tabu search, ...) and a population-based metaheuristic (evolutionary algorithms)?

To optimize a production system by planning ~1000 timesteps ahead I try to solve an optimization problem with around 20000 dimensions containing binary and continuous variables and several complex constraints.

I know the provided information is little, but can someone give a hint which approach would be suitable for such big problems? Would you recommend some metaheuristic or a commercial solver?

Is there any new metaheuristic technique for optimization?

In the ε-constraint method, one objective will be used as the objective function, and the remaining objectives will be used as constraints using the epsilon value as the bound. In this case:

- Do we need to apply penalty method to handle the constraint?

- How to select the best solution?

- How to get the final Pareto set?
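On those three points, roughly: a penalty is only needed if the subproblem solver cannot enforce the ε bound directly (with a metaheuristic one typically adds a penalty for violating f2(x) ≤ ε); each ε value yields one candidate solution; and the final Pareto set is the non-dominated filter of all solutions collected over the ε sweep. A sketch on an enumerable toy solution set (with a real problem, `min(feasible, key=f1)` would be replaced by the chosen solver):

```python
def epsilon_constraint_front(solutions, f1, f2, n_eps=11):
    """ε-constraint sketch: minimize f1 subject to f2(x) <= eps,
    sweeping eps over the range of f2 to trace the Pareto front.
    Solutions must be hashable here (they go into a set)."""
    f2_vals = [f2(x) for x in solutions]
    lo, hi = min(f2_vals), max(f2_vals)
    front = set()
    for k in range(n_eps):
        eps = lo + (hi - lo) * k / (n_eps - 1)
        feasible = [x for x in solutions if f2(x) <= eps]
        if feasible:
            front.add(min(feasible, key=f1))   # best f1 under the eps bound
    return sorted(front)

# Toy bi-objective over the integers 0..10: f1 minimized at x=8, f2 at x=0.
xs = range(11)
f1 = lambda x: (x - 8) ** 2
f2 = lambda x: x
front = epsilon_constraint_front(xs, f1, f2)
```

For this toy instance the sweep recovers the whole Pareto set {0, ..., 8}: each ε bound makes a different trade-off between the two objectives feasible.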

Hi all, I have the following questions:

1) Does the solution with a meta-heuristic (say, a genetic algorithm or similar) or with reinforcement learning converge faster for the same optimization problem?

2) Is the solution obtained using reinforcement learning an optimal one?

Thanks in advance

Opposition-based Learning (OBL) is a concept in machine learning inspired by the opposite relationship among entities. This concept is used to improve the initial population that is otherwise randomly generated.
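The core of OBL is the opposite point x_opp_j = a_j + b_j - x_j for a variable bounded in [a_j, b_j]. A minimal sketch of opposition-based initialization (generate a random population, add each point's opposite, keep the fittest half), using a toy sphere objective:

```python
import random

def opposite(x, bounds):
    """Opposite point: x_opp_j = a_j + b_j - x_j."""
    return [lo + hi - v for v, (lo, hi) in zip(x, bounds)]

def obl_initialize(f, pop_size, bounds, seed=5):
    """Opposition-based initialization: draw a random population, add each
    point's opposite, and keep the fittest half (f is minimized)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    combined = pop + [opposite(x, bounds) for x in pop]
    combined.sort(key=f)          # keep the best pop_size of the 2*pop_size
    return combined[:pop_size]

sphere = lambda x: sum(v * v for v in x)
pop = obl_initialize(sphere, 10, [(-5.0, 5.0)] * 2)
```

Since each opposite stays inside the same bounds, the result is a valid starting population for any metaheuristic, on average closer to the optimum than a purely random one.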

We are facing a true tsunami of "novel" metaheuristics. Are all of them useful? I refer you to Sörensen, Intl. Trans. in Op. Res. (2013) 1-16, DOI: 10.1111/itor.12001.

Being Scandinavian, I have read and seen countless times some representation of the children's story about the emperor's new clothes, by Hans Christian Andersen.

As you might know the emperor's new costume is soooo fine that it is weightless, and no-one - but a child - dares to speak up to the emperor and exclaim that he in fact is naked! Thanks to the child, the curtain opens, and we truly see that he indeed is naked.

I find metaheuristics - to some degree - to be that, too, within the mathematical optimization domain: quite shallow, devoid of solid theory, and often (but not always) a game of draw, guess, and jump (I do not know the exact English translation of the name of the children's game). I have no problem with it when we are dealing with very complicated combinatorial/integer/bilevel problems in **industry**, especially when we do not have an explicit formulation, such as when we need to deal with the use of simulations within the optimization, or uncertain coefficients. But then we are talking about industrial mathematics, which is something else than mathematics - which is an exact science.

The theme of metaheuristics was, I hope, originally an attempt to find "reasonable" (however that is defined) feasible solutions to the most difficult and large problems, especially for nonlinear integer models in industry, with the explicit caveat that with these techniques we might hope to get a **fairly good** solution if we are lucky, or we may **not** - as that is actually how it works: metaheuristics are NOT globally convergent in general, and they were - make a mental note of this - NOT EVEN CONSTRUCTED TO BE.

Yet there are plenty of scholars - especially in this forum, for a reason I do not fully understand - who insist on applying their favourite metaheuristic(s) to just about anything. Yesterday, I think it was, I found at RG a paper on a metaheuristic used to "solve" a very, very simple linear program with one (1) linear constraint. I blew my top, as they say. WTF is going on? I blew my top because I know how to solve such a problem to guaranteed optimality in under 1/1000 of a second on a slow computer. It's a problem of complexity O(n) - hence the easiest problem on this planet.

Can any sane person closer to that field address this, please? It is irrational, to begin with; or is it simply the fact that the world of scientific endeavours is no longer defined by codes of conduct? I am really troubled by seeing this unscientific methodology used more and more often, and I sure hope it never will be seriously compared with mathematical optimization. Well, in fact, it is compared every day in industry, and math always wins.

I have successfully applied a GA to the problem. I now wish to apply other nature-inspired metaheuristics to this problem. Please suggest ways and methods to do so.

Hello. I want to use a machine learning method for function regression in order to speed up metaheuristic methods for optimization. Among methods such as multilayer perceptrons, radial basis networks, support vector regression, etc., there is variance both in accuracy and in the duration of training and evaluation. In some tests of accuracy and duration, some methods are fast in training and evaluation but produce poor results, while others produce very accurate results but are very slow in training and evaluation. Others are somewhere in the middle. Should I choose accuracy over duration, or the opposite? Or perhaps should I take the middle way?

In the **mixed-variable heuristic optimization** domain, what is done when a **categorical** variable determines the existence of **continuous** or ordered **discrete** variables in each possible solution?

To **illustrate**, imagine an optimization problem to determine the **best tool to cut paper**. In this problem, a variable **tool** can have the values "**knife**" or "**scissors**".

- If its value is "**scissors**", there is the continuous-valued **blade_size** variable.
- If it is "**knife**", there is the same **blade_size** continuous variable and also a **num_of_teeth** discrete variable.

How can I deal with this problem using some metaheuristic designed to handle categorical, continuous, and ordered discrete variables?

My **first thought** was to set the problem to the maximum possible dimensionality and, after choosing the value of the categorical variable, select (*if* commands) which other variables are going to be optimized and used to evaluate the solution. This will probably work, but it seems **naive** to me. Do other, more **sophisticated** methods to deal with this kind of problem exist? If yes, what are these methods?

I know that "applications of metaheuristics to design optimization of structures" is treated widely in the literature. This is not the case for structural analysis.

It would be interesting to see that this subject also is treated in this new book.

Good luck in your work.

*As can be read in the webpage of the metaheuristics network (http://www.metaheuristics.org), a metaheuristic is a set of concepts that can be used to define heuristic methods that can be applied to a wide set of different problems. In other words, a metaheuristic can be seen as a general algorithmic framework which can be applied to different optimization problems with relatively few modifications to make them adapted to a specific problem. Examples of metaheuristics include evolutionary algorithms, simulated annealing, tabu search, iterated local search, and ant colony optimization. Metaheuristics have been widely used to solve different combinatorial (and numerical) optimization problems, with the goal of obtaining a very good solution (but perhaps not the optimum) to NP-complete problems in which exact search methods are intractable even for small problem sizes.*

I'm looking for the package or source code to solve scheduling in hybrid flow shop using genetic algorithm or other metaheuristics.
