Metaheuristic - Science topic

Explore the latest questions and answers in Metaheuristic, and find Metaheuristic experts.
Questions related to Metaheuristic
  • asked a question related to Metaheuristic
Question
2 answers
I am not able to find a suitable answer in any research paper: why exactly do we use metaheuristic algorithms together with MCDM techniques to solve decision-making problems? Are MCDM techniques alone enough, or is a metaheuristic by itself sometimes enough to solve the same problem?
Relevant answer
Answer
GA is better suited when there are many different shapes/areas that need to be optimized in a single solution, so if your case involves many shapes/areas it may be a good pairing for finding the best ones. If there are only one or two aspects/parameters to optimize, then PSO is enough.
  • asked a question related to Metaheuristic
Question
2 answers
Please suggest some papers or a survey paper with non-linear parameter tuning for metaheuristics.
  • asked a question related to Metaheuristic
Question
3 answers
Hi there,
I am trying to identify three gaps in mobile robot path planning using metaheuristic methods in an unknown environment. A lot of work has been done in this area, but I am wondering whether anyone has come across interesting papers that point out such gaps; it would help me write my proposal.
with regards!
Relevant answer
Answer
Dear Mohamed-Mourad Lafifi thank you, much appreciated!
  • asked a question related to Metaheuristic
Question
2 answers
I am interested in the use of Extreme Value Theory (EVT) to estimate global optima of optimization problems (using heuristic and metaheuristic algorithms), however, it is a bit difficult to find them since the use of EVT is not usually the main objective of the studies. Could you help me by sharing articles where this procedure is used? Thank you in advance.
Relevant answer
Answer
Bettinger, P., J. Sessions, and K. Boston. 2009. A review of the status and use of validation procedures for heuristics used in forest planning. Mathematical and Computational Forestry & Natural-Resource Sciences. 1(1): 26-37.
Bettinger, P., J. Sessions, and K.N. Johnson. 1998. Ensuring the compatibility of aquatic habitat and commodity production goals in eastern Oregon with a Tabu search procedure. Forest Science. 44(1): 96-112.
Boston, K. and P. Bettinger. 1999. An analysis of Monte Carlo integer programming, simulated annealing, and tabu search heuristics for solving spatial harvest scheduling problems. Forest Science. 45(2): 292-301.
  • asked a question related to Metaheuristic
Question
5 answers
I have simulated an industrial process in ASPEN and I want to optimize the operational parameters (decisions) to maximize the final yield (first objective) while minimizing the energy consumption (second objective).
Therefore, I would like to build an ML model and use it to decide the best operational parameters.
My plan is to use metaheuristics such as GA (Genetic Algorithm), but I am unsure what the steps are to build the model. How can this optimization algorithm be implemented as a supervised ML model?
Relevant answer
Answer
Dear MO,
Generally, machine learning methods learn and generalise from recorded data in order to make predictions on new samples; they are not used to optimise a design directly. There are many optimisation methods that can be applied to your problem, but you need a fitness (objective) function in order to evaluate each new solution proposed by the optimisation method. Once you have such a function, applying an optimisation method is straightforward.
Please let me know if you need more information.
  • asked a question related to Metaheuristic
Question
4 answers
To solve multi-objective problems using the lexicographic method:
Can we use different algorithms for each step? For instance, when we have two objectives to be minimized, can we first use a Genetic Algorithm to optimize the first objective, then use a Particle Swarm Optimization algorithm to solve the second objective while taking the result of the Genetic Algorithm as a constraint?
Relevant answer
Answer
You most certainly can. In fact, there may be an advantage in employing such a method since we know that some algorithms are better suited for some problems over others (NFL theorem). Just be sure to properly select the correct MH for any given objective function. This may require some experimentation on your part, but whatever you do, don't let the algorithm selection process be randomized. A methodological approach needs to be adopted here.
Personally, I would advise using Pareto-based techniques instead of conventional non-Pareto-based approaches like Lexicographic or sum weighted optimization. Nonetheless, good luck!
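A minimal sketch of the staged (lexicographic) approach discussed above, with a plain random search standing in for both metaheuristics; the two toy objectives, the bounds, and the tolerance eps are illustrative assumptions, not from the question:

```python
import random

# Two toy objectives over x in [-5, 5]^2 (stand-ins for the real problem).
def f1(x): return (x[0] - 1) ** 2 + x[1] ** 2
def f2(x): return x[0] ** 2 + (x[1] - 2) ** 2

def random_search(obj, feasible, n_iter=5000, seed=0):
    """Stand-in for any metaheuristic (GA, PSO, ...): keep the best feasible sample."""
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(-5, 5), rng.uniform(-5, 5)]
        if not feasible(x):
            continue
        v = obj(x)
        if v < best_val:
            best, best_val = x, v
    return best, best_val

# Stage 1: the "GA" minimizes f1, unconstrained.
x1, f1_star = random_search(f1, lambda x: True, seed=1)

# Stage 2: the "PSO" minimizes f2 subject to f1(x) <= f1* + eps,
# i.e. the stage-1 result is carried over as a constraint.
eps = 0.2
x2, f2_val = random_search(f2, lambda x: f1(x) <= f1_star + eps, seed=2)
print(f1_star, f2_val)
```

The tolerance eps is what makes the second stage workable in practice: demanding exactly f1(x) = f1* usually leaves no feasible samples for the second algorithm.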
  • asked a question related to Metaheuristic
Question
10 answers
Hello, dear professors.
I need to combine metaheuristics with deep learning for hyperparameter tuning.
I have experience applying metaheuristics to several problems.
But how can I use them in deep learning?
I am open to collaborate to produce high quality research papers in this area.
Relevant answer
Answer
Applying meta-heuristics to any concept for hyperparameter tuning is just like solving for an optimization problem. The only difference is that the objective function becomes the output of the algorithm you are trying to tune.
For example, say you have a neural network, and you need to fine tune the number of neurons and the number of layers within the neural network. First, you would model a chromosome such that it contains values for each parameter you are tuning. For this example, your chromosome would look something like this:
Chromosome = [number_of_neurons, number_of_layers]
Thereafter, you would define your objective function as the neural network, and you would take the output of the neural network (let's say the output is the accuracy) and evaluate its performance by passing the chromosome values into the objective function.
After those two steps, everything else is the same as coding a meta-heuristic algorithm. What I mean is, the steps thereafter will remain the same:
  • Crossover
  • Mutation
  • Evaluation
  • Termination
I have attached the code for my application using meta-heuristics to fine tune the hyperparameters of a GA that I had designed.
I hope this helps.
Good luck!
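To make the chromosome-as-hyperparameters idea above concrete, here is a minimal GA sketch. The validation_accuracy function is a hypothetical stand-in for actually building, training, and scoring a network with those hyperparameters (its sweet spot at 64 neurons / 3 layers is invented for illustration):

```python
import random

rng = random.Random(42)

# Stand-in for "train the network with these hyperparameters and return
# validation accuracy". In practice this is where the real model training goes.
def validation_accuracy(chrom):
    n_neurons, n_layers = chrom
    return 1.0 - abs(n_neurons - 64) / 128 - abs(n_layers - 3) / 10

def crossover(a, b):
    # uniform-style swap of the two genes
    return [a[0], b[1]], [b[0], a[1]]

def mutate(c):
    if rng.random() < 0.3:
        c[0] = min(128, max(1, c[0] + rng.randint(-8, 8)))   # perturb neurons
    if rng.random() < 0.3:
        c[1] = min(10, max(1, c[1] + rng.randint(-1, 1)))    # perturb layers
    return c

# chromosome = [number_of_neurons, number_of_layers]
pop = [[rng.randint(1, 128), rng.randint(1, 10)] for _ in range(20)]
for gen in range(30):
    pop.sort(key=validation_accuracy, reverse=True)  # evaluation
    parents = pop[:10]                               # selection (elitist)
    children = []
    while len(children) < 10:
        a, b = rng.sample(parents, 2)
        c1, c2 = crossover(a, b)                     # crossover
        children += [mutate(c1), mutate(c2)]         # mutation
    pop = parents + children[:10]

best = max(pop, key=validation_accuracy)
print(best)
```

Each fitness evaluation is one full training run of the network, which is why population sizes and generation counts are usually kept small for this kind of tuning.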
  • asked a question related to Metaheuristic
Question
16 answers
Dear Researchers,
I'm looking for the implementation of advanced metaheuristic algorithms in Python. I would be incredibly thankful if someone could assist me with the execution of evolutionary algorithms or can provide me with the necessary codes in Python.
Thank you very much.
Relevant answer
Answer
Dear Muhammad,
In addition to DEAP, which is a great one, I suggest another library: PyMOO, which also provides complementary single- and multi-objective solvers for problems with discrete and continuous variables.
Best regards !
  • asked a question related to Metaheuristic
Question
20 answers
Especially hybridizing NSGA-II with other local search methods.
Relevant answer
Answer
Dear Ashkan,
I would be happy to share my experience with hybrid optimisation methods and local search in solving real engineering problems. Please find some of my publications as follows:
  • asked a question related to Metaheuristic
Question
5 answers
Having come to realize the limitations that metaheuristics have by dint of the NFL theorem, I came across this interesting field of hyper-heuristics (heuristics searching for heuristics) and read a couple of papers on the topic. I was wondering whether any of you can give me a list of recommended books for further learning. Online video courses will also be greatly helpful. Thanks in advance.
Relevant answer
Answer
Much appreciation, Prof. Mohamed-Mourad Lafifi
  • asked a question related to Metaheuristic
Question
7 answers
What is the fastest and most accurate way to solve an MINLP model with nonlinear constraints, including triangle and quadratic terms? classic solvers, metaheuristics, or ...?
Any tips are appreciated.
Relevant answer
Answer
To solve your problem you need to evolve a population of solutions. In each iteration you must verify that your constraints are fulfilled and add the corresponding penalty value to the objective-function evaluations. Different algorithms can be used for this. In the attached works you can find a clear explanation of the main ideas.
  • asked a question related to Metaheuristic
Question
4 answers
Hi everyone, I'm working on SVM-WOA, I need to optimize the weight of SVM.... Thanks
Relevant answer
Answer
Check this website or else directly contact the authors. They may provide you the codes.
  • asked a question related to Metaheuristic
Question
4 answers
I am solving an NP-hard problem (say, the TSP), and later on I would like to take on much larger scale, real-life problems with more constraints. The ultimate problem is a combinatorial NP-hard optimization problem.
Now, considering the metaheuristic side, there are many algorithms: GA, PSO, etc.
How do I know which algorithm is the best, since almost every algorithm is available in a discrete form? It would also be good to have some pointers on how to gauge, based on the problem definition and the solution methodology of the algorithm, whether it is suitable for our problem.
Thanks
Relevant answer
Answer
You are right, but say I am working on a scheduling problem, and the majority of papers have used GA.
Now say I have read some papers on "comparison of EAs" and found that algorithm X wins in the majority of cases.
Is it a good idea to apply X to the scheduling problem just because it has won on some benchmark functions and problems? And how do you evaluate whether an algorithm is designed for, or suited to, a problem?
I have heard from some professors that the above is not a good strategy because:
1. Just because an algorithm wins over others in a comparison paper doesn't mean it is a universal winner.
2. There are certain algorithms that are designed for your problem. Those will usually win over any other heuristic.
My question is:
How to succeed in point 2? I am currently working on airline crew pairing.
the literature has applied the following:
1.Branch & cut, branch and bound, branch & price, column generation
2. Genetic algo
3. Simulated annealing
4. Tabu search
Majority have conducted extensive research on column generation.
Now, a prof has said that it is well established in the literature that standard IP algorithms have won many times over GAs and other metaheuristics.
But I still feel there might be a goldmine.
In a recent comparison paper, CS (Cuckoo Search) and DE (Differential Evolution) were winning in the majority of cases. I know that applying an algorithm X to problem Y is not as such a real contribution ..
But how do I assess whether an algorithm is even designed for my problem? This is the main hurdle I am facing, since I am a beginner.
There seem to be many algorithms winning in their own respective fields, but I am unable to assess whether one is even suitable to be applied to my problem.
Some tricks/key points with regard to "identification" would be really helpful.
Thanks
  • asked a question related to Metaheuristic
Question
2 answers
Can you advise me which journal would be most suitable for submitting an article entitled "Novel metaheuristic optimization algorithm based on astronomy", which is about 40 pages long?
Relevant answer
Answer
Of course it (publishing a paper) all depends on the quality of the paper, and the field in which the heuristic is being applied. There is the European Journal of Operational Research, the Journal of Heuristics, and many other computer science and AI-related journals. You should visit the Clarivate Journal Citation Reports website to browse the journals related to your topic.
  • asked a question related to Metaheuristic
Question
4 answers
I have a multi-objective optimization with the following properties:
Objective functions: three minimization objectives; two non-linear functions and one linear function
Decision variables: two real variables (bounded)
Constraints: three linear constraints (two bounding constraints and one relationship constraint)
Problem type: non-convex
Solution required: Global optimum
I have used two heuristic algorithms to solve the problem NSGA-II and NSGA-III.
I have performed NSGA-II and NSGA-III for the following instances (population size, number of generations, maximum number of function evaluations (i.e., pop size × no. of generations)): (100, 10, 1000), (100, 50, 5000), (100, 100, 10000), (500, 10, 5000), (500, 50, 25000), and (500, 100, 50000).
My observations:
Hypervolume increases with an increase in the number of function evaluations. However, for a given population size, as the number of generations increases, the hypervolume reduces, which I think should rather increase. Why am I getting such a result?
Relevant answer
Answer
Greetings to you all.
Please how can I find MATLAB code for Accelerated Particle Swarm Optimization algorithm for tuning PID controller.
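Returning to the hypervolume question above: since hypervolume is always measured against a fixed reference point, the choice of that point can strongly affect (and even invert) trends between runs. A minimal two-objective (minimization) sketch of the computation, with fabricated fronts for illustration:

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a 2-objective minimization front relative to
    reference point ref (larger = better front). Dominated points
    are skipped automatically after sorting by the first objective."""
    pts = sorted(front)
    hv, prev_f2 = 0.0, ref[1]
    for f1v, f2v in pts:
        if f2v < prev_f2:                       # non-dominated step
            hv += (ref[0] - f1v) * (prev_f2 - f2v)
            prev_f2 = f2v
    return hv

# two illustrative fronts; front_a dominates front_b pointwise
front_a = [(1, 4), (2, 2), (4, 1)]
front_b = [(1, 5), (3, 3), (5, 1)]
print(hypervolume_2d(front_a, (6, 6)))  # -> 20.0
print(hypervolume_2d(front_b, (6, 6)))  # -> 13.0
```

If the reference point is recomputed per run instead of being held fixed across all compared configurations, hypervolume values stop being comparable, which is one common cause of apparently non-monotonic readings.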
  • asked a question related to Metaheuristic
Question
2 answers
I am working on a deep learning-based analysis to detect Melanoma Cancer. I want to use all types of algorithms to get the best possible solution.
Relevant answer
Answer
I hope this message finds you well.
I would like to invite you to have a look at the following links.
Thank you very much for your time and efforts.
  • asked a question related to Metaheuristic
Question
40 answers
Hi everyone,
We have implemented four metaheuristic algorithms to solve an optimization problem. Each algorithm is repeated 30 times for an instance of the problem, and we have stored the best objective function values for 30 independent runs for each algorithm.
We want to compare these four algorithms. Apart from maximum, minimum, average, and standard deviation, is there any statistical measure for comparison?
Alternatively, we have four independent samples each of size 30, and we want to test the null hypothesis that the means (or, medians) of these four samples are equal against an alternative hypothesis that they are not. What kind of statistical test should we perform?
Regards,
Soumen Atta
Relevant answer
Answer
1. To compare two algorithms, take the best fitness values of the 30 runs of each and find the p-value using a t-test.
2. To compare more than two algorithms, take the 30-run samples of each and find the p-value using ANOVA.
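A sketch of the two-sample case. To stay dependency-free it uses Welch's t statistic with a normal approximation to the t distribution, which is adequate at ~30 runs per sample; in practice scipy.stats.ttest_ind (and f_oneway for the ANOVA case) would be used. The two samples below are fabricated for illustration:

```python
import math
from statistics import mean, stdev, NormalDist

def welch_t(a, b):
    """Welch's t statistic and a two-sided p-value via the normal
    approximation (fine for n ~ 30; use scipy.stats.ttest_ind otherwise)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    t = (mean(a) - mean(b)) / math.sqrt(va / na + vb / nb)
    p = 2 * (1 - NormalDist().cdf(abs(t)))
    return t, p

# two hypothetical samples of best objective values over 30 runs each
algo_x = [10.0 + 0.1 * i for i in range(30)]
algo_y = [12.0 + 0.1 * i for i in range(30)]
t, p = welch_t(algo_x, algo_y)
print(t, p < 0.05)
```

Since best-of-run values from metaheuristics are often far from normally distributed, non-parametric alternatives (Mann-Whitney U for two samples, Kruskal-Wallis for several) are commonly recommended for exactly this comparison.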
  • asked a question related to Metaheuristic
Question
31 answers
Dear friends,
We invite all researchers and practitioners who are developing algorithms, systems, and applications, to share their results, ideas, and experiences.
Topics of interest include, but are not limited to, the following:
Hybrid Metaheuristics
Theoretical aspects of hybridization
Automated parameter tuning
Parallelization
Evolutionary Computation Algorithms
Swarm Optimization
Multi-objective optimization
Multilevel segmentation
Object recognition
Computer vision
Image processing
Filtering and enhancement
Morphology
Edge detection and segmentation
Feature extraction
Quantum Image Processing
Image thresholding
Applications
Relevant answer
Answer
Prof. Diego Oliva: Yes, with pleasure. Thank you for this useful information.
  • asked a question related to Metaheuristic
Question
5 answers
I want to compare metaheuristics on the optimization of Lennard-Jones clusters. There are many papers available that optimize Lennard-Jones clusters; unfortunately, none of them provides the upper and lower bounds of the search space. To conduct a fair comparison, all metaheuristics should search within the same bounds. I found the global minima here: http://doye.chem.ox.ac.uk/jon/structures/LJ/tables.150.html but the search space is not defined.
Can anyone please tell me what are the recommended upper and lower bounds of the search space?
Relevant answer
Answer
Miha Ravber : for me, [-2, 2] was enough because I fixed the first atom at (0, 0, 0), the second at (>= 0, 0, 0), etc. If you don't, you get free coordinates between your bounds.
You can definitely start at [-10, 10] and see what the results are, then adjust.
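For reference, the objective these benchmark tables optimize is the pairwise Lennard-Jones potential in reduced units (epsilon = sigma = 1). A minimal sketch of the energy function, checked on the dimer, whose known optimal separation is 2^(1/6) with energy exactly -1:

```python
def lj_energy(coords):
    """Total Lennard-Jones potential of a cluster in reduced units:
    E = 4 * sum_{i<j} (r_ij^-12 - r_ij^-6), coords = list of (x, y, z)."""
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r2 = sum((coords[i][k] - coords[j][k]) ** 2 for k in range(3))
            inv6 = 1.0 / r2 ** 3          # r^-6
            e += 4.0 * (inv6 * inv6 - inv6)
    return e

# dimer at the optimal separation r = 2^(1/6): energy is exactly -1
dimer = [(0.0, 0.0, 0.0), (2 ** (1 / 6), 0.0, 0.0)]
print(lj_energy(dimer))  # -> -1.0
```

Fixing the first atoms as described in the answer above removes the translational and rotational degrees of freedom, which is why a small box such as [-2, 2] per coordinate can then suffice for moderate cluster sizes.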
  • asked a question related to Metaheuristic
Question
3 answers
If we compare a heuristic approach like local search, a metaheuristic approach like genetic algorithms, or a more advanced technique such as Graph Neural Networks, what is the best approach to adopt for a large-scale graph? Both for mesh and radial kinds of networks.
Relevant answer
Answer
Henning Meyerhenke There are some equality constraints satisfying power flow (since it is a power distribution grid partitioning problem), and some inequality constraints such as voltage limits and limits on the minimum number of nodes within a partition. There are numerous objectives, such as path redundancy between loads and generators within a partition. The challenge is that it is hard to obtain a Pareto-optimal solution with GA when the number of equality constraints increases.
  • asked a question related to Metaheuristic
Question
19 answers
Suppose that when we compare two metaheuristics X and Y on a given real problem, X returns a better solution than Y, while when we use the same metaheuristics to solve global optimization problems, Y returns a better solution than X. Does this make sense? What is the reason?
Relevant answer
Answer
The No Free Lunch theorem.
  • asked a question related to Metaheuristic
Question
3 answers
Is there any real advantage of using reinforcement learning instead of factorial experimental design, like Taguchi or D-optimal design, for parameterization of metaheuristics?
  • asked a question related to Metaheuristic
Question
4 answers
Hybrid composite objective functions are common tools to verify the performance of metaheuristics. Is there anyone who can help with the Matlab code related to those functions, mainly the functions used in well-known and recent articles such as Grey Wolf Optimizer, Ant Lion Optimizer, Harris Hawk Algorithm, Arithmetic Optimization Algorithm, etc. ?
I know about the CEC 2005, but it seems that the above-mentioned papers used modified versions of CEC 2005. Therefore, for the sake of comparison, I highly appreciate any help provided.
Thanks and regards
  • asked a question related to Metaheuristic
Question
14 answers
Hello scientific community
Have you noticed the following?
[I note that when a new algorithm is proposed, most researchers rush to improve it and to apply it to the same and other problems. So I ask: why keep the original algorithm if it suffers from weaknesses, and why do we need a new algorithm if an existing one already solves the same problems? I understand welcoming a new algorithm if it solves an unsolved problem; otherwise, why?]
Therefore, I ask: does the scientific community need novel metaheuristic algorithms (MHs) rather than the existing ones?
I think we need to organize the existing metaheuristic algorithms and record the pros and cons of each one, along with the problems each has solved.
Redundant algorithms should disappear, as should overly complex ones.
Derivative algorithms should also disappear.
We need to benchmark the MHs, much like a benchmark test suite.
Also, we need to identify the unsolved problems; if you would like to propose a novel algorithm, please try to solve an unsolved problem; otherwise, please stop.
Thanks; I look forward to a reputable discussion.
Relevant answer
Answer
The last few decades have seen the introduction of a large number of "novel" metaheuristics inspired by different natural and social phenomena. While metaphors have been useful inspirations, I believe this development has taken the field a step backwards, rather than forwards. When the metaphors are stripped away, are these algorithms different in their behaviour? Instead of more new methods, we need more critical evaluation of established methods to reveal their underlying mechanics.
  • asked a question related to Metaheuristic
Question
13 answers
I am looking to develop an overview/survey of specific experimental techniques and papers in which exploration is defined, measured, and analyzed as part of heuristic search (preferably for continuous domains).
Suggestions and references very much appreciated.
Relevant answer
Answer
I just wanted to say that I'm quite partial to the recent paper by Marjan Mernik
  • asked a question related to Metaheuristic
Question
6 answers
I have a problem where the objective function takes a considerable amount of time to evaluate. The problem has a continuous search space. However, most of the variables don't need the level of accuracy provided by this continuous search space.
I was thinking of discretizing the search space where the optimizer would pick from a specific set of values for each parameter. This would give a near optimal solution that I am satisfied with if the optimization time is reduced.
I want your opinion or personal experience with similar problems.
Thanks in advance.
Relevant answer
Answer
Hi Amr,
In general, I think it's safe to say that the answer to your question is yes. Discretization generally leads to a reduced search space which implies that the algorithm has less intermediate values to work with and that subsequently translates to less computational effort. I should point out, however, that there are a few cases where this generalization is untrue. During my research in the field, I have noticed how algorithms that work with discrete variables show significantly reduced performance when tackling problems with higher dimensions. Basically, it requires more effort from the algorithm to escape a local optimum when you have so many decision variables to change. That can be remedied with increased exploration from the algorithm, but continuous optimization, I have found, is the best alternative in that case.
Furthermore, if algorithmic stability is a concern for you in your present study, then having discrete variables will most surely reduce that. Higher mean, worst, and standard deviation values from the multiple runs you perform will become immediately manifest. This is true whether the dimensionality of the problem is large or small.
My advice is to try discretization if the number of decision variables is less than 10. Anything greater than that, will only lead to slower performance. But every problem is different, so I guess give it a try and see what you get. You may need to perform another parameter sensitivity analysis for acceptable results. Good luck.
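A minimal sketch of the discretization idea from the question: snap each continuous parameter onto a set of allowed levels and let the optimizer work on that reduced space. A plain random search stands in for the real optimizer, and the grid, dimension, and toy objective are illustrative assumptions:

```python
import random

def snap(x, grid):
    """Project a continuous value onto the nearest allowed discrete level."""
    return min(grid, key=lambda g: abs(g - x))

# stand-in for the expensive simulation-based objective
def objective(params):
    return sum((p - 0.3) ** 2 for p in params)

grid = [i / 10 for i in range(11)]          # allowed levels 0.0, 0.1, ..., 1.0
rng = random.Random(0)
best, best_val = None, float("inf")
for _ in range(200):                        # any metaheuristic loop goes here
    cand = [snap(rng.random(), grid) for _ in range(4)]
    v = objective(cand)
    if v < best_val:
        best, best_val = cand, v
print(best, best_val)
```

Keeping the snap step inside the evaluation wrapper (rather than rewriting the optimizer) means any continuous metaheuristic can be reused unchanged on the discretized problem.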
  • asked a question related to Metaheuristic
Question
32 answers
Various metaheuristic optimization algorithms with different inspiration sources have been proposed in recent past decades. Unlike mathematical methods, metaheuristics do not require any gradient information and are not dependent on the starting point. Furthermore, they are suitable for complex, nonlinear, and non-convex search spaces, especially when near-global optimum solutions are sought after using limited computational effort. However, some of these metaheuristics are trapped in a local optimum and are not able to escape out, for example. For this purpose, numerous researchers focus on adding efficient mechanisms for enhancing the performance of the standard version of the metaheuristics. Some of them are addressed in the following references:
I will be grateful If anyone can help me to find other efficient mechanisms.
Thanks in advance.
Relevant answer
Answer
I recommend you to check also the CCSA algorithm implemented by a Conscious Neighborhood-based approach which is an effective mechanism to improve other metaheuristic algorithms as well. The CCSA and its full source code are available here:
  • asked a question related to Metaheuristic
Question
7 answers
Large amounts of research focus on the balance between Exploration and Exploitation. However, this omits the power of Selection which can turn any metaheuristic into a hill climber by rejecting all exploratory search solutions.
To engage the broader research community in this important discussion, we are organizing a Workshop on Selection to be held at CEC2021 in Krakow, Poland. Please consider participating in our event or on our project page.
Relevant answer
Answer
I think you need to be leery of treating "selection" in metaheuristics as an attribute independent of exploitation. If anything, the two really augment one another. History has shown that metaheuristics with a strong selection property invariably have weak explorative capabilities; prime well-known examples are the Artificial Bee Colony algorithm and GAs.
My guess is that the same metrics we currently use for estimating algorithmic performance (optimized solution, standard deviation, number of function evaluations, diversity index, etc.) can be modified to examine the so-called "selection - exploration" balance for algorithms, since, as I said earlier, selection and exploitation are really just two faces of the same coin.
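The point about selection turning any metaheuristic into a hill climber can be shown with a toy experiment: the same Gaussian move operator, with greedy selection at one extreme and selection disabled at the other. The objective, step size, and iteration count are arbitrary illustrative choices:

```python
import random

def search(accept_worse_prob, seed=0, n_iter=2000):
    """1-D minimization of f(x) = x^2 with Gaussian moves.
    accept_worse_prob = 0.0 -> greedy selection: a pure hill climber;
    accept_worse_prob = 1.0 -> selection disabled: a random walk."""
    rng = random.Random(seed)
    f = lambda x: x * x
    x = 10.0
    for _ in range(n_iter):
        cand = x + rng.gauss(0, 0.5)
        if f(cand) < f(x) or rng.random() < accept_worse_prob:
            x = cand                 # the selection rule is the only difference
    return f(x)

print(search(0.0), search(1.0))
```

Everything between the two extremes (e.g. simulated annealing's temperature-controlled acceptance) is a tunable selection pressure, which is exactly the balance the workshop topic concerns.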
  • asked a question related to Metaheuristic
Question
3 answers
Wondering if there is any way to bypass the ordinary material testing procedure and use a formulated physical theory to estimate these properties by knowing (a) the kind of atoms, (b) type of crystalline structure (BCC, FCC, etc), (c) and the atomic bonds between them. I ask because I'm currently contemplating the idea of creating new materials using Artificial Neural Networks and Metaheuristics. Hope the idea is feasible. Thanks.
  • asked a question related to Metaheuristic
Question
7 answers
In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.
Relevant answer
Answer
Nature-based algorithms are particular cases of evolving-code algorithms, which in some of my own works I have called, under a general denomination, the Integration of Variables method. I think a great number of such algorithms will appear in the near future.
  • asked a question related to Metaheuristic
Question
12 answers
I have some optimal solutions in a discrete space, and I want to apply a heuristic search using those solutions as attractors. I started using distances as cost functions, but I don't know whether that is a good approach.
Relevant answer
Answer
Antonio Bolufé-Röhler Thanks for your answer.
  • asked a question related to Metaheuristic
Question
24 answers
Mathematical programming is the best optimization tool, with many years of strong theoretical background. It has been demonstrated that it can efficiently solve complex optimization problems on the scale of one million design variables, and the methods are very reliable. Besides, there is mathematical proof of the existence of the solution and the globality of the optimum.
However, when there are discontinuities in the objective function, problems arise because the problem becomes non-differentiable. Some methods, such as sub-gradients, have been proposed for such problems. However, I cannot find many state-of-the-art papers on discontinuous engineering optimization using mathematical programming; engineers mostly use metaheuristics for such cases.
Can all problems with discontinuities be solved with mathematical programming? Is it easy to implement sub-gradients for large-scale industrial problems? Do they work on non-convex problems?
A simple example of such a function is attached here.
Relevant answer
Answer
Your ideas for dividing the region and using local optimizer are so nice!
Thanks a lot!
  • asked a question related to Metaheuristic
Question
12 answers
We all know that mathematical programming is the best optimization tool, with many years of strong theoretical background, and that it presents reliable solutions with high efficiency; mathematical proof of global optimality is also available. However, in cases where knowledge is lacking and the analytical calculation of sensitivities is impossible, researchers prefer to use metaheuristics, even though these are inefficient and unreliable on large-scale problems.
The development of surrogate models such as the Kriging method, model-based methods such as radial basis function interpolation, and novel machine learning tools helps us approximate the objective function, so model-based sensitivities can be used instead. Machine learning can also help predict sensitivity information!
So, improved function and sensitivity approximation, coupled with mathematical optimization, will make metaheuristics disappear. In this way, I guess there would be no need for metaheuristics (at least in continuous optimization, as far as I know).
What do you think? Do you agree? Do you have any experience with this? I am interested in both mathematical programming and metaheuristics, but I prefer efficiency.
Relevant answer
Answer
Mathematical optimisation kills all heuristics - if you pick them right.
  • asked a question related to Metaheuristic
Question
5 answers
Hi,
OK, so I'm new to computer science and metaheuristics, and I need to implement multi-objective optimization for an environmental economics problem with real-world data.
The problem is as follows:
I have a large set of possible locations for an infrastructure within a city. Each location provides two benefits, A and B, as well as costs (more benefits may be added in the future, depending on further research). Data points are unrelated; cost and benefit functions are neither linear nor continuous.
I need a method for selecting a handful of locations that maximizes A and B simultaneously, under the constraint that the total costs < a budget parameter.
I went towards a genetic algorithm (GA) for this problem, as it is highly combinatorial, but most "traditional" GAs I've looked at have fixed chromosome lengths and ultimately only give out final individuals of n items. In my case, I am quite flexible on the number of selected locations, as long as the algorithm either minimizes total costs or handles them as a constraint. As a matter of fact, it would be quite interesting to have as a final result a Pareto front of individuals of different sizes (for example: in my problem, locations in the city center have more "value" than locations in periurban areas, so a few central locations could be as Pareto-optimal as more numerous periurban locations). So I see the problem as a knapsack problem where costs would be the "weight"; however, there can't be any repetition of items here (the same location cannot be used twice).
Is there a way to handle the cost constraint so as to make a knapsack genetic algorithm that can provide a Pareto front of individuals of heterogeneous length? I have been trying with the DEAP library but can't really find details in its documentation.
Many thanks
Georges Farina
Relevant answer
Answer
Knapsack problems of the economic type can easily be solved by a genetic algorithm; see the following sites for examples of knapsack business problems:
Knapsack Problem Using Genetic Algorithm With Source Code | by Muzamil Shahbaz | Medium
Genetic algorithm for knapsack problem in Matlab - Free Open Source Codes - CodeForge.com
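One way to address the fixed-length concern in the question above: encode the subset as a fixed-length 0/1 mask over all candidate locations, so the *number* of selected locations varies freely even though the chromosome length does not. A minimal steady-state GA sketch with fabricated data, a penalty for exceeding the budget, and a weighted sum standing in for a true Pareto ranking (NSGA-II, e.g. via DEAP or pymoo, would replace the weighted sum to get the actual front):

```python
import random

rng = random.Random(1)
# hypothetical data: (benefit_A, benefit_B, cost) for 30 candidate locations
locs = [(rng.uniform(1, 5), rng.uniform(1, 5), rng.uniform(1, 4)) for _ in range(30)]
BUDGET = 12.0

def fitness(mask, w=0.5):
    """A fixed-length 0/1 mask encodes a subset, so individuals of any
    'size' coexist; over-budget subsets pay a graded penalty."""
    a = sum(l[0] for l, m in zip(locs, mask) if m)
    b = sum(l[1] for l, m in zip(locs, mask) if m)
    cost = sum(l[2] for l, m in zip(locs, mask) if m)
    over = max(0.0, cost - BUDGET)
    return w * a + (1 - w) * b - 10.0 * over

# sparse initialization keeps early individuals mostly within budget
pop = [[1 if rng.random() < 0.1 else 0 for _ in locs] for _ in range(40)]
for _ in range(300):
    a, b = rng.sample(sorted(pop, key=fitness, reverse=True)[:20], 2)  # selection
    cut = rng.randrange(1, len(locs))
    child = a[:cut] + b[cut:]                                          # crossover
    child[rng.randrange(len(locs))] ^= 1                               # bit-flip mutation
    pop.sort(key=fitness)
    pop[0] = child                                                     # replace worst

best = max(pop, key=fitness)
print(sum(best), fitness(best))
```

Re-running with several values of w gives a crude approximation of the Pareto front; a proper multi-objective GA obtains the whole front, including individuals of different sizes, in a single run.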
  • asked a question related to Metaheuristic
Question
12 answers
I need MATLAB, C#, or R code for the above-mentioned models. How can I find it?
  • asked a question related to Metaheuristic
Question
4 answers
The choice of something to ruin can be an implicit choice as to what should be preserved.  A heuristic for preservation can thus lead to a heuristic for ruin.  I've had what I think is a very interesting result for what to preserve (common solution components) in the context of genetic crossover operators that use constructive (as opposed to iterative) heuristics.  I tried to share it with the Ruin and Recreate community with no success.
I guess my real question is -- How should I Ruin and Recreate this research to make it more relevant to Ruin and Recreate researchers?
Relevant answer
Answer
In general, my impression of a ruin and recreate process would be to change assignment(s) to decision variables (randomly or otherwise) in a feasible solution, effectively ruining it (in value) and perhaps making the solution infeasible. Then, some sort of repair operator(s) are applied to place the solution back in the feasible region of the solution space.
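To make the ruin-and-recreate idea above concrete, here is a minimal sketch on a toy makespan problem (job durations are made-up data): the ruin step frees a few random job assignments, and the recreate step repairs the solution greedily.

```python
import random

rng = random.Random(0)
JOBS = [rng.randint(1, 20) for _ in range(30)]   # toy job durations
MACHINES = 4

def makespan(assign):
    loads = [0] * MACHINES
    for dur, m in zip(JOBS, assign):
        loads[m] += dur
    return max(loads)

def recreate(assign, removed):
    """Greedy repair: put each removed job on the least-loaded machine."""
    loads = [0] * MACHINES
    for j, m in enumerate(assign):
        if m is not None:
            loads[m] += JOBS[j]
    for j in sorted(removed, key=lambda j: -JOBS[j]):   # big jobs first
        m = loads.index(min(loads))
        assign[j] = m
        loads[m] += JOBS[j]
    return assign

best = recreate([None] * len(JOBS), range(len(JOBS)))
best_cost = makespan(best)
for _ in range(500):
    cand = best[:]
    removed = rng.sample(range(len(JOBS)), 5)   # ruin: free 5 assignments
    for j in removed:
        cand[j] = None
    cand = recreate(cand, removed)              # recreate: greedy repair
    if makespan(cand) <= best_cost:             # accept if not worse
        best, best_cost = cand, makespan(cand)
```

This matches the description above: the ruin step may make the partial solution infeasible (unassigned jobs), and the repair operator places it back in the feasible region.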
  • asked a question related to Metaheuristic
Question
11 answers
Comparing swarm-based with evolution-based algorithms, the former have some advantages over the latter. Since evolutionary approaches have relatively more operators than swarm-based ones, they are more difficult to apply. Unlike evolution-based approaches, which immediately discard any piece of information related to the old iteration once a new population is generated, swarm-based algorithms normally keep these valuable pieces of information over the subsequent iterations.
Please come in and lower your bucket in this discussion thread.
Relevant answer
Answer
What about "Object Detection"?
  • asked a question related to Metaheuristic
Question
23 answers
Bat-inspired algorithm is a metaheuristic optimization algorithm developed by Xin-She Yang in 2010. This bat algorithm is based on the echolocation behaviour of microbats with varying pulse rates of emission and loudness.
The idealization of the echolocation of microbats can be summarized as follows: Each virtual bat flies randomly with a velocity vi at position (solution) xi with a varying frequency or wavelength and loudness Ai. As it searches and finds its prey, it changes frequency, loudness and pulse emission rate r. Search is intensified by a local random walk. Selection of the best continues until certain stop criteria are met. This essentially uses a frequency-tuning technique to control the dynamic behaviour of a swarm of bats, and the balance between exploration and exploitation can be controlled by tuning algorithm-dependent parameters in bat algorithm. (Wikipedia)
What are the applications of bat algorithm? Any good optimization papers using bat algorithm? Your views are welcome! - Sundar
Relevant answer
Answer
The bat algorithm (BA) is a bio-inspired algorithm developed by Xin-She Yang in 2010, and it has been found to be very efficient.
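A minimal sketch of the bat algorithm as described above, on the sphere function (plain Python; the parameter values are illustrative, not tuned): each virtual bat has a position, velocity, frequency, loudness A, and pulse rate r, with a local random walk around the current best.

```python
import math
import random

rng = random.Random(1)

def sphere(x):
    return sum(v * v for v in x)

DIM, N, ITERS = 5, 20, 200
FMIN, FMAX = 0.0, 2.0
alpha, gamma = 0.9, 0.9

X = [[rng.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
V = [[0.0] * DIM for _ in range(N)]
A = [1.0] * N                        # loudness A_i
R = [0.1] * N                        # pulse emission rate r_i
best = min(X, key=sphere)[:]

for t in range(ITERS):
    for i in range(N):
        f = FMIN + (FMAX - FMIN) * rng.random()          # frequency tuning
        V[i] = [v + (x - b) * f for v, x, b in zip(V[i], X[i], best)]
        cand = [x + v for x, v in zip(X[i], V[i])]
        if rng.random() > R[i]:                          # local walk near best
            cand = [b + 0.01 * (sum(A) / N) * rng.gauss(0, 1) for b in best]
        if rng.random() < A[i] and sphere(cand) < sphere(X[i]):
            X[i] = cand                                  # accept, adapt A and r
            A[i] *= alpha
            R[i] = 0.1 * (1 - math.exp(-gamma * t))
        if sphere(X[i]) < sphere(best):
            best = X[i][:]
```

The loudness decrease and pulse-rate increase shift the swarm from exploration to exploitation as the search proceeds, which is the balance the description above refers to.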
  • asked a question related to Metaheuristic
Question
9 answers
Hi All,
Are there any opinions and experiences of the LocalSolver solver?
Comparing for example accuracy, speed, etc. to other solvers, etc.
Interesting to hear about them ...
/Lars
Relevant answer
Answer
Dear Lars
I agreed 100 % with you in that a Solver must identify unfeasibility, but my question is: How many solvers or MCDM methods do you know that have that capacity?
Only one: Linear Programming
The procedure is very simple: it compares the criteria's independent values and checks whether a solution satisfies them. If even one criterion is not satisfied, the project is unfeasible.
Nowadays, problems are 'solved' assuming that the problem is feasible, without taking into account that this may not be the case.
I have read hundreds of comments and papers from our colleagues. How many of them posed this problem?
Nobody.
I wrote in RG almost a year ago about this problem, you can see it in my profile under the number 304, and again, in May 2020 under the number 318. Both have had some moderate reading but nobody came forward to acknowledge and discuss it. You are the only person that addresses that issue.
Regarding LocalSolver, I know what it is, but my experience with it is null.
  • asked a question related to Metaheuristic
Question
3 answers
Are the current metaheuristic methods a representation of supposedly real natural intelligence (swarms, genetics, etc.), or the embodiment of another kind of human intelligence (drawing intelligence from everything that surrounds it)?
Relevant answer
Answer
Most real-world search and optimization problems involve complexities such as non-convexity, nonlinearities, discontinuities, mixed nature of variables, multiple disciplines and large dimensionality, a combination of which renders classical provable algorithms to be either ineffective, impractical or inapplicable. There do not exist any known mathematically motivated algorithms for finding the optimal solution for all such problems in a limited computational time. Thus, in order to solve such problems to practicality, search and optimization algorithms are usually developed using certain heuristics that though lacking in strong mathematical foundations, are nevertheless good at reaching an approximate solution in a reasonable amount of time. These so-called meta-heuristic methods do not guarantee finding the exact optimal solution, but can lead to a near-optimal solution in a computationally efficient manner. Due to this practical appeal combined with their ease of implementation, meta-heuristic methodologies are gaining popularity in several application domains. Most meta-heuristic methods are stochastic in nature and mimic a natural, physical or biological principle resembling a search or an optimization process.
  • asked a question related to Metaheuristic
Question
23 answers
I’m trying to make a compendium of all the existing metaheuristics (bio-inspired or not). To consider a given algorithm (or set of guidelines) a metaheuristic I need at least one (serious) publication where the authors claim their algorithm to be a metaheuristic. Could you help me find all the existing metaheuristics?
Relevant answer
Answer
You are right.... " Almost every day we see a new one of metaheuristics "
  • asked a question related to Metaheuristic
Question
10 answers
I have applied metaheuristic algorithms such as PSO and GA in my research field, which is recommender systems, but what I have found is that these algorithms are very time-consuming and not really practical, though the results are better than those of existing algorithms. In recommender systems, we need fast algorithms. Thank you.
Relevant answer
Answer
Of course not! Metaheuristics are here to solve exactly this problem of time.
Metaheuristics exist precisely because exhaustive search, with its prohibitive running time, is useless, especially for large-scale and real-life optimization problems.
Table one of the following reference gives a detailed answer to this issue:
  • asked a question related to Metaheuristic
Question
22 answers
Is there really a significant difference between the performance of the different meta-heuristics other than "ϵ"? I mean, at the moment we have many different meta-heuristics and the set keeps expanding. Every once in a while you hear about a new meta-heuristic that outperforms the other methods, on a specific problem instance, by ϵ. Most of these algorithms share the same idea: randomness with memory, or selection, or whatever you name it, to learn from previous steps. You see in MIC, CEC, and SIGEVO many repetitions of new meta-heuristics. Does it make sense to stay stuck here? Now the same repeats with hyper-heuristics and .....
Relevant answer
Answer
Apart from the foregoing discussion, all metaheuristic optimization approaches are alike on average in terms of their performance. Extensive research in this field shows that an algorithm may be the topmost choice for some classes of problems, but at the same time it may be an inferior selection for other types of problems. On the other hand, since most real-world optimization problems have different needs and requirements that vary from industry to industry, there is no universal algorithm or approach that can be applied to every circumstance, and therefore it becomes a challenge to pick the right algorithm that sufficiently suits these essentials.
A discussion of this issue is in Section 2 of the following reference:
  • asked a question related to Metaheuristic
Question
16 answers
There is much research on metaheuristic optimization, e.g. Particle Swarm Optimization, Genetic Algorithms, etc. Some studies show that they are good for clustering tasks, but I cannot find any comparison between them.
Which one is the best to apply for optimizing the clustering process?
Relevant answer
Answer
The following current-state-of-the-art optimization algorithms give you the answer:
N. K. T. El-Omari, "Sea Lion Optimization Algorithm for Solving the Maximum Flow Problem", International Journal of Computer Science and Network Security (IJCSNS), e-ISSN: 1738-7906, DOI: 10.22937/IJCSNS.2020.20.08.5, 20(8):30-68, 2020.
It has a complete discussion about your question.
Or refer to the same paper at the following link:
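As a concrete illustration of metaheuristic clustering, here is a minimal PSO sketch where each particle encodes K centroids and the fitness is the within-cluster sum of squared errors (toy 1-D data with two obvious clusters; coefficients are the common illustrative choices, not tuned):

```python
import random

rng = random.Random(11)

# Toy 1-D data with two clusters around 0 and 5.
DATA = ([rng.gauss(0, 0.3) for _ in range(30)]
        + [rng.gauss(5, 0.3) for _ in range(30)])
K, N, ITERS = 2, 15, 100

def sse(centroids):
    """Each point is charged to its nearest centroid (squared distance)."""
    return sum(min((x - c) ** 2 for c in centroids) for x in DATA)

X = [[rng.uniform(-1, 6) for _ in range(K)] for _ in range(N)]  # particles
V = [[0.0] * K for _ in range(N)]
P = [x[:] for x in X]                  # personal bests
g = min(P, key=sse)[:]                 # global best

for _ in range(ITERS):
    for i in range(N):
        for d in range(K):
            V[i][d] = (0.7 * V[i][d]
                       + 1.5 * rng.random() * (P[i][d] - X[i][d])
                       + 1.5 * rng.random() * (g[d] - X[i][d]))
            X[i][d] += V[i][d]
        if sse(X[i]) < sse(P[i]):
            P[i] = X[i][:]
        if sse(P[i]) < sse(g):
            g = P[i][:]
```

The same encoding works for any metaheuristic (GA, DE, ...): flatten the K centroids into one candidate vector and let SSE be the objective, which is the usual way these comparisons are set up.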
  • asked a question related to Metaheuristic
Question
25 answers
Some metaheuristics prove their superior performance on some kinds of problems. Some of these are continuous optimization problems, and others discrete or binary optimization problems.
Relevant answer
Answer
Simply look at this research:
N. K. T. El-Omari, "Sea Lion Optimization Algorithm for Solving the Maximum Flow Problem", International Journal of Computer Science and Network Security (IJCSNS), e-ISSN: 1738-7906, DOI: 10.22937/IJCSNS.2020.20.08.5, 20(8):30-68, 2020.
It has a complete discussion about your question.
Or refer to the same paper at the following address:
  • asked a question related to Metaheuristic
Question
131 answers
What are the latest metaheuristic algorithms proposed in the last 1 to 2 years?
Relevant answer
Answer
"Sea Lion Optimization Algorithm" is one of the current state-of-the-art metaheuristic algorithms. More info can be found here:
N. K. T. El-Omari, "Sea Lion Optimization Algorithm for Solving the Maximum Flow Problem", International Journal of Computer Science and Network Security (IJCSNS), e-ISSN: 1738-7906, DOI: 10.22937/IJCSNS.2020.20.08.5, 20(8):30-68, 2020.
Or simply refer to the same paper at the following link:
  • asked a question related to Metaheuristic
Question
16 answers
When two metaheuristic algorithms are combined and a hybrid metaheuristic is developed, how can we evaluate the hybrid metaheuristic in order to make sure that it is better than the two original ones? Is it possible to do that with a small number of iterations (<3000)? Should we test the algorithms with the same population size (number of feasible solutions)?
Relevant answer
Answer
Dear Shervin,
You should evaluate the new hybrid algorithm with the same termination criterion and computational time as the baseline algorithms. I generally recommend measuring performance (average and best solution quality over several runs with different seeds) and (average) computational time on a wide enough set of benchmark instances for the problem at hand, and ideally comparing with other studies published in your field.
Finally, I recommend demonstrating the "added value" of your new components by evaluating the performance of the same method when you change some key parameters or deactivate some of these components (this is sometimes called an "ablation study"). This information is essential for future work, since other authors are likely to reuse some components of your approach but not necessarily the entire algorithm. Therefore, they will wish to know the relative contribution of all the design choices that you made.
If you wish some examples of component analyses in hybrid algorithms, you can consult Section 4.2 of the following paper:
or Section 5.3 of this one:
Good luck!
--Thibaut
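The protocol described above (identical evaluation budget, several seeds, report average and best quality) can be sketched with two stand-in algorithms; here plain random search plays the baseline and a simple hill climber plays the candidate hybrid, both made-up choices on the sphere function:

```python
import random
import statistics

def sphere(x):
    return sum(v * v for v in x)

def random_search(rng, budget, dim=5):
    """Baseline stand-in: best of `budget` uniform samples."""
    return min(sphere([rng.uniform(-5, 5) for _ in range(dim)])
               for _ in range(budget))

def hill_climb(rng, budget, dim=5):
    """Candidate stand-in: greedy Gaussian perturbations, same budget."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(budget - 1):
        cand = [v + rng.gauss(0, 0.5) for v in x]
        fc = sphere(cand)
        if fc < fx:
            x, fx = cand, fc
    return fx

BUDGET, SEEDS = 2000, range(10)       # identical budget, 10 seeds each
results = {}
for name, algo in [("baseline", random_search), ("candidate", hill_climb)]:
    runs = [algo(random.Random(s), BUDGET) for s in SEEDS]
    results[name] = (statistics.mean(runs), min(runs))   # (average, best)
```

In a real study the two stand-ins would be the original metaheuristics and the hybrid, the budget would be wall-clock time or function evaluations, and the per-seed results would additionally feed a statistical test (e.g. Wilcoxon signed-rank) before claiming one method is better.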
  • asked a question related to Metaheuristic
Question
3 answers
Can heuristic or meta-heuristic fuzzy clustering algorithms help me? Any suggestions in general? I want to create learners' profiles based on computational intelligence methods. The number of groups (profiles) is unknown.
  • asked a question related to Metaheuristic
Question
7 answers
In 2018,
Which are the latest trends in studies of parameter tuning and control metaheuristics?
- Comparisons,
- Frameworks,
- Mathematical models,
- Demonstrations,
- Tools,
- Etc
Relevant answer
Answer
Dear;
Please you can read it in the following link:
Title of the article : Autonomous search and tracking of objects using model predictive control of unmanned aerial vehicle and gimbal: Hardware-in-the-loop simulation of payload and avionics.
Regards
  • asked a question related to Metaheuristic
Question
9 answers
How can we measure the quality of an element (cell) in a candidate solution in evolutionary algorithms, for example a gene in a chromosome? In other words, the objective function calculates the quality of each candidate solution, while I want to find a method to measure the quality of each element of a candidate solution.
Relevant answer
Answer
There are some technical programming points you should take care of, regardless of the above answers. First, you can follow what has been proposed above directly, but in such a case your solution may become infeasible when you keep only a single cell, so your analysis would be a misleading chain of function feedbacks. Another mistake can come from monitoring a weakly defined fitness function instead of monitoring the loss of a function that is sensitive to all independent genes. So you can also redefine the fitness function as another valid function, one that tracks (senses) each cell instead of the whole solution block. If you cannot protect your model from these flaws or redefine the objective function, you can focus on the way you programmed your code (functional programming or not): simply eliminate the selected elements one by one (using a conditional selection scheme) and re-test the evaluations in a loop.
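One simple way to score individual genes, along the lines of the answers above, is to perturb one gene at a time and measure the average change in fitness; the toy objective here is a made-up separable quadratic so each gene's contribution is easy to check.

```python
import random

rng = random.Random(7)

def fitness(x):
    # Hypothetical toy objective: separable, so each gene contributes
    # independently and the sensitivity score is easy to interpret.
    return sum((v - i) ** 2 for i, v in enumerate(x))

solution = [rng.uniform(-1, 5) for _ in range(6)]
base = fitness(solution)

def gene_sensitivity(x, j, trials=30, sigma=0.5):
    """Average |fitness change| when only gene j is perturbed."""
    deltas = []
    for _ in range(trials):
        y = x[:]
        y[j] += rng.gauss(0, sigma)       # perturb one cell, keep the rest
        deltas.append(abs(fitness(y) - base))
    return sum(deltas) / trials

scores = [gene_sensitivity(solution, j) for j in range(len(solution))]
```

Perturbation (rather than outright deletion) sidesteps the infeasibility issue mentioned above: the candidate stays a complete solution, and a gene whose perturbation barely moves the fitness is one the objective is insensitive to.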
  • asked a question related to Metaheuristic
Question
6 answers
DSA, Metaheuristic
Relevant answer
Answer
It is not easy to identify a single parameter that directly controls exploration and exploitation behaviour; rather, it is an emergent behaviour of all the parameters used. In the case of the Differential Evolution algorithm, the first thing you may change is the mutation strategy applied. For example, you can use "DE/rand/" strategies to increase exploration or "DE/best/" strategies to increase exploitation. Besides, you can increase the population size for better exploration. Other parameters such as CR and F also affect this balance; however, this effect will depend on the other parameters you decided on before.
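The strategy switch described above can be sketched in a minimal differential evolution loop (plain Python; F, CR, and the other values are illustrative, not tuned): the only difference between DE/rand/1 and DE/best/1 is which vector the difference term is added to.

```python
import random

rng = random.Random(3)

def sphere(x):
    return sum(v * v for v in x)

def de(strategy, pop_size=20, dim=5, iters=100, F=0.7, CR=0.9):
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        best = min(pop, key=sphere)
        for i in range(pop_size):
            a, b, c = rng.sample([p for p in pop if p is not pop[i]], 3)
            # DE/rand/1 mutates around a random vector (exploration);
            # DE/best/1 mutates around the current best (exploitation).
            base = a if strategy == "rand" else best
            mutant = [base[d] + F * (b[d] - c[d]) for d in range(dim)]
            jrand = rng.randrange(dim)       # ensure >= 1 mutant component
            trial = [mutant[d] if (rng.random() < CR or d == jrand)
                     else pop[i][d] for d in range(dim)]
            if sphere(trial) <= sphere(pop[i]):
                pop[i] = trial
    return min(sphere(p) for p in pop)

explorative = de("rand")    # broader search
exploitative = de("best")   # faster convergence toward the incumbent
```

On a unimodal function like the sphere, DE/best/1 typically converges faster; on multimodal landscapes the extra exploration of DE/rand/1 tends to pay off, which is exactly the trade-off the answer describes.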
  • asked a question related to Metaheuristic
Question
13 answers
In 2018,
Which are the latest trends in studies of convergence of metaheuristics?
- Comparisons,
- Frameworks,
- Mathematical models,
- Demonstrations,
- Tools,
- Etc
Relevant answer
Answer
If a metaheuristic - under specified problem conditions - were to be shown to be convergent to a stationary point, then I might get slightly interested in it. As it is, caution is needed, as meta-people are getting louder and louder - with no real excuse.
  • asked a question related to Metaheuristic
Question
58 answers
Are the so-called "novel" metaheuristic methods threatening to lead the area of metaheuristics away from scientific rigor?
Why do we need yet more metaheuristic methods when all the new methods solve the same problems in the same context, the unsolved problems remain unsolved, and there is no existing standard rule to judge the mathematical model and pseudo-code?
In recent years, the field of combinatorial optimization has witnessed a true tsunami of so-called "novel" metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together -- it seems that no idea is too far-fetched to serve as an inspiration to launch yet another metaheuristic.
Sources:
1- Del Ser, Javier, Eneko Osaba, Daniel Molina, Xin-She Yang, Sancho Salcedo-Sanz, David Camacho, Swagatam Das, Ponnuthurai N. Suganthan, Carlos A. Coello Coello, and Francisco Herrera. "Bio-inspired computation: Where we stand and what's next." Swarm and Evolutionary Computation 48 (2019): 220-250.
2- Sörensen, Kenneth. "Metaheuristics—the metaphor exposed." International Transactions in Operational Research 22, no. 1 (2015): 3-18.
Relevant answer
Answer
Metaheuristics are here to stay, and as has been discussed before they have their potentials in some circumstances.
But: being a professor in applied math, I would probably very seldom recommend to start by running metaheuristics: I would first always look at the problem at hand, figure out a few equivalent formulations (always profitable!), and run small/medium instances that make sense (with an array of existing math programming tools, of course), to see whether a few of the models appear to yield better results than some others, and ultimately have one to three model representations that I would let "compete" on both small instances and large-scale ones. By doing that you can collect observations that will be a good help to fix, for example, parameter settings for subsets of instances having different characteristics.
It's a far better way to "learn" the problem than to throw it at a random metaheuristic and write "Q.E.D." - laziness is not a virtue. And you wouldn't have a clue how far away you are from an optimum.
  • asked a question related to Metaheuristic
Question
9 answers
Hello,
I am looking for a generic open access metaheuristic like Genetic Algorithm, Simulated Annealing, PSO, etc., implemented as a library/code in any of these platforms/languages: R, C/C++, AMPL, GAMS, Java. I am not interested in the efficiency of the algorithm, but I want to use it for teaching and to solve small instances.
I found some packages in R for PSO and Genetic Algorithms, but these cannot handle integer/binary variables correctly.
I appreciate in advance your suggestions.
Relevant answer
Answer
I recommend using BORG (http://borgmoea.org/). It is an advanced version of NSGA-II with built-in functions that reduce user input for the algorithm's operations (crossover, mutation, selection, etc.). After defining the objective function and constraints, all you need to give the algorithm is the maximum number of generations, the initial population size, and the search resolution (epsilon) in the case of multi-objective problems. It is free for academic use and has a decent manual with good customer service (if you have technical questions).
It is primarily written in C, but has API to Matlab, Python, R, and Java. So, you could use it in those languages too. I used the Python version of BORG in my PhD works and enjoyed it. Also, note that BORG is mainly for multi-objective problems but you can use it for single-objective problems too. You only need to introduce a fake (dummy) epsilon as it has no effect on the quality of solution in a single-objective problem. I tried it and it worked for me.
With the main code being written in C, you can also introduce binary and integer variables too. But, I found it impossible to do in the Python version of it. I even linked BORG to GAMS, where GAMS was used to solve an internal (separate) non-linear problem ( ).
I cannot recommend it enough - it's a great free tool which gives you peace of mind because you don't need to check how sensitive your optimal solution is to the algorithm's operations (the algorithm is smart and chooses the best state for mutation and crossover fractions, even the population size, etc.).
Please feel free to ask me questions about BORG or other optimisation algorithms/techniques.
Regards,
Majed Khadem
  • asked a question related to Metaheuristic
Question
6 answers
I am looking for research topic related to Electric Vehicle , something related to optimisation.
Like the application of metaheuristics to it. I am quite confused about which one to choose for research, such as battery management systems, inverter topologies, or control mechanisms. Kindly suggest what other variations there can be, as I am relatively new to this.
Relevant answer
Answer
I would recommend decentralized planning and scheduling while integrating concepts of sensor fusion to achieve better localization. To this end, you need to investigate stochastic planning and scheduling, and possibly game-theoretic approaches if you have strong mathematical skills. Otherwise, go for investigating heuristic-based approaches. You can also integrate some machine-learning-based techniques, and such work will be of high value for AAAI, ROS, etc.
  • asked a question related to Metaheuristic
Question
4 answers
Hello researchers, I am looking for the most efficient algorithm for solving multi-objective, and in particular bi-objective, shortest path problems in the field of road transport.
Is there a document that provides a comparison between label-setting methods, metaheuristic methods (ant colony, NSGA-II, PSO...), and two-phase methods?
Are there any open source codes for this problem?
Thank you for your collaboration
Relevant answer
Answer
  • asked a question related to Metaheuristic
Question
6 answers
I am doing my PhD on scheduling in flexible manufacturing systems using metaheuristics, and I want to publish a paper in an SCI or SCIE journal. I have the impact factor list, but I also want a list of acceptance rates or times to first decision, so that it will be easy for me to shortlist journals and submit my paper. Thanks in advance to all.
Relevant answer
Answer
Ali Khater thank you
Alexandru Szel
where is the document sir?
  • asked a question related to Metaheuristic
Question
8 answers
I have to use an objective function with a high computational cost. Running genetic algorithms, solutions of high quality are obtained, but the search time is overwhelming because of the large number of objective function evaluations. I want to apply some other metaheuristic that gives me the same solution quality in less search time, i.e. with fewer objective function evaluations.
Relevant answer
Answer
Hello,
I assume that you do not have an easy and/or efficient access to the derivative of your objective function. If this is the case, I suggest you to have a look at the field of Black-box optimization and derivative-free optimization methods where this problematic of finding good solutions within a reasonable budget of function evaluations is heavily studied.
As always, there is no perfect answer: methods require more or fewer function evaluations in order to achieve good solutions. So the choice will depend on how heavy the cost of evaluating your objective function is (i.e., what maximum number of function evaluations you think is reasonable).
I suggest you to have a look at the CMA-ES method (http://cma.gforge.inria.fr/), which is a state-of art method with nice properties and performances. I am not sure it will fit your needs, but it is a start.
  • asked a question related to Metaheuristic
Question
10 answers
How can we decide between using a single-solution metaheuristic (hill climbing, tabu search, ...) and a population-based metaheuristic (evolutionary algorithms)?
Relevant answer
Answer
You should first investigate methods with a solid theory behind it - metaheuristics are famous for NOT providing any guidelines, or for that matter any guarantees to find a satisfying answer. In any case, you should mention whatever you know about the problem, in order for any of us to provide pointers to the right way of modelling the problem, perhaps, and also what tools to use when trying to solve the problem you have.
  • asked a question related to Metaheuristic
Question
7 answers
To optimize a production system by planning ~1000 timesteps ahead, I am trying to solve an optimization problem with around 20,000 dimensions containing binary and continuous variables and several complex constraints.
I know the information provided is limited, but can someone give a hint as to which approach would be suitable for such big problems? Would you recommend a metaheuristic or a commercial solver?
Relevant answer
Answer
One strategy is to define variables as time intervals - if the time interval is used for something, then a variable is set to one, and you then pay the fee - or whatever - for this use. By starting with long time intervals the production planning problem may be easy to solve, as the number of variables is small. You then use that solution as a starting point for a new problem, in which the time-steps are, say, 1/2 or even 1/10th of the previous time interval; your previous solution is feasible in the new problems, which will help the solver to quickly find an optimum to the problem. If you re-iterate like this until you have as small a time interval that you want, your optimum will be the solution of the last problem. (This is a very simple way of performing a discretization procedure.)
  • asked a question related to Metaheuristic
Question
21 answers
Is there any new metaheuristic technique for optimization?
Relevant answer
Answer
Sure - at least one every day. Typically a rehash of an old one.
  • asked a question related to Metaheuristic
Question
16 answers
In the ε-constraint method, one objective will be used as the objective function, and the remaining objectives will be used as constraints using the epsilon value as the bound. In this case:
- Do we need to apply penalty method to handle the constraint?
- How to select the best solution?
- How to get the final Pareto set?
Relevant answer
Answer
You can perform a web search on "epsilon constrained method multi objective" which provides quite many hits on Google scholar - and see what others have done in the near past. It is a quite popular tool.
You may look at the useful links below:
I hope it can help u!
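A minimal sketch of the ε-constraint workflow asked about above, on a toy bi-objective problem: keep f1 as the objective, handle f2 ≤ ε with a penalty, sweep ε over f2's range, and filter dominated points to get the final Pareto set. A grid search stands in for whatever single-objective solver or metaheuristic is actually used.

```python
# Toy bi-objective problem: minimise f1(x) = x^2 and f2(x) = (x - 2)^2
# for x in [0, 2]; the Pareto-optimal set is the whole interval.
def f1(x): return x * x
def f2(x): return (x - 2) ** 2

GRID = [i / 1000 for i in range(2001)]      # stand-in single-objective solver

def solve_eps(eps, penalty=1e6):
    """Minimise f1 subject to f2 <= eps, via a penalty on the violation."""
    def cost(x):
        return f1(x) + penalty * max(0.0, f2(x) - eps)
    return min(GRID, key=cost)

pareto = []
for k in range(11):                         # sweep epsilon over f2's range
    eps = 4.0 * k / 10
    x = solve_eps(eps)
    pareto.append((f1(x), f2(x)))

# Filter weakly dominated points to obtain the final Pareto set.
pareto = sorted({p for p in pareto
                 if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                            for q in pareto)})
```

This answers the three sub-questions directly: the penalty handles the constraint when the inner solver is unconstrained (an exact solver could take f2 ≤ ε as a hard constraint instead), there is no single "best" solution (each ε yields one trade-off point), and the final Pareto set is the dominance-filtered union over all ε values.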
  • asked a question related to Metaheuristic
Question
5 answers
Hi all, I have the following questions:
1) Does the solution with a meta-heuristic (say, a genetic algorithm or similar) or with reinforcement learning converge faster for the same optimization problem?
2) Is the solution obtained using reinforcement learning an optimal one?
Thanks in advance
Relevant answer
Answer
Sometimes the approaches maintain a certain similarity.
  • asked a question related to Metaheuristic
Question
5 answers
Opposition-based Learning (OBL) is a new concept in machine learning, inspired by the opposite relationship among entities. This concept is applied to the randomly generated initial population.
Relevant answer
Answer
I think it really depends on how you initialize the population.
Assume you randomize a population S with n solutions, then build the opposite population S^ from S. There are two possible ways to proceed:
+ 1st: Compare each solution in S with the solution in S^ at the same index, and keep the one with the better fitness.
+ 2nd: Concatenate S and S^, then keep the n solutions with the best fitness.
Both ways share a weakness: all the kept solutions may end up on one side of the search space. Especially when that side contains a local optimum, it will make your algorithm converge fast (but to the local optimum).
I think the best way to use the opposition-based technique for the initial population is:
randomize a population S with n/2 solutions, then create the remaining n/2 solutions with the opposition-based technique. This guarantees that the solutions do not all fall on one side of the search space.
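The half-random, half-opposite initialization recommended above can be sketched in a few lines; the opposite of x on [a, b] is a + b - x (bounds and sizes here are made-up):

```python
import random

rng = random.Random(5)
A, B = 0.0, 10.0        # search-space bounds [a, b] per dimension
DIM, N = 4, 10          # N is even: half random, half opposites

half = [[rng.uniform(A, B) for _ in range(DIM)] for _ in range(N // 2)]
opposites = [[A + B - v for v in x] for x in half]   # opposite point: a + b - x
population = half + opposites
```

By construction each random point and its opposite lie on opposite sides of the space's midpoint, so the initial population cannot collapse onto one side, which is the property the answer argues for.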
  • asked a question related to Metaheuristic
Question
8 answers
We are facing a true tsunami of "novel" metaheuristics. Are all of them useful? I refer you to Sörensen, Intl. Trans. in Op. Res. (2013) 1-16, DOI: 10.1111/itor.12001.
Relevant answer
Answer
There are people who think so. The reason is that every week a new algorithm appears which seems to be a copy of another one, only with a different name.
  • asked a question related to Metaheuristic
Question
34 answers
Being Scandinavian I have read and seen countless times some representation of the children's story about the emperor's new clothes - by Hans Christian Andersen.
As you might know the emperor's new costume is soooo fine that it is weightless, and no-one - but a child - dares to speak up to the emperor and exclaim that he in fact is naked! Thanks to the child, the curtain opens, and we truly see that he indeed is naked.
I find metaheuristics - to some degree - to be that, too, within the mathematical optimization domain: quite shallow, devoid of solid theory, and often (but not always) a game of draw, guess, and jump (I do not know the exact English translation of the name of the children's game). I have no problem with it when we are dealing with very complicated combinatorial/integer/bilevel problems in industry, especially when we do not have an explicit formulation, such as when we need to deal with the use of simulations within the optimization, or uncertain coefficients. But then we are talking about industrial mathematics, which is something else than mathematics - which is an exact science.
The theme of metaheuristics was, I hope, originally an attempt to find "reasonable" (however that is defined) feasible solutions to the most difficult and large problems, especially for nonlinear integer models in industry, with the explicit sign that with these techniques we might hope get a fairly good solution if we are lucky, or we may not - as that is actually how it works: metaheuristics are NOT globally convergent in general, and they were - make a mental note of this - NOT EVEN CONSTRUCTED TO BE.
Yet there are plenty of scholars - especially in this forum, for a reason I do not fully understand - that insist on applying their favourite metaheuristic(s) on just anything. Yesterday I think it was when I at RG found a paper on a metaheuristic used to "solve" a very, very simple linear program with one (1) linear constraint. I blew my top, as they say. WTF is going on? I blew my top because I know how to solve to guaranteed optimality such a problem in under a 1/1000 of a second on a slow computer. It's a problem of complexity O(n) - hence the easiest problem on this planet.
Can any sane person closer to that field address this, please? It is irrational, to begin with; or it is simply the fact that the world of scientific endeavours no longer are defined by codes of conduct? I am really troubled by more and more often seeing this unscientific methodology being used, and I sure hope it never will be seriously compared with mathematical optimization. Well, in fact, it is compared every day in industry, and math always wins.
Relevant answer
Answer
A friend just told me not to answer (even if it is only now and then). He supposed that even good answers do not change people's minds. Anyway, we have had these discussions again and again. But these discussions seem to get lost every now and then. So, let me bring it to one or two threads again, maybe using different words.
Like in every field there may be good research and bad research. So, do not put all of it together into one pot with possibly even wrong comprehensions.
Regarding the metaheuristics community, there is some awareness at least in parts of it that not everything that seems shiny is shining. A good reference to explore this view is by our friend Kenneth Sörensen:
Metaheuristics—the metaphor exposed
And another good pointer is to the word matheuristics which we framed to investigate the interoperation of metaheuristics and mathematical programming techniques. Is it allowed to point to a wikipedia page? May be yes: https://en.wikipedia.org/wiki/Matheuristics
Have fun exploring and learning how to separate the wheat from the chaff :-)
  • asked a question related to Metaheuristic
Question
3 answers
I have successfully applied GA to the problem. I now wish to apply other nature-inspired metaheuristics to this problem. Please suggest ways and methods to do so.
Relevant answer
Answer
All the answers are great
  • asked a question related to Metaheuristic
Question
5 answers
Hello. I want to use a machine learning method for function regression in order to speed up metaheuristic optimization methods. Among all these methods, such as multilayer perceptrons, radial basis networks, support vector regression, etc., there is variance both in accuracy and in the duration of training and evaluation. In some tests for checking accuracy and duration, some methods are fast in training and evaluation but produce poor results, while others produce very accurate results but are very slow in training and evaluation. Others are somewhere in the middle. Should I choose accuracy over speed, or the opposite? Or perhaps should I take the middle way?
Relevant answer
Answer
All the answers are great
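One common way to get speed without giving up too much accuracy is surrogate prescreening: use the cheap regression model only to rank candidate solutions, and spend true (expensive) evaluations only on the most promising one. The sketch below is illustrative, not a recommendation of a specific method — the sphere objective and the deliberately simple nearest-neighbour surrogate are both assumptions chosen to keep the example self-contained:

```python
import math
import random

def expensive_objective(x):
    # Stand-in for a costly simulation: the sphere function.
    return sum(xi * xi for xi in x)

class NearestNeighborSurrogate:
    """A deliberately simple surrogate: predict the fitness of the
    nearest already-evaluated point (cheap to 'train' and to query)."""
    def __init__(self):
        self.archive = []  # list of (point, true_fitness) pairs

    def add(self, x, f):
        self.archive.append((x, f))

    def predict(self, x):
        return min(self.archive, key=lambda pf: math.dist(pf[0], x))[1]

random.seed(0)
surrogate = NearestNeighborSurrogate()
best = [random.uniform(-5, 5) for _ in range(3)]
best_f = expensive_objective(best)
surrogate.add(best, best_f)
true_evals = 1

for _ in range(200):
    # Generate several candidate moves, prescreen them with the surrogate,
    # and spend one true evaluation only on the most promising candidate.
    candidates = [[b + random.gauss(0, 0.5) for b in best] for _ in range(10)]
    x = min(candidates, key=surrogate.predict)
    f = expensive_objective(x)
    true_evals += 1
    surrogate.add(x, f)
    if f < best_f:  # hill-climbing acceptance on the true fitness
        best, best_f = x, f

print(true_evals, best_f)
```

The point of the pattern is that the surrogate's accuracy only needs to be good enough to rank candidates, which is why even a fast, crude model can pay off; a slow, highly accurate model may erase the savings it was meant to provide.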
  • asked a question related to Metaheuristic
Question
5 answers
In the mixed-variable heuristic optimization domain, what is done when a categorical variable determines the existence of continuous or ordered discrete variables in each possible solution?
To illustrate, imagine an optimization problem to determine the best tool to cut paper.
In this problem, a variable tool can have the values "knife" or "scissors".
  • If its value is "scissors", there's the continuous-valued blade_size variable.
  • If it's "knife", there is the same blade_size continuous variable and also a num_of_teeth discrete variable
How can I deal with these problems using a metaheuristic designed to handle categorical, continuous, and ordered discrete variables?
My first thought was to set the problem to the maximum possible dimensionality and, after choosing the value of the categorical variable, select (via if statements) which of the other variables are to be optimized and used to evaluate the solution.
This will probably work, but it seems naive to me. Do other, more sophisticated methods for this kind of problem exist? If so, what are they?
Relevant answer
Answer
All the answers are great
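The "maximum dimensionality plus conditional decoding" idea described in the question is in fact a standard starting point: every solution carries all possible genes, and a decoder activates only the ones that exist for the chosen category (inactive genes stay dormant but can still mutate). A minimal sketch using the tool/blade_size/num_of_teeth names from the question — the fitness function and value ranges are invented purely for illustration:

```python
import random

# Fixed-length genotype: every solution carries all possible variables.
# The categorical gene decides which of them the decoder actually uses.
TOOLS = ["knife", "scissors"]

def random_solution():
    return {"tool": random.choice(TOOLS),
            "blade_size": random.uniform(1.0, 20.0),   # always present
            "num_of_teeth": random.randint(0, 50)}     # used only by "knife"

def decode(sol):
    """Return only the variables that exist for the chosen category."""
    phenotype = {"tool": sol["tool"], "blade_size": sol["blade_size"]}
    if sol["tool"] == "knife":
        phenotype["num_of_teeth"] = sol["num_of_teeth"]
    return phenotype

def evaluate(sol):
    # Toy fitness: prefer a mid-sized blade; teeth add a small
    # bonus, but only when the phenotype actually has them.
    p = decode(sol)
    fitness = -abs(p["blade_size"] - 10.0)
    if "num_of_teeth" in p:
        fitness += 0.1 * p["num_of_teeth"]
    return fitness

random.seed(1)
pop = [random_solution() for _ in range(20)]
best = max(pop, key=evaluate)
print(decode(best))
```

Because the dormant genes travel along with the solution, switching the categorical value back later does not lose the previously optimized values — which is one argument for this representation over dynamically resizing the solution vector.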
  • asked a question related to Metaheuristic
Question
3 answers
I know that "applications of metaheuristics to design optimization of structures" is a topic treated widely in the literature. This is not the case for structural analysis.
It would be interesting to see this subject treated in the new book as well.
Good luck in your work.
Relevant answer
Answer
All the answers are great
  • asked a question related to Metaheuristic
Question
4 answers
As can be read in the webpage of the metaheuristics network (http://www.metaheuristics.org), a metaheuristic is a set of concepts that can be used to define heuristic methods that can be applied to a wide set of different problems. In other words, a metaheuristic can be seen as a general algorithmic framework which can be applied to different optimization problems with relatively few modifications to make them adapted to a specific problem. Examples of metaheuristics include evolutionary algorithms, simulated annealing, tabu search, iterated local search, and ant colony optimization. Metaheuristics have been widely used to solve different combinatorial (and numerical) optimization problems, with the goal of obtaining a very good solution (but perhaps not the optimum) to NP-complete problems in which exact search methods are intractable even for small problem sizes.
Relevant answer
Answer
All the answers are great
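As a concrete illustration of one of the metaheuristics named in the question, here is a minimal, generic simulated annealing skeleton applied to a toy combinatorial instance (minimizing the number of 1-bits in a bitstring). The instance, temperature schedule, and parameter values are arbitrary choices for demonstration, not recommendations:

```python
import math
import random

def simulated_annealing(objective, x0, neighbor, t0=10.0, cooling=0.995, steps=2000):
    """Generic simulated annealing: accept worse moves with probability
    exp(-delta/T), so the search can escape local optima while T is high."""
    x, fx = x0, objective(x0)
    best, best_f = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = objective(y)
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < best_f:
                best, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best, best_f

# Toy combinatorial instance: minimize the number of 1-bits.
random.seed(42)

def flip_one(bits):
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

x0 = [random.randint(0, 1) for _ in range(30)]
best, best_f = simulated_annealing(lambda b: sum(b), x0, flip_one)
print(best_f)
```

Note how the problem-specific parts (objective and neighborhood move) are passed in as functions: this is exactly the "general algorithmic framework, adapted with relatively few modifications" character of a metaheuristic that the question describes.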
  • asked a question related to Metaheuristic
Question
5 answers
I'm looking for a package or source code to solve scheduling in a hybrid flow shop using a genetic algorithm or other metaheuristics.
Relevant answer
Answer
All the answers are great
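While I cannot point to a specific package, a permutation-based GA for the (non-hybrid, single machine per stage) flow shop is easy to sketch as a starting point and can be extended toward the hybrid case. The processing-time matrix, population size, and mutation rate below are all hypothetical choices for illustration:

```python
import random

def makespan(perm, proc):
    """Completion time of the last job on the last machine for a permutation
    flow shop, where proc[j][m] is job j's processing time on machine m."""
    m = len(proc[0])
    finish = [0.0] * m
    for j in perm:
        for k in range(m):
            start = max(finish[k], finish[k - 1] if k else 0.0)
            finish[k] = start + proc[j][k]
    return finish[-1]

def order_crossover(p1, p2):
    # Keep a slice of parent 1; fill the rest in parent 2's order.
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = set(p1[a:b])
    tail = [g for g in p2 if g not in hole]
    return tail[:a] + p1[a:b] + tail[a:]

def ga_flow_shop(proc, pop_size=30, generations=100):
    n = len(proc)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: makespan(p, proc))
        elite = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            c = order_crossover(*random.sample(elite, 2))
            if random.random() < 0.2:        # swap mutation
                i, j = random.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = elite + children
    return min(pop, key=lambda p: makespan(p, proc))

random.seed(3)
# 5 jobs x 3 machines, hypothetical processing times
times = [[3, 2, 4], [2, 5, 1], [4, 1, 3], [1, 4, 2], [5, 3, 2]]
best = ga_flow_shop(times)
print(best, makespan(best, times))
```

Adapting this to a hybrid flow shop mainly means replacing `makespan` with a decoder that, at each stage, assigns each job to the earliest-available parallel machine; the permutation encoding and the GA loop can stay the same.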
  • asked a question related to Metaheuristic
Question
5 answers
As can be read in the webpage of the metaheuristics network (http://www.metaheuristics.org), a metaheuristic is a set of concepts that can be used to define heuristic methods that can be applied to a wide set of different problems. In other words, a metaheuristic can be seen as a general algorithmic framework which can be applied to different optimization problems with relatively few modifications to make them adapted to a specific problem. Examples of metaheuristics include evolutionary algorithms, simulated annealing, tabu search, iterated local search, and ant colony optimization. Metaheuristics have been widely used to solve different combinatorial (and numerical) optimization problems, with the goal of obtaining a very good solution (but perhaps not the optimum) to NP-complete problems in which exact search methods are intractable even for small problem sizes.
Relevant answer
Answer
All the answers are great