Questions related to Metaheuristic Algorithm
Over the last few decades, numerous metaheuristic optimization algorithms have been developed, with varying sources of inspiration. However, most of these metaheuristics have one or more weaknesses that affect their performance, for example:
- Getting trapped in a local optimum and being unable to escape.
- No trade-off between the exploration and exploitation potentials.
- Poor exploitation.
- Poor exploration.
- Premature convergence.
- Slow convergence rate.
- Computationally demanding.
- Highly sensitive to the choice of control parameters.
Metaheuristics are frequently improved by adding efficient mechanisms aimed at increasing their performance, such as opposition-based learning, chaotic maps, etc. Which efficient mechanisms would you suggest as the best?
I am not able to find a suitable answer in any research paper: why exactly do we use metaheuristic algorithms together with MCDM techniques to solve decision-making problems? Are MCDM techniques alone sufficient, or is a metaheuristic by itself sometimes enough to solve the same problem?
I want to use pyomo to model MILP problems in Python, such as the Green Vehicle Routing Problem and others, but solve those problems with metaheuristic algorithms (GA, Harmony Search, etc.) instead of pyomo's solvers. This is part of my PhD project and I cannot use derivative-based solvers at all. If the answer is that I cannot, what is the alternative for modelling the problems in Python and then solving them with metaheuristic algorithms? I found pyomo easy and time-saving for formulating the problems.
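Whatever the modelling layer, the usual pattern is to let the metaheuristic propose candidate solutions and to treat the model only as an evaluator (objective plus penalized constraint violations). Below is a hedged, stdlib-only sketch of that loop; the tiny MILP-style problem and the penalty weight are illustrative assumptions, and a plain function stands in for the model evaluation (with pyomo you would instead assign the candidate to the `Var` objects and call `value()` on the objective and constraint expressions).

```python
import random

def penalized_objective(x):
    """Toy MILP-style evaluator: minimize 3*x0 + 2*x1 subject to
    x0 + x1 >= 4, with x integer in [0, 10].  Constraint violation is
    folded into the objective as a penalty, which is how a metaheuristic
    typically consumes a constrained model."""
    obj = 3 * x[0] + 2 * x[1]
    violation = max(0, 4 - (x[0] + x[1]))   # shortfall of x0 + x1 >= 4
    return obj + 1000 * violation           # large penalty favors feasibility

def random_search(evaluate, n_iter=5000, seed=0):
    """Minimal metaheuristic: sample integer points, keep the best.
    Any GA / Harmony Search would slot into this same evaluate() loop."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.randint(0, 10), rng.randint(0, 10)]
        f = evaluate(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

best_x, best_f = random_search(penalized_objective)
print(best_x, best_f)
```

The key point is that the metaheuristic never needs derivatives or a solver interface; it only needs the model to answer "what is the penalized objective of this candidate?".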
In general, the optimized model gives more accurate results than the local model, but in my case the traditional regression model shows a higher R2 value than the optimized model. Why might that be? Thank you in advance for your kind feedback.
Hey everyone, I need some assistance implementing my work in the CloudSim tool (Java code). If anyone is interested and has hands-on experience with it, kindly let me know! Thanks.
I am interested in the use of Extreme Value Theory (EVT) to estimate the global optima of optimization problems (using heuristic and metaheuristic algorithms); however, such articles are a bit difficult to find, since the use of EVT is not usually the main objective of the studies. Could you help me by sharing articles where this procedure is used? Thank you in advance.
I have simulated an industrial process in ASPEN and I want to optimize the operational parameters (decision variables) to maximize the final yield (first objective) while minimizing the energy consumption (second objective).
Therefore, I would like to build an ML model and use it to optimize/decide the best operational parameters.
My plan is to use metaheuristic methods such as GA (Genetic Algorithm), but I have difficulty knowing the steps to build the model. How can this optimization algorithm be implemented together with a supervised ML model?
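One common workflow for this setup is: (1) run the simulator on a limited sample of operating points, (2) fit a cheap supervised surrogate on those samples, (3) let the metaheuristic search the surrogate instead of the simulator. A hedged, stdlib-only sketch follows; the toy "simulator", the 1-nearest-neighbour surrogate, and the objective weighting are all illustrative assumptions, not the ASPEN workflow itself.

```python
import random

rng = random.Random(1)

def simulator(x):
    """Stand-in for the expensive ASPEN run: returns (yield, energy)."""
    y = 1.0 - (x - 0.7) ** 2          # yield peaks at x = 0.7
    e = 0.5 + x ** 2                  # energy grows with x
    return y, e

# 1) Build a training set from a limited budget of simulator calls.
samples = [(x, simulator(x)) for x in [i / 20 for i in range(21)]]

def surrogate(x):
    """2) Cheap supervised model: 1-NN prediction from the closest sample."""
    return min(samples, key=lambda s: abs(s[0] - x))[1]

def fitness(x):
    """Scalarize the two objectives (the 0.3 weight is an assumption)."""
    y, e = surrogate(x)
    return y - 0.3 * e                # maximize yield, penalize energy

# 3) Simple (mu+lambda) evolutionary search over the surrogate.
pop = [rng.random() for _ in range(10)]
for _ in range(50):
    children = [min(1.0, max(0.0, p + rng.gauss(0, 0.1))) for p in pop]
    pop = sorted(pop + children, key=fitness, reverse=True)[:10]

best = pop[0]
print(round(best, 2))
```

In practice you would replace the 1-NN lookup with whatever supervised model fits your data (polynomial regression, ANN, etc.) and optionally re-run ASPEN at the surrogate's optimum to validate it.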
I'm looking for the implementation of advanced metaheuristic algorithms in Python. I would be incredibly thankful if someone could assist me with the execution of evolutionary algorithms or can provide me with the necessary codes in Python.
Thank you very much.
I am a beginner at the optimization of trusses with metaheuristic algorithms. Can anyone please help me find MATLAB code for optimizing a 10-bar truss?
What are the most effective operators that can be incorporated into metaheuristic algorithms to rectify the problem of low population diversity and hence premature convergence? Any additional tips on this matter would be very helpful.
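One operator frequently used against exactly this problem is "random immigrants": each generation, the worst few individuals are replaced with fresh random ones, so the population never fully collapses onto one basin. A hedged sketch (the replacement rate and bounds are assumptions, and the representation is a single real per individual for brevity):

```python
import random

def random_immigrants(pop, fitness, rng, rate=0.2, low=-5.0, high=5.0):
    """Replace the worst `rate` fraction of `pop` with random individuals.
    `pop` is a list of floats; lower fitness is better (minimization)."""
    pop = sorted(pop, key=fitness)              # best individuals first
    n_replace = max(1, int(rate * len(pop)))
    survivors = pop[:len(pop) - n_replace]
    immigrants = [rng.uniform(low, high) for _ in range(n_replace)]
    return survivors + immigrants

rng = random.Random(0)
pop = [1.0] * 10                                # fully converged population
pop = random_immigrants(pop, fitness=lambda x: x * x, rng=rng)
print(len(pop), len(set(pop)) > 1)
```

Other operators in the same spirit include mutation-rate adaptation triggered by a diversity threshold, niching/crowding, and opposition-based re-initialization; all of them plug into the generation loop at the same point as the function above.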
Suppose that we compare two metaheuristics X and Y on a given real problem: X returns a better solution than Y, while when we use the same metaheuristics to solve global optimization benchmark problems, Y returns a better solution than X. Does this make sense? What is the reason?
Are there any Python/Keras libraries or software packages where we can use different metaheuristic algorithms to train an artificial neural network? Or books that explain this methodology?
By software, I mean RapidMiner, MATLAB, KNIME, Microsoft Azure ML, etc., or Python/C++ libraries that make it possible to use different metaheuristic algorithms to train ANN and SVM models.
The goal is to compare a developed model against these different combinations of metaheuristic-trained models.
Like other meta-heuristic algorithms, some algorithms tend to suffer from low diversity, local optima, and unbalanced exploitation ability. Commonly stated improvement goals include:
1- Enhance the algorithm's exploratory and exploitative performance.
2- Overcome premature convergence (i.e., speed up convergence) and the ease of falling into (being trapped in) a local optimum.
3- Increase the diversity of the population and alleviate the premature convergence problem.
4- Fix an immature balance between exploitation and exploration.
5- Maintain the diversity of solutions during the search, so that stagnation towards sub-optimal solutions can be avoided and the convergence rate can be boosted to obtain more accurate optimal solutions.
6- Address slow convergence speed, the inability to jump out of local optima, and a fixed step length.
7- Improve population diversity in the search space.
I am looking for a free-of-charge international conference on metaheuristic algorithms or data mining. Can anyone help me?
Both genetic algorithms and metaheuristic algorithms are optimization algorithms. Is one of these two categories included under the other?
I have a multi-objective optimization with the following properties:
Objective functions: three minimization objectives (two non-linear functions and one linear function)
Decision variables: two real variables (bounded)
Constraints: three linear constraints (two bound constraints and one relationship constraint)
Problem type: non-convex
Solution required: Global optimum
I have used two heuristic algorithms to solve the problem: NSGA-II and NSGA-III.
I have run NSGA-II and NSGA-III for the following instances (population size, number of generations, maximum number of function evaluations (i.e., pop size x no. of gen)): (100, 10, 1000), (100, 50, 5000), (100, 100, 10000), (500, 10, 5000), (500, 50, 25000), and (500, 100, 50000).
The hypervolume increases with an increase in the number of function evaluations. However, for a given population size, as the number of generations increases the hypervolume decreases, which I think should rather increase. Why am I getting such a result?
I want to use the Firefly algorithm for feature selection. Can anyone help by suggesting different transfer (transformation) methods?
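The two standard families used to binarize continuous metaheuristics (firefly, PSO, ...) for feature selection are S-shaped (sigmoid) and V-shaped transfer functions, which map a continuous step to a bit-flip probability. A hedged sketch; the probabilistic update shown is the generic rule, not a specific firefly variant:

```python
import math
import random

def s_shaped(v):
    """S-shaped (sigmoid) transfer: maps a continuous value to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

def v_shaped(v):
    """V-shaped transfer: |tanh(v)|, symmetric around zero."""
    return abs(math.tanh(v))

def binarize(velocity, rng, transfer=s_shaped):
    """Turn a continuous velocity/step vector into a 0/1 feature mask:
    bit d is set with probability transfer(velocity[d])."""
    return [1 if rng.random() < transfer(v) else 0 for v in velocity]

rng = random.Random(0)
mask = binarize([-6.0, 6.0, 0.0], rng)   # strongly off, strongly on, coin flip
print(mask)
```

S-shaped functions bias each bit toward 0 or 1 directly, while V-shaped functions decide whether to *flip* the current bit; which family works better is problem-dependent and usually checked empirically.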
We have implemented four metaheuristic algorithms to solve an optimization problem. Each algorithm is repeated 30 times for an instance of the problem, and we have stored the best objective function values for 30 independent runs for each algorithm.
We want to compare these four algorithms. Apart from the maximum, minimum, average, and standard deviation, is there any other statistical measure for comparison?
Alternatively, we have four independent samples each of size 30, and we want to test the null hypothesis that the means (or, medians) of these four samples are equal against an alternative hypothesis that they are not. What kind of statistical test should we perform?
I intend to define a frequency domain objective function to design an optimal controller for Load-Frequency Control (LFC) of a power system. My purpose is to optimize this objective function (finding the proper location of zeros and poles) using meta-heuristic methods.
I will be happy if you share your valuable, relevant, and informative experiences, references, and articles in this field, including how to define such a function and how to code it.
We invite all researchers and practitioners who are developing algorithms, systems, and applications, to share their results, ideas, and experiences.
Topics of interest include, but are not limited to, the following:
Theoretical aspects of hybridization
Automated parameter tuning
Evolutionary Computation Algorithms
Filtering and enhancement
Edge detection and segmentation
Quantum Image Processing
more information: http://diegoliva.com/cfp/cbc_ahmip/
I'm working on some optimal strategies for an environmental surveillance network. My solution is largely based on meta-heuristics. I need to know the advantages and disadvantages of heuristic and meta-heuristic optimization.
Hello scientific community,
Have you noticed the following?
[I note that when a new algorithm is proposed, most researchers rush to improve it and apply it to the same and other problems. So I ask: why keep the original algorithm if it suffers from weaknesses, and why propose a new algorithm if an existing one already solves the same problems? I can understand a new algorithm that solves a previously unsolved problem — that is welcome — but otherwise, why?]
Therefore, I ask: does the scientific community need novel metaheuristic algorithms (MHs) rather than the existing ones?
I think we need to organize the existing metaheuristic algorithms and record the pros and cons of each one, along with the problems each has solved.
Redundant algorithms should disappear, as should overly complex ones.
Derivative algorithms should disappear as well.
We need to benchmark the MHs, similar to a benchmark test suite.
We also need to identify the unsolved problems; if you would like to propose a novel algorithm, try to solve an unsolved problem — otherwise, please stop.
Thanks, and I look forward to a reputable discussion.
I am conducting a study on how best to use hybrid methods to improve MPPT accuracy and speed. I've hit a stumbling block with the experiment design. I've been advised to use a factorial design, but I'm not sure which factors to use as independent variables. My (honours) research is based on using the grey wolf optimization and P&O methods in an MPPT. I have chosen the perturbation step size and the number of iterations as independent variables, and convergence time and accuracy as dependent variables. I've never done a factorial experiment before; are there any suggestions as to which independent variables would be worth studying?
I know that PSO is an iterative and evolutionary metaheuristic algorithm but, to save my energy, I need some clues about coding it in GAMS.
Is it possible to implement it in GAMS, or should I not struggle with it?
The routing problem can be easily solved using ILP or mixed ILP, so why are metaheuristic algorithms required to solve this problem?
Hi, I am working on a research paper in which I want to compare the performance of several (meta)heuristics (including a GA) in solving a certain problem. I have run each algorithm several times and found that my GA is not able to find the good solution that the other (meta)heuristics find in a short time. It converges to a solution which I know is not the best (because the other algorithms converge to a far better one). I have increased the mutation rate to 0.2 in order to avoid getting trapped in a local optimum, and my crossover rate is 0.9.
I want an acceptable comparison/evaluation of the performance of these algorithms, so
my question is: is there a problem with my GA, or can I simply report the GA solution and explain that it performs poorly?
Which critical solved and unsolved global, constrained, and complex optimization problems (combinatorial and engineering) are there?
Metaheuristic algorithms can be classified in many ways.
Several metaheuristic algorithms have been proposed to solve different optimization problems, but the already-solved problems keep being re-solved while the unsolved ones remain unsolved.
Please list the unsolved problems, or the critical solved problems that still need an optimal solution, to help researchers tackle the unsolved ones.
Thanks for your contributions.
Mathematical programming is the best optimization tool, with many years of strong theoretical background. It has been demonstrated that it can efficiently solve complex optimization problems on the scale of one million design variables. The methods are also very reliable, and there is mathematical proof of the existence of the solution and the globality of the optimum.
However, when there are discontinuities in the objective function, problems arise due to non-differentiability. Methods such as sub-gradients have been proposed to handle such cases. Still, I cannot find many papers in the state of the art on engineering optimization of discontinuous problems using mathematical programming; engineers mostly use metaheuristics for such cases.
Can all problems with discontinuities be solved with mathematical programming? Is it easy to implement sub-gradients for large-scale industrial problems? Do they work on non-convex problems?
A simple example of such a function is attached here.
Instead of manually tuning an algorithm's parameters, it is recommended to utilize automatic algorithm configuration software, mostly because it has been shown to increase an algorithm's performance manyfold. However, there are some differences among the proposed configuration tools, and besides those listed in (Eiben, Smit, 2011) it is important to gather experiences from researchers. I would like to hear how one decides on the stopping criteria, or on parameter values, for heuristic steps within a stochastic algorithm... there are so many questions.
I have experimental test databases for steel structures (bracing and outrigger). If any of you have interest and specialty in this area, you can join our research group. We have decided to combine an artificial neural network (ANN) with a metaheuristic algorithm to estimate the results of the experimental tests.
I can share more information if you are interested.
You may contact me here or my personal email address:
When a metaheuristic algorithm is applied, how can we say that the run is free from premature convergence or convergence to a local optimum?
Is it equal to the size of the population given initially?
E.g., if I take an initial population size = 100 and number of generations = 100, is it necessary that the size of the population in the 100th generation is 100?
I have taken population size = 100
and number of generations = 100 (default).
After running the GA in optimtool, I observed that the population obtained in the last generation is 100x28 (i.e., 100 individuals, each described by 28 variables).
Bat-inspired algorithm is a metaheuristic optimization algorithm developed by Xin-She Yang in 2010. This bat algorithm is based on the echolocation behaviour of microbats with varying pulse rates of emission and loudness.
The idealization of the echolocation of microbats can be summarized as follows: Each virtual bat flies randomly with a velocity vi at position (solution) xi with a varying frequency or wavelength and loudness Ai. As it searches and finds its prey, it changes frequency, loudness and pulse emission rate r. Search is intensified by a local random walk. Selection of the best continues until certain stop criteria are met. This essentially uses a frequency-tuning technique to control the dynamic behaviour of a swarm of bats, and the balance between exploration and exploitation can be controlled by tuning algorithm-dependent parameters in bat algorithm. (Wikipedia)
What are the applications of bat algorithm? Any good optimization papers using bat algorithm? Your views are welcome! - Sundar
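The update rules described above (frequency tuning, velocity/position update, local random walk near the current best, loudness-gated acceptance) can be sketched compactly. This is a hedged minimal version, not Yang's full algorithm: the pulse rate and loudness are held constant instead of following his adaptive schedules, and the parameter values are illustrative.

```python
import random

def bat_algorithm(f, dim=2, n_bats=20, n_iter=200, fmin=0.0, fmax=2.0,
                  loudness=0.5, pulse_rate=0.5, lb=-5.0, ub=5.0, seed=0):
    rng = random.Random(seed)
    x = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_bats)]
    v = [[0.0] * dim for _ in range(n_bats)]
    best = list(min(x, key=f))
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()    # frequency tuning
            cand = []
            for d in range(dim):
                v[i][d] += (x[i][d] - best[d]) * freq     # velocity update
                cand.append(min(ub, max(lb, x[i][d] + v[i][d])))
            if rng.random() > pulse_rate:                 # local random walk
                cand = [min(ub, max(lb, best[d] + 0.01 * rng.gauss(0, 1)))
                        for d in range(dim)]
            # accept the move through the loudness gate only if it improves
            if f(cand) <= f(x[i]) and rng.random() < loudness:
                x[i] = cand
            if f(x[i]) < f(best):
                best = list(x[i])
    return best

sphere = lambda p: sum(c * c for c in p)
best = bat_algorithm(sphere)
print(sphere(best))
```

On a simple sphere function this converges close to the origin; on harder problems the adaptive loudness/pulse-rate schedules from Yang (2010) matter for balancing exploration and exploitation.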
I have applied metaheuristic algorithms such as PSO and GA in my research field, which is recommender systems, but I have found these algorithms very time-consuming and not really practical, even though the results are better than those of the existing algorithms. In recommender systems we need fast algorithms. Thank you.
Some metaheuristics prove their superior performance on certain kinds of problems: some on continuous optimization problems, others on discrete or binary optimization problems.
Since the early 90's, metaheuristic algorithms have been continually improved in order to solve a wider class of optimization problems. To do so, different techniques, such as hybridized algorithms, have been introduced in the literature. I would appreciate it if someone could help me find some of the most important techniques used in these algorithms.
- Orthogonal learning
- Algorithms with dynamic population
I have a question, if someone can answer me please:
Are there other ways to assess the performance of a metaheuristic algorithm when there is no best-known solution to compare with (I created my own data set of instances)? I know that a comparison with lower bounds is possible, but I want to know whether other approaches exist. I am talking about a mono-objective problem; the metaheuristic I used is a variant of VNS.
Thank you in advance.
Recently, many metaheuristic algorithms have been published, mostly based on swarm intelligence. A promising future for this field lies in applying these algorithms to solve real problems in different sectors such as business, marketing, management, intelligent traffic systems, engineering, health care, and medicine. Please let's discuss their applications in the real world and share our case studies.
When two metaheuristic algorithms are combined into a hybrid metaheuristic, how can we evaluate the hybrid in order to make sure that it is better than the two original ones? Is it possible to do that with a small number of iterations (<3000)? Should we test the algorithms with the same population size (number of feasible solutions)?
I have a question, if someone can answer me please.
For the majority of metaheuristic algorithms, a step called initialization is necessary. There are several ways to perform it, such as random initialization, which requires setting upper and lower bounds. Now, I wonder whether a metaheuristic algorithm can be applied to unbounded problems.
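One common workaround when no box bounds are known is to initialize from a distribution centred on a guess (e.g. a Gaussian around the origin) and let the search expand from there, instead of sampling a fixed box. A hedged sketch contrasting the two; the centre and spread in the unbounded variant are modelling assumptions, not algorithm constants:

```python
import random

def init_bounded(n, dim, low, high, rng):
    """Standard random initialization inside known bounds [low, high]."""
    return [[rng.uniform(low, high) for _ in range(dim)] for _ in range(n)]

def init_unbounded(n, dim, rng, center=0.0, sigma=10.0):
    """Unbounded alternative: sample a Gaussian around a guessed center.
    If the optimum turns out to lie far away, sigma can be enlarged or the
    center re-estimated from the best solutions found so far."""
    return [[rng.gauss(center, sigma) for _ in range(dim)] for _ in range(n)]

rng = random.Random(0)
pop_a = init_bounded(5, 3, -1.0, 1.0, rng)
pop_b = init_unbounded(5, 3, rng)
print(len(pop_a), len(pop_b))
```

So yes, unbounded problems can be handled, but the initialization distribution then becomes a parameter of the method rather than something the problem dictates.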
We know that MATLAB offers two working environments: MATLAB script code and Simulink. For optimizing smart grid operation, many respected authors have used different types of metaheuristic algorithms, and most chose MATLAB code. Now my question: is it efficient to use Simulink instead of MATLAB code to run metaheuristic algorithms for smart grid optimization?
Hello, everyone. I am a student of electrical engineering and my research field is related to the optimization of power systems.
I know that the algorithm we should choose depends on our problem, but there are lots of heuristic and metaheuristic algorithms available to choose from. It also takes some time to understand a specific algorithm, and after that we may find that the chosen algorithm was not the best for the problem. So, given my problem, how can I choose the best algorithm?
Is there any simple approach available that can save my time as well?
Thank you for your precious time.
Basically, I have found many research papers where different types of metaheuristic algorithms are used for optimization of smart grid or smart microgrid operation. I just want to know which algorithms are essential for optimizing smart grid operation.
Hi dear researchers,
I want to perform the same optimization as the following fminsearchbnd run, but with a metaheuristic algorithm such as GA:
% Read data
filename = 'D_test2.xlsx';
sheet = 1;
xlRange = 'A2:E60';
Mm = xlsread(filename, sheet, xlRange);
% Get the data
HB = Mm(:,2);   % gross head (hauteur brute)
P  = Mm(:,3);   % power (puissance)
opts = optimset('fminsearch');
opts.Display = 'iter';
opts.TolFun = 1.e-12;
opts.MaxFunEvals = 100;
% NOTE: x1, k, n, and the starting point x0 must be defined before this
% point for the objective and the bounds below to be usable.
sse = @(x) sum((x1 - x(3)./(k.*x(2).*x(4))).^2);
LB = [36.5*ones(n,1) 50*ones(n,1) 0.73*ones(n,1)];
UB = [41.5*ones(n,1) 396*ones(n,1) 0.94*ones(n,1)];
[xsol,fval,exitflag,output] = fminsearchbnd(sse,x0,LB,UB,opts);
Here I expect to find a predictor vector which minimizes my objective function. I have attached the "D_test2" file needed by this program.
Hi all, I have the following questions:
1) Does a meta-heuristic (say a genetic algorithm or similar) or reinforcement learning converge faster for the same optimization problem?
2) Is the solution obtained using reinforcement learning an optimal one?
Thanks in advance
In problems with many local optima (multimodal) and many variables to optimize (multidimensional), which PSO variants provide:
- better exploration capabilities at the beginning of the search,
- the possibility of escaping local optima,
- the capability to find the optimal solution when it is not at the center of the coordinate system,
- better quality of the final solution (more exploitation in the final stage of the search process), and
- low computational load (fewer evaluations of the objective function, shorter computation times)?
It would be appreciated if your response also mentions the bibliographic source where the PSO variant is published.
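For reference, here is a hedged sketch of the canonical global-best PSO with inertia weight; the variants asked about modify exactly the marked pieces (inertia schedule, neighbourhood topology, restart or mutation for escaping local optima). The parameter values are common defaults, not tuned, and the test problem's optimum is deliberately placed off-centre, since bias toward the origin is one of the pitfalls mentioned above.

```python
import random

def pso(f, dim=2, n=30, iters=200, lb=-5.0, ub=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    x = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in x]
    gbest = list(min(pbest, key=f))
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # inertia + cognitive + social terms; variants change these
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(ub, max(lb, x[i][d] + v[i][d]))
            if f(x[i]) < f(pbest[i]):
                pbest[i] = list(x[i])
                if f(x[i]) < f(gbest):
                    gbest = list(x[i])
    return gbest

# Shifted sphere: optimum at (2, -3), deliberately off-centre.
shifted = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 3.0) ** 2
g = pso(shifted)
print(shifted(g))
```

A decreasing inertia schedule gives more exploration early and more exploitation late; ring (lbest) topologies slow information spread and help on multimodal landscapes at the price of convergence speed.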
In reference-point-based multi-objective optimization, the reference points are determined by the decision maker (DM).
What are the methods that can be used to calculate or determine reference points for a multi-objective optimization problem?
In my study, I am going to use the binary version of a metaheuristic algorithm for the feature selection problem on a big data set (for example, 1000 features or more), using metaheuristics such as binary PSO, binary ACO, etc.
But the point is: how can I select the relevant features? Are they determined automatically by the algorithm, or is their number fixed in advance, with the algorithm determining the best features for that fixed value?
What I want to know is whether there is any limitation on the number of desirable features that can be selected.
Thank you so much in advance.
How does Reinforcement Learning (RL) relate to Metaheuristic Search (MS) techniques like Genetic Algorithms, Particle Swarm Optimization, etc., and to the generative adversarial learning used in GANs? I feel that MS is similar to RL in the sense that it creates the next population based on feedback about the suitability of candidates in the present population via function evaluations. Similarly, the generator and discriminator networks in GANs train based on feedback from each other. What are your views? What are some other methods, apart from MS and GANs, that are similar to RL but not considered to be RL exactly?
I want to calculate the inverted generational distance (IGD) to evaluate the performance of a multi-objective optimization algorithm. I have the approximate Pareto fronts, but I could not find the true Pareto front for the structural engineering problems, such as the welded beam, spring, and gear design problems, etc. Does anyone have the data of the true PF for them?
It would be really good if the suggested journal doesn't spend much time in revision cycles, because I submitted this algorithm to the "Applied Soft Computing" journal one year ago, and after six revision cycles they rejected it with no real reasons given.
As we know, exploration and exploitation are two important criteria in a population-based metaheuristic. I could find some formulas for a quantitative comparison of the exploration ability among different algorithms, but how can we compare the exploitation ability quantitatively among different metaheuristic algorithms?
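One family of measures reported in the literature derives both quantities from population diversity: diversity is the mean distance of individuals from a dimension-wise centre (the median is often used), exploration% is the diversity relative to its peak over the run, and exploitation% is simply the complement. A hedged sketch of that calculation on a fake run whose population contracts over time:

```python
import random

def diversity(pop):
    """Mean absolute deviation of the population from the
    dimension-wise median, averaged over dimensions."""
    dim = len(pop[0])
    total = 0.0
    for d in range(dim):
        col = sorted(p[d] for p in pop)
        med = col[len(col) // 2]
        total += sum(abs(p[d] - med) for p in pop) / len(pop)
    return total / dim

def exploration_exploitation(div_history):
    """Per-iteration percentages, relative to the run's peak diversity."""
    dmax = max(div_history)
    expl = [100.0 * dv / dmax for dv in div_history]
    expt = [100.0 - e for e in expl]
    return expl, expt

# Fake a run whose population contracts as exploitation takes over.
rng = random.Random(0)
history = []
for t in range(5):
    spread = 1.0 / (t + 1)
    pop = [[rng.uniform(-spread, spread) for _ in range(3)] for _ in range(20)]
    history.append(diversity(pop))

expl, expt = exploration_exploitation(history)
print([round(e) for e in expt])
```

Because exploitation% is defined relative to each run's own peak diversity, the curves of two algorithms on the same problem and budget can be compared directly.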
I've read that for plastic analysis and design of tall buildings we can't use the VPS metaheuristic algorithm. Is this a general limitation of all algorithms, due to the complexity of the problem and of generating and combining the mechanisms, or is it something else?
I have a science project on metaheuristic algorithms using the Golden Section Search algorithm. A characteristic of this algorithm is that it optimizes one variable; how can I optimize multiple variables at the same time?
I hope somebody can help me, thank you.
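Golden-section search is inherently one-dimensional, but a common way to extend it is cyclic coordinate descent: minimize along one coordinate at a time with golden-section search, holding the others fixed, and sweep over the coordinates repeatedly. A hedged sketch (the sweep count and tolerance are assumptions; this works well on separable or mildly coupled functions but can stall on strongly coupled ones):

```python
import math

INV_PHI = (math.sqrt(5) - 1) / 2     # 1/phi, about 0.618

def golden_section(f, a, b, tol=1e-6):
    """Standard 1-D golden-section minimization on [a, b]."""
    c, d = b - INV_PHI * (b - a), a + INV_PHI * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - INV_PHI * (b - a)
        else:
            a, c = c, d
            d = a + INV_PHI * (b - a)
    return (a + b) / 2

def coordinate_descent(f, x0, lo, hi, sweeps=20):
    """Multi-variable minimization by repeated 1-D golden-section searches."""
    x = list(x0)
    for _ in range(sweeps):
        for i in range(len(x)):
            def line(t, i=i):         # f restricted to coordinate i
                y = list(x)
                y[i] = t
                return f(y)
            x[i] = golden_section(line, lo[i], hi[i])
    return x

# Separable quadratic with minimum at (1, -2).
f = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2
x = coordinate_descent(f, [0.0, 0.0], [-5, -5], [5, 5])
print([round(c, 3) for c in x])
```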
I want to simulate a routing protocol in a UWSN using a metaheuristic algorithm. We chose MATLAB as the simulator, we distribute the sensors in a 3D environment, and we want to measure energy consumption.
We may need to code: an energy consumption model, network lifetime, and propagation speed.
What else may be required? I need this for the implementation of a UWSN routing scheme. Please advise. Any recommendations?
Pareto solutions are found below the true PF in the case of the 3D DTLZ1 problem, whereas they are found above the true PF in the case of the 2D DTLZ1 problem.
I have checked and confirmed the problem formulation and the variable bounds (three variables are used).
The algorithm converges on DTLZ-2, 3, and 4.
What might be the cause, or the area of improvement, in the MO algorithm?
Reference: Deb K., Thiele L., Laumanns M., Zitzler E. (2005) Scalable Test Problems for Evolutionary Multiobjective Optimization. In: Abraham A., Jain L., Goldberg R. (eds) Evolutionary Multiobjective Optimization. Advanced Information and Knowledge Processing. Springer, London
My solution set in the Cartesian system lies on a diagonal line in the range x = 0.5 and y = 0.3, but the Pareto set lies on a diagonal line in the range x = 0.5 and y = 0.5.
Is it possible for the solution set to pass beyond the Pareto set?
Or is it possible for the solution set to be better than the Pareto set?
Can you recommend papers that prove theoretically that different versions of metaheuristic algorithms are actually giving similar performance? So far I have looked at the No Free Lunch (NFL) theorem and also read some papers that compare the performances, but these papers only report empirical results.
I want a list of meta-heuristic algorithms that use a velocity vector to update the solution's position in the next iteration, like PSO, GSA, DA (Dragonfly), ...
I would like to know the best strategy for an "incomplete" TSP. It is incomplete because I do not know the pairwise distances among the "cities", but I can evaluate the overall cost of travelling across all "cities" (each city is visited, and only once, etc.). I have to deal with ~3000 cities and the function that evaluates a path is pretty costly (~40 seconds), but I can make use of multiprocessing (I spread the computation over 20 processors). I have tried Tabu Search, which shows I can improve the solution with respect to previous "greedy" attempts. Unfortunately, Genetic Algorithms did not yield good solutions (the distance increases!), likely because crossover and/or mutation inject noise instead of useful variable information. I used "ordered" crossover and implemented mutation as swaps among contiguous cities. Maybe I should move to Simulated Annealing, but I am interested to know whether, for such a problem, there might still be a chance for GAs.
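Simulated annealing with 2-opt moves is indeed a natural next step after tabu search here: each move reverses one segment of the tour (a small, structure-preserving perturbation, unlike noisy crossover), and worse tours are accepted with a temperature-controlled probability. A hedged sketch; the cooling schedule and iteration budget are illustrative, and a cheap synthetic Euclidean cost stands in for the expensive ~40 s black-box evaluator (with that evaluator, the candidate evaluations in each iteration are what you would farm out to your 20 processors).

```python
import math
import random

def tour_cost(tour, pts):
    """Stand-in for the expensive black-box evaluator."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(pts, t0=1.0, cooling=0.999, iters=4000, seed=0):
    rng = random.Random(seed)
    tour = list(range(len(pts)))
    cost = tour_cost(tour, pts)
    best, best_cost, t = list(tour), cost, t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
        c = tour_cost(cand, pts)
        # accept improvements always, worse tours with Metropolis probability
        if c < cost or rng.random() < math.exp((cost - c) / t):
            tour, cost = cand, c
            if cost < best_cost:
                best, best_cost = list(tour), cost
        t *= cooling                                          # geometric cooling
    return best, best_cost

rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(30)]
start_cost = tour_cost(list(range(30)), pts)
best, best_cost = anneal(pts)
print(round(best_cost, 3), best_cost < start_cost)
```

With the real evaluator the budget of ~4000 sequential evaluations would be prohibitive, so a parallel variant (evaluating a batch of candidate moves per temperature step across the 20 processors, then accepting the best) is the usual adaptation.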