All content in this area was uploaded by J. David Schaffer on Nov 25, 2014
... We utilize Excel Solver to execute a GA with these default settings. Schaffer et al. (1989) suggest that these default parameters can be adjusted to better suit specific optimization tasks. Firstly, according to Schaffer et al., the population parameter can vary between 10 and 200, offering flexibility in balancing exploration and computational demands. ...
... Secondly, mutation, a key genetic operator, introduces random changes to individuals in the population, fostering diversity and the discovery of novel solutions. The mutation rate, usually falling between 0.001 and 0.1, dictates how often these mutations occur (Schaffer et al., 1989). Thirdly, the convergence value, often referred to as the tolerance, establishes the stopping criterion for the algorithm. Smaller convergence values enhance precision but may require more iterations and more computational time. ...
... Compared to 86.03, a tolerance as small as 0.0001 should guarantee that the algorithm attains a satisfactory level of precision. The other parameter settings fall within the range examined by Schaffer et al. (1989). In brief, these default settings for GA parameters are typically chosen to strike a balance between computational efficiency and the likelihood of finding a satisfactory solution within a reasonable timeframe. ...
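The interplay of these settings (population size, crossover rate, mutation rate, and a convergence tolerance as the stopping criterion) can be sketched with a minimal real-coded GA. The operator choices below (tournament selection, blend crossover, Gaussian mutation) and the 20-generation stall window are illustrative assumptions, not Solver's actual implementation:

```python
import random

def run_ga(fitness, bounds, pop_size=100, crossover_rate=0.85,
           mutation_rate=0.075, tol=1e-4, max_gens=300,
           stall_limit=20, seed=0):
    """Minimal real-coded GA (minimization). Stops when the best
    fitness improves by less than `tol` for `stall_limit`
    consecutive generations -- the 'convergence value' criterion."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    best = min(pop, key=fitness)
    stall = 0
    for _ in range(max_gens):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            if rng.random() < crossover_rate:   # blend crossover
                w = rng.random()
                child = w * p1 + (1 - w) * p2
            else:
                child = p1
            if rng.random() < mutation_rate:    # Gaussian mutation
                child += rng.gauss(0, 0.1 * (hi - lo))
            children.append(min(max(child, lo), hi))
        pop = children
        new_best = min(pop + [best], key=fitness)  # elitist tracking
        if fitness(best) - fitness(new_best) < tol:
            stall += 1
        else:
            stall = 0
        best = new_best
        if stall >= stall_limit:
            break
    return best

# usage: minimize (x - 3)^2 over [-10, 10]
x = run_ga(lambda v: (v - 3) ** 2, (-10, 10))
```

A larger population or a smaller `tol` trades computation time for precision, exactly the balance discussed above.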
... (12) [49].

Table 2 GA alternatives:
GA 1: population size, crossover rate, and mutation rate are all proposed by [41]
GA 2: population size and mutation rate are proposed by [42] and crossover rate by [41]
GA 3: population size and mutation rate are proposed by [41] and crossover rate by [42]
GA 4: population size, crossover rate, and mutation rate are all proposed by [43]

In Eq. (12), F is the CDF, the X_(i) are the ordered observations, F̄ = (1/n) Σ_{i=1}^{n} F(X_(i)), and n is the sample size. The RMSE is a statistical measure used to assess model performance in studies such as climate research and air quality. ...
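As a concrete reading of these goodness-of-fit measures, the sketch below computes the KS distance and the RMSE between the empirical CDF of ordered observations and a fitted two-parameter Weibull CDF. The exact form of Eq. (12) is not reproduced in the excerpt, so pairing it with these two statistics is an assumption:

```python
import math

def weibull_cdf(x, k, lam):
    """CDF of the two-parameter Weibull (shape k, scale lam)."""
    return 1.0 - math.exp(-((x / lam) ** k))

def gof_stats(data, k, lam):
    """KS distance and RMSE between the empirical CDF of `data`
    and the fitted Weibull(k, lam) CDF, over ordered observations."""
    xs = sorted(data)                     # the X_(i)
    n = len(xs)
    ks, sq = 0.0, 0.0
    for i, x in enumerate(xs, start=1):
        f = weibull_cdf(x, k, lam)
        emp = i / n                       # empirical CDF at X_(i)
        # KS compares against the step function on both sides
        ks = max(ks, abs(f - emp), abs(f - (i - 1) / n))
        sq += (f - emp) ** 2
    return ks, math.sqrt(sq / n)
```

Both statistics are computed from the same ordered sample, which is why the snippet's notation emphasizes the X_(i).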
Wind speed modeling is a crucial tool for promoting sustainable energy use by reducing fossil fuel dependence. It underpins the energy efficiency of wind turbines and the assessment of wind energy potential for renewable energy development. The Weibull distribution is commonly used in wind speed modeling to characterize wind speed patterns, owing to its flexibility and effectiveness. Therefore, this paper focuses on parameter estimation for the Weibull distribution using genetic algorithm (GA) optimization based on the maximum likelihood (ML) method. This study evaluates different fitness functions and different GA parameter sets, including population size, crossover rate, and mutation rate. The proposed fitness function in the GA is used to estimate the shape parameter of the Weibull distribution. The proposed GA alternatives are evaluated using the Kolmogorov–Smirnov (KS) statistic, coefficient of determination (R²), root mean square error (RMSE), Akaike information criterion, Bayesian information criterion, and power density error (PDE) over three different wind speed datasets. Grey relational analysis is used to rank the GA alternatives. The best GA alternative is also compared to particle swarm optimization based on maximum likelihood estimation and provides satisfactory results in terms of R², RMSE, and PDE. A simulation study is performed to evaluate the performance of the GA alternatives with respect to a deficiency criterion. Finally, we recommend the GA 1 and GA 3 alternatives for Weibull parameter estimation; these alternatives support the selection of GA parameter sets in practice.
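The ML-based fitness at the core of such a GA can be sketched as the negative Weibull log-likelihood, which the GA minimizes. The parameterization below (shape k, scale lam, the usual two-parameter form) is an assumption about the paper's setup:

```python
import math

def weibull_neg_loglik(params, data):
    """Negative log-likelihood of Weibull(k, lam) over `data`
    (positive wind speeds) -- a natural GA fitness to minimize
    for ML-based parameter estimation."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")   # infeasible chromosome: worst fitness
    ll = 0.0
    for x in data:            # assumes all x > 0 (wind speeds)
        z = x / lam
        ll += math.log(k / lam) + (k - 1) * math.log(z) - z ** k
    return -ll
```

Returning infinity for non-positive parameters lets an unconstrained GA discard infeasible individuals without special-case operators.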
... This package in particular and genetic algorithms in general have lots of parameters or knobs to tune for good performance. Since [26] provides robust settings after extensive experiments on the same test suite as mine, I used the settings suggested there. For reference, the list of the parameters and their settings is also given in Table 1. ...
... This test suite was a standard test suite in the genetic algorithms community in 1990s. The last two functions (F6 and F7) are from [26]. These two functions were specifically designed to optimize the control parameters of genetic search. ...
The traveling salesman problem (TSP) and the graph partitioning problem (GPP) are two important combinatorial optimization problems with many applications. Due to the NP-hardness of these problems, heuristic algorithms are commonly used to find good, or hopefully near-optimal, solutions. Kernighan and Lin have proposed two of the most successful heuristic algorithms for these problems: The Lin-Kernighan (LK) algorithm for TSP and the Kernighan-Lin (KL) algorithm for GPP. Although these algorithms are problem specific to TSP and GPP, they share a problem-agnostic mechanism, called variable depth search, that has wide applicability for general search. This paper expresses this mechanism as part of a general search algorithm, called the Kernighan-Lin Search algorithm, to facilitate its use beyond the TSP and GPP problems. Experimental comparisons with other general search algorithms, namely, genetic algorithms, hill climbing, and simulated annealing, on function optimization test suites confirm that the new algorithm is very successful in solution quality and running time.
... For MADCP, we tune the following parameters: population size (from 50 to 500), number of generations (from 50 to 200), crossover rate (from 0.6 to 1.0), light absorption coefficient (γ) (from 0.1 to 1.0), and inertia weight (ω) (from 0.4 to 0.9). For GA, following the guidance from [48], we tune the following parameters: population size (from 50 to 500), number of generations (from 50 to 200), crossover rate (from 0.6 to 1.0), and mutation rate (from 0.01 to 0.05). For FA, following the guidance from [49], we focus on tuning: number of fireflies (from 20 to 200), randomization parameter (α) (from 0.1 to 1.0), attractiveness coefficient (β) (from 0.1 to 1.0), and light absorption coefficient (γ) (from 0.1 to 1.0). ...
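One simple way to explore ranges like these is plain random search over the stated intervals. The ranges below are the GA ranges quoted above; the uniform sampling strategy and trial budget are illustrative assumptions, not the tuning procedure used in the cited work:

```python
import random

# GA tuning ranges quoted in the text above
GA_RANGES = {
    "population_size": (50, 500),
    "generations": (50, 200),
    "crossover_rate": (0.6, 1.0),
    "mutation_rate": (0.01, 0.05),
}

def sample_config(ranges, rng):
    """Draw one configuration uniformly from `ranges`;
    integer-bounded parameters are rounded to integers."""
    cfg = {}
    for name, (lo, hi) in ranges.items():
        v = rng.uniform(lo, hi)
        cfg[name] = round(v) if isinstance(lo, int) else v
    return cfg

def random_search(evaluate, ranges, trials=20, seed=1):
    """Keep the configuration with the lowest evaluation score."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(trials):
        cfg = sample_config(ranges, rng)
        score = evaluate(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

In practice `evaluate` would run the GA with the sampled configuration and return the metric being tuned (e.g., response time).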
... During the training phase, the video resolution is set to 480p, while in the evaluation phase it is reduced to 240p to test the adaptability of the different techniques. In the training phase, across all three metrics, IMPALA consistently demonstrates superior performance, achieving the fastest convergence (around iterations 40-50) and the lowest final values. PPO emerges as the second-best performer, followed by A3C with moderate performance. ...
The growing IoT landscape requires effective server deployment strategies to meet demands including real-time processing and energy efficiency. This is complicated by heterogeneous, dynamic applications and servers. To address these challenges, we propose ReinFog, a modular distributed software framework empowered with Deep Reinforcement Learning (DRL) for adaptive resource management across edge/fog and cloud environments. ReinFog enables the practical development and deployment of various centralized and distributed DRL techniques for resource management in edge/fog and cloud computing environments. It also supports integrating native and library-based DRL techniques for diverse IoT application scheduling objectives. Additionally, ReinFog allows for customizing deployment configurations for different DRL techniques, including the number and placement of DRL Learners and DRL Workers in large-scale distributed systems. We also propose a novel Memetic Algorithm for DRL Component (e.g., DRL Learner and DRL Worker) Placement in ReinFog, named MADCP, which combines the strengths of the Genetic Algorithm, Firefly Algorithm, and Particle Swarm Optimization. Experiments reveal that the DRL mechanisms developed within ReinFog significantly enhance the implementation of both centralized and distributed DRL techniques. These advancements result in notable improvements in IoT application performance, reducing response time by 45%, energy consumption by 39%, and weighted cost by 37%, while maintaining minimal scheduling overhead. Additionally, ReinFog exhibits remarkable scalability: raising the number of DRL Workers from 1 to 30 causes only a 0.3-second increase in startup time and around 2 MB more RAM per Worker. The proposed MADCP for DRL component placement further accelerates the convergence rate of DRL techniques by up to 38%.
... Screenshots illustrating the three different BIM inputs used in the methodology are displayed in Figure 7. Various researchers have suggested different optimal sizes for the population parameter. Recommended population sizes range from as low as 16 [80] to approximately 20-30 [67,81], and occasionally extend to larger ranges such as 50-100. Taking into account the findings of other researchers, we chose a population size range of 20-100. ...
This study introduces a new approach by applying the Whale Optimization Algorithm (WOA) to create construction schedules using geometric data from Building Information Modeling (BIM). The algorithm utilizes 3D model information to establish stability criteria, which are organized in a Directed Design Structure Matrix (DSM). These criteria are integrated into the WOA Fitness function to enhance the constructability of schedules, where each schedule is symbolized as a unique whale. Through iterative WOA computations, the approach consistently achieves maximum constructability scores starting from randomly generated schedules, affirming the efficacy of this method. The results reveal that the proposed algorithm effectively produced fully executable project schedules from diverse inputs. Despite variations in computational times due to different input parameters, the experiments verified the consistent generation of schedules that are 100% executable.
... Considering the existing experience, the most appropriate crossover rate ranges between 0.75 and 0.9 (Schaffer, 1989), and the mutation rate should be much lower than the crossover rate (Cabrera et al. 2002). The crossover rate and the mutation rate were set to 0.9 and 1/n (with n the number of decision variables) respectively in NSGA-II (Deb et al., 2002) for real-coded GAs. ...
As the work environment changes dynamically in real-world manufacturing systems, dynamic flexible job shop scheduling is an essential problem in operations research. Some works have taken rescheduling approaches to solve it as a multi-objective optimization problem. However, previous studies focus more on solution quality improvements while ignoring computation time. To get a quick response in the dynamic scenario, this paper develops a fully parallel Non-dominated Sorting Genetic Algorithm-II (NSGA-II) on GPUs and uses it to solve the multi-objective dynamic flexible job shop scheduling problem. The mathematical model is NP-hard; it considers newly arriving jobs and seeks a trade-off between shop efficiency and schedule stability. The proposed algorithm can be executed entirely on GPUs with minimal data exchange, while parallel strategies are used to accelerate the ranking and crowding mechanisms. Finally, numerical experiments are conducted. As our approach keeps the original structure of the conventional NSGA-II without sacrificing solution quality, it outperforms other GPU-based parallel methods on four metrics. Moreover, a case study of a large-size instance is simulated at the end and displays the conflicting relationship between the two objectives.
... To comprehensively assess the effectiveness of the GA, this study employs multiple sets of crossover and mutation rates. Specifically, we utilize nine sets of crossover and mutation rate parameters from [54], encompassing mutation rates of 0.001, 0.01, and 0.1, as well as crossover rates of 0.05, 0.55, and 0.95. Table 6 summarizes the key parameters used in the simple GA. ...
... It is well known that the probabilities of crossover and mutation play a crucial role in the performance of the genetic algorithm [50]. However, traditional genetic algorithms use fixed probabilities, which hinders their ability to maintain population diversity and lead to early convergence to local optima [51]. To address this problem, Srinivas et al. proposed the adaptive genetic algorithm wherein the probabilities of crossover and mutation are varied depending on the fitness values of the solutions [48]. ...
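The core of that adaptive scheme can be sketched as fitness-dependent probabilities (maximization convention): near-best solutions are disrupted less, below-average solutions get the full rates, preserving diversity without a fixed schedule. The constants and the simplification of using a single fitness value (the original scheme uses the better parent's fitness for the crossover probability) are illustrative:

```python
def adaptive_rates(f, f_max, f_avg, k1=1.0, k2=0.5):
    """Fitness-dependent crossover/mutation probabilities in the
    spirit of Srinivas & Patnaik's adaptive GA (maximization).
    f: fitness of the solution; f_max/f_avg: population best/mean."""
    if f_max == f_avg:                  # degenerate population:
        return k1, k2                   # fall back to maximum rates
    if f >= f_avg:
        # above-average: rates shrink linearly toward 0 at the best
        scale = (f_max - f) / (f_max - f_avg)
        return k1 * scale, k2 * scale
    # below-average: apply the maximum rates to promote diversity
    return k1, k2
```

Note how the best individual (f == f_max) gets probabilities of zero, so elite solutions survive intact while weak ones are aggressively recombined and mutated.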
Herd effect is a widespread phenomenon in real-world situations. This study explores how the herd effect can be used to manage network flow effectively. We examined its impact on travelers’ route choices and propose a mixed network flow evolution process that incorporates the herd effect, considering two types of travelers: those who receive route subsidy information and those who do not. Based on this evolution process, we developed a bi-level optimization model to determine the optimal subsidized routes, the subsidy amounts per kilometer, and the proportion of travelers receiving subsidy information. A hybrid algorithm with two iterative procedures was proposed to solve the model, in which the adaptive genetic algorithm (AGA) was employed to solve the upper-level nonlinear mixed-integer programming problem, and the partial linearization method was used to solve the lower-level network flow evolution process. Numerical results indicate that the presence of herd effect can effectively reduce both the total travel time of the network and the overall subsidy costs. The findings of this study have significant implications for the utilization of the herd effect in designing navigation software and developing congestion pricing strategies.
... Since small islands with migration lead to premature convergence while the algorithm with large islands converges more slowly [31], we set 64 (8×8) individuals per island. Furthermore, the crossover rate and mutation rate are set to 0.9 and 0.1 respectively, based on the existing experience that the most appropriate crossover rate ranges between 0.75 and 0.9 [36] and that the mutation rate should be much lower than the crossover rate [37]. ...
Due to new government legislation, customers' environmental concerns, and the continuously rising cost of energy, energy efficiency has become an essential parameter of industrial manufacturing processes in recent years. Most efforts considering energy issues in scheduling problems have focused on static scheduling. In fact, however, real-world scheduling problems are dynamic, with uncertain new jobs arriving during execution. This paper proposes a dynamic energy-efficient flexible flow shop scheduling model using peak power value with consideration of newly arriving jobs. As the problem is strongly NP-hard, a priority-based hybrid parallel Genetic Algorithm with a predictive-reactive complete rescheduling approach is developed. To achieve the speedup needed for short response times in the dynamic environment, the proposed method is designed to be highly consistent with the NVIDIA CUDA software model. Finally, numerical experiments are conducted and show that our approach not only achieves better performance than the traditional static approach, but also gains competitive results while reducing time requirements dramatically.
The percentages of losses in water supply systems are alarming. Hydraulic modeling and simulation are widely used resources for evaluating network behavior and identifying better operating conditions and configurations. To this end, EPANET is widely used software because of its open-source code and the possibility of programming in different languages. In this study, a methodology was proposed in which two theoretical networks with different sizes and configurations were developed, but with pressures higher than the range recommended in the standard. The aim of the study was to apply and compare two algorithms developed in Python and coupled with EPANET: a random search and one based on evolutionary theory (genetic algorithms), with the goal of bringing the network pressures within the range established in the standard. Nine GA scenarios were tested, varying recombination and mutation parameters. Both algorithms were able to adjust the network pressures to Brazilian standards, reducing the percentages of losses with relatively similar results. Among the studied scenarios, the mutation rate had a greater influence than the recombination rate, with lower losses in scenarios using a mutation rate of 5%.
Keywords: hydraulic simulation; water distribution networks; genetic algorithms