Fig 3 - uploaded by Simone A. Ludwig
Clustering results for the Mouse data set, where the black boxes represent the centroids. 3(a) The original Mouse data set. 3(b) The clustering results with CGSO using fitness function F3. 3(c) The clustering results with K-means.
TABLE IV: Running time and number of iterations
Source publication
High-quality clustering techniques are required for the effective analysis of growing data. Clustering is a common data mining technique used to analyze homogeneous groups of data instances based on their specifications. Clustering-based nature-inspired optimization algorithms have received much attention as they have the ability to find better...
Contexts in source publication
Context 1
... CGSO-F3 obtained the best entropy for the VaryDensity data set. Figure 3 visualizes the clustering quality results for the Mouse data set (the best run is selected from the highest function results). Figure 3(b) shows the clustering results of CGSO-F3, which obtains the best results among the three fitness functions, and Figure 3(c) shows the clustering results of K-means. ...
Context 2
... 3 shows the visualization of the clustering quality results for the Mouse data set (the best run is selected from the highest function results). Figure 3(b) shows the clustering results of CGSO-F3, which obtains the best results among the three fitness functions, and Figure 3(c) shows the clustering results of K-means. It can be seen that CGSO-F3 is able to assign the data instances to the correct clusters, achieving the highest purity of 0.896, while K-means achieves a purity of 0.827. ...
Citations
... Subsequently, the decision variables (UAV paths) are updated according to the selection order. The first phase optimizes the flight path through the JADE algorithm (line 11), and the second phase refines the flight path through the Dubins algorithm (line 13), followed by an update of the environmental variables b and the selection of the next parent subpopulation (lines 15-22). Finally, the algorithm outputs a path planning solution that effectively selects decision variables during the planning process, improves the computational efficiency of the algorithm, and ensures the efficient and safe operation of the UAVs. ...
When dealing with UAV path planning problems, evolutionary algorithms demonstrate strong flexibility and global search capabilities. However, as the number of UAVs increases, the scale of the path planning problem grows exponentially, leading to a significant rise in computational complexity. The Cooperative Co-Evolutionary Algorithm (CCEA) effectively addresses this issue through its divide-and-conquer strategy. Nonetheless, the CCEA needs to find a balance between computational efficiency and algorithmic performance while also resolving convergence difficulties arising from the increased number of decision variables. Moreover, the complex interrelationships between the decision variables of each UAV add to the challenge of selecting appropriate decision variables. To tackle this problem, we propose a novel collaborative algorithm called CCEA-ADVS. This algorithm reduces the difficulty of the problem by decomposing high-dimensional variables into sub-variables for collaborative optimization. To improve the efficiency of decision variable selection in the collaborative algorithm and to accelerate the convergence speed, an adaptive decision variable selection strategy is introduced. This strategy selects decision variables according to the order of solving single-UAV constraints and multi-UAV constraints, reducing the cost of the optimization objective. Furthermore, to improve computational efficiency, a two-stage evolutionary optimization process from coarse to fine is adopted. Specifically, the Adaptive Differential Evolution with Optional External Archive algorithm (JADE) is first used to optimize the waypoints of the UAVs to generate a basic path, and then, the Dubins algorithm is combined to optimize the trajectory, yielding the final flight path. 
The experimental results show that in four different scenarios involving 40 UAVs, the CCEA-ADVS algorithm significantly outperforms the Grey Wolf Optimizer (GWO), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and JADE algorithms in terms of path performance, running time, computational efficiency, and convergence speed. In addition, in large-scale experiments involving 500 UAVs, the algorithm also demonstrates good adaptability, stability, and scalability.
... During each restart, the obstacle list is cleared (line 9), and then paths are optimized using PSO according to priority, with the obstacle list updated to avoid conflicts (lines 11-14). The restart strategy randomly reassigns priorities and repeats path planning until the preset number of restarts is reached (lines 8-17). Finally, the algorithm outputs a path planning solution that can adapt to complex environments and ensure the efficient and safe operation of the UAV. ...
Evolutionary algorithms exhibit flexibility and global search advantages in multi-UAV path planning, effectively addressing complex constraints. However, when there are numerous obstacles in the environment, especially narrow passageways, the algorithm often struggles to quickly find a viable path. Additionally, collaborative constraints among multiple UAVs complicate the search space, making algorithm convergence challenging. To address these issues, we propose a novel hybrid particle swarm optimization algorithm called PPSwarm. This approach initially employs the RRT* algorithm to generate an initial path, rapidly identifying a feasible solution in complex environments. Subsequently, we adopt a priority planning method to assign priorities to UAVs, simplifying collaboration among them. Furthermore, by introducing a path randomization strategy, we enhance the diversity of the particle swarm, thereby avoiding local optimum solutions. The experimental results show that, in comparison to algorithms such as DE, PSO, ABC, GWO, and SPSO, the PPSwarm algorithm demonstrates significant advantages in terms of path quality, convergence speed, and runtime when addressing path planning issues for 40 UAVs across four different scenarios. In larger-scale experiments involving 500 UAVs, the proposed algorithm also exhibits excellent processing capability and scalability.
... The algorithm attempts to minimize the squared error function or objective function presented in Eq. (14). ...
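The squared-error objective referenced as Eq. (14) is the standard K-means criterion: the sum of squared distances from each point to its assigned centroid. A minimal sketch (the variable names are illustrative, not from the paper):

```python
import numpy as np

def sse(points, centroids, labels):
    # Sum of squared Euclidean distances from each point to its
    # assigned centroid: J = sum_i ||x_i - c_{label_i}||^2
    diffs = points - centroids[labels]
    return float(np.sum(diffs ** 2))

pts = np.array([[0.0, 0.0], [0.0, 2.0], [10.0, 0.0], [10.0, 2.0]])
cents = np.array([[0.0, 1.0], [10.0, 1.0]])
lbls = np.array([0, 0, 1, 1])
print(sse(pts, cents, lbls))  # each point is distance 1 from its centroid -> 4.0
```

K-means alternates between reassigning labels and recomputing centroids, each step of which can only decrease this value.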
This study presents the K-means clustering-based grey wolf optimizer, a new algorithm intended to improve the optimization capabilities of the conventional grey wolf optimizer in order to address the problem of data clustering: the process that groups similar items within a dataset into non-overlapping groups. Grey wolf hunting behaviour served as the model for the grey wolf optimizer; however, it frequently lacks the exploration and exploitation capabilities that are essential for efficient data clustering. This work mainly focuses on enhancing the grey wolf optimizer using a new weight factor and the K-means algorithm concepts in order to increase variety and avoid premature convergence. Using a partitional clustering-inspired fitness function, the K-means clustering-based grey wolf optimizer was extensively evaluated on ten numerical functions and multiple real-world datasets with varying levels of complexity and dimensionality. The methodology is based on incorporating the K-means algorithm concept for the purpose of refining initial solutions and adding a weight factor to increase the diversity of solutions during the optimization phase. The results show that the K-means clustering-based grey wolf optimizer performs much better than the standard grey wolf optimizer in discovering optimal clustering solutions, indicating a higher capacity for effective exploration and exploitation of the solution space. The study found that the K-means clustering-based grey wolf optimizer was able to produce high-quality cluster centres in fewer iterations, demonstrating its efficacy and efficiency on various datasets. Finally, the study demonstrates the robustness and dependability of the K-means clustering-based grey wolf optimizer in resolving data clustering issues, which represents a significant advancement over conventional techniques.
In addition to addressing the shortcomings of the initial algorithm, the incorporation of K-means and the innovative weight factor into the grey wolf optimizer establishes a new standard for further study in metaheuristic clustering algorithms. The performance of the K-means clustering-based grey wolf optimizer is around 34% better than the original grey wolf optimizer algorithm for both numerical test problems and data clustering problems.
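As a rough illustration of the baseline being improved here, the following sketches the standard grey wolf optimizer position update on a toy objective; the paper's K-means seeding and weight factor are not reproduced, and all parameter choices below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gwo_step(wolves, fitness, a):
    # Standard GWO update: every wolf moves toward positions suggested by
    # the three best wolves (alpha, beta, delta).
    order = np.argsort([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[order[0]], wolves[order[1]], wolves[order[2]]
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        guides = []
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random(x.shape) - a  # shrinks as 'a' decays -> exploitation
            C = 2 * rng.random(x.shape)
            guides.append(leader - A * np.abs(C * leader - x))
        new[i] = np.mean(guides, axis=0)
    return new

# Minimize the 2-D sphere function ||x||^2 with 5 wolves.
sphere = lambda w: float(np.sum(w ** 2))
wolves = rng.uniform(-5, 5, size=(5, 2))
for t in range(50):
    wolves = gwo_step(wolves, sphere, a=2 * (1 - t / 50))  # 'a' decays from 2 toward 0
```

In a clustering setting, each wolf would encode a full set of candidate cluster centres and the fitness would be a clustering criterion such as the squared-error objective.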
... We use entropy as a quality measure for a clustering algorithm [17,18]. It is calculated using Eq. ...
... 5.2b Entropy: Entropy is used to check the algorithm's ability to find semantic classes within each cluster, calculated by Eq. (6) [17,18]. Entropy ranges from 0 to 1, and the best value is 0. ...
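A common form of the clustering entropy measure (the exact Eq. (6) is not reproduced here) weights each cluster's class entropy by the cluster's size; taking logarithms base k, the number of classes, keeps the value in [0, 1]:

```python
import math
from collections import Counter

def clustering_entropy(labels_true, labels_pred):
    # Weighted average of per-cluster class entropy; log base = number of
    # classes, so the value falls in [0, 1], with 0 (pure clusters) being best.
    n = len(labels_true)
    k = len(set(labels_true))  # number of true classes (assumed >= 2)
    clusters = {}
    for t, p in zip(labels_true, labels_pred):
        clusters.setdefault(p, []).append(t)
    total = 0.0
    for members in clusters.values():
        counts = Counter(members)
        e = -sum((c / len(members)) * math.log(c / len(members), k)
                 for c in counts.values())
        total += (len(members) / n) * e
    return total

print(clustering_entropy([0, 0, 1, 1], [0, 0, 1, 1]))  # two pure clusters -> 0.0
print(clustering_entropy([0, 1, 0, 1], [0, 0, 1, 1]))  # maximally mixed -> 1.0
```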
K-medoids clustering algorithm is a simple yet effective algorithm that has been applied to solve many clustering problems. Instead of using the mean point as the centre of a cluster, K-medoids uses an actual point to represent it. The medoid is the most centrally located object of the cluster, with the minimum sum of distances to the other points. K-medoids can correctly represent the cluster centre as it is robust to outliers. However, the K-medoids algorithm is unsuitable for clustering arbitrarily shaped groups of objects and large-scale datasets. This is because it uses compactness as a clustering criterion instead of connectivity. An improved K-medoids algorithm based on the crow search algorithm is proposed to overcome the above problems. This research uses the crow search algorithm to improve the balance between the exploration and exploitation processes of the K-medoids algorithm. Experimental comparisons show that the proposed improved algorithm performs better than other competitors.
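The medoid definition above (the most centrally located object, with minimum sum of distances to the others) can be sketched directly; the outlier in this toy cluster illustrates the robustness claim:

```python
import numpy as np

def medoid(points):
    # The medoid is the member with the minimum total distance to all other
    # members of the cluster (an actual data point, unlike a K-means centroid).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    return int(np.argmin(d.sum(axis=1)))

cluster = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [100.0, 0.0]])
print(medoid(cluster))  # index 2: the outlier at x=100 barely shifts the medoid,
                        # whereas the mean would be dragged out to x=21.2
```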
... For each cluster, purity is the ratio of the number of data points belonging to the majority class to the total number of data points, calculated by Eq. 17. By maximizing purity, we obtain better results [75,76]. ...
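A sketch of the purity measure as described (majority-class count per cluster, aggregated over all points); the exact Eq. 17 is not reproduced here:

```python
from collections import Counter

def purity(labels_true, labels_pred):
    # For each cluster, count the most frequent true class; purity is the
    # sum of those majority counts divided by the total number of points.
    clusters = {}
    for t, p in zip(labels_true, labels_pred):
        clusters.setdefault(p, []).append(t)
    majority = sum(Counter(m).most_common(1)[0][1] for m in clusters.values())
    return majority / len(labels_true)

print(purity([0, 0, 1, 1, 1], [0, 0, 0, 1, 1]))  # (2 + 2) / 5 = 0.8
```

Purity of 1.0 means every cluster contains points of a single class, which matches the comparison in the text (CGSO-F3 at 0.896 versus K-means at 0.827 on the Mouse data set).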
Clustering analysis is widely used in many areas such as document grouping, image recognition, web search, business intelligence, bioinformatics, and medicine. Many algorithms with different clustering approaches have been proposed in the literature. As they are easy and straightforward, partitioning methods such as K-means and K-medoids are the most commonly used algorithms. These are greedy methods that gradually improve clustering quality, are highly dependent on initial parameters, and can get stuck in local optima. For this reason, in recent years, heuristic optimization methods have also been used in clustering. These heuristic methods can provide successful results because they have mechanisms to escape local optima. In this study, for the first time, the Artificial Algae Algorithm was used for clustering and compared with ten well-known bio-inspired metaheuristic clustering approaches. The efficiency of the proposed AAA clustering is evaluated using statistical analysis, convergence rate analysis, Wilcoxon’s test, and rankings over different cluster evaluation measures on 25 well-known public datasets with different difficulty levels (features and instances). The results demonstrate that the AAA clustering method provides more accurate solutions with a higher convergence rate than other existing heuristic clustering techniques.
... Many optimization techniques are inspired by the social behavior of creatures in searching for food such as particle swarm optimization [110], [111], ant colony [118], artificial bee colony [119], [120], [121], glowworm [122], [123], firefly [124], [125], cuckoo search [126], bat search [127], and hunting search [128]. The swarm optimization algorithms are simple, robust, and do not require the implementation of complex mathematical formulations [129]. ...
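As an illustration of the simple, robust swarm template these food-searching methods share, here is the canonical particle swarm update on a toy objective (the parameter values are common defaults, not taken from any cited paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    # Canonical particle swarm update:
    # v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

# Minimize the 2-D sphere function with 10 particles over 100 iterations.
x = rng.uniform(-5, 5, (10, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.sum(x ** 2, axis=1)
for _ in range(100):
    gbest = pbest[np.argmin(pbest_f)]
    x, v = pso_step(x, v, pbest, gbest)
    f = np.sum(x ** 2, axis=1)
    improved = f < pbest_f  # keep each particle's best-seen position
    pbest[improved], pbest_f[improved] = x[improved], f[improved]

print(float(pbest_f.min()))  # close to 0
```

Note there are no gradients or complex mathematical formulations involved, which is exactly the appeal the text describes.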
Switched reluctance machines (SRMs) have recently attracted more interest in many applications due to the volatile prices of rare-earth permanent magnets (PMs) used in permanent magnet synchronous machines (PMSMs). They also have rugged construction and can operate at high speeds and high temperatures. However, acoustic noise and high torque ripples, in addition to the relatively low torque density, present significant challenges. Geometry and topology optimization are applied to overcome these challenges and enable SRMs to compete with PMSMs. Key geometric design parameters are optimized to minimize various objective functions within geometry optimization. On the other hand, the material distribution in a particular design space within the machine domain may be optimized using topology optimization. We discuss how these techniques are applied to optimize the geometries and topologies of SRMs to enhance machine performance. As optimizing the machine geometry and material distribution at the design phase is of substantial significance, this work offers a comprehensive literature review on the current state of the art and the possible trends in the optimization techniques of SRMs. The paper also reviews different configurations of SRMs and stochastic and deterministic optimization techniques utilized in optimizing different configurations of the machine.
... The glowworm swarm optimization (GSO) is a nature-inspired optimization algorithm based on the natural behavior of glowworms, which control their light emission and use it for different purposes [191]. The K-means algorithm was combined with basic glowworm swarm optimization by Zhou et al. [192] in their novel K-means image clustering algorithm based on GSO, termed ICGSO, to effectively overcome the problems inherent in the K-means algorithm and produce better clustering quality. ...
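A minimal sketch of the basic GSO loop described above, with a luciferin update followed by movement toward brighter neighbours. Parameters are common defaults, and the neighbour choice is simplified to a greedy pick (the original algorithm uses a probabilistic choice and an adaptive decision range):

```python
import numpy as np

rng = np.random.default_rng(2)

def gso_step(x, luciferin, objective, rho=0.4, gamma=0.6, step=0.03, radius=2.0):
    # Luciferin update: decay the old value, add the current fitness.
    luciferin = (1 - rho) * luciferin + gamma * objective(x)
    new_x = x.copy()
    for i in range(len(x)):
        d = np.linalg.norm(x - x[i], axis=1)
        nbrs = np.where((d < radius) & (luciferin > luciferin[i]))[0]
        if len(nbrs) > 0:
            j = nbrs[np.argmax(luciferin[nbrs])]  # greedy pick of brightest neighbour
            new_x[i] = x[i] + step * (x[j] - x[i]) / d[j]  # fixed-size step toward it
    return new_x, luciferin

# Maximize a single peak at the origin: J(x) = exp(-||x||^2).
obj = lambda pts: np.exp(-np.sum(pts ** 2, axis=1))
x = rng.uniform(-3, 3, (20, 2))
x0 = x.copy()
luciferin = np.full(20, 5.0)
for _ in range(200):
    x, luciferin = gso_step(x, luciferin, obj)
```

In a clustering hybrid such as ICGSO, the glowworm positions would encode candidate cluster centres and the objective would be a clustering fitness function.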
K-means clustering algorithm is a partitional clustering algorithm that has been used widely in many applications for traditional clustering due to its simplicity and low computational complexity. This clustering technique depends on the user's specification of the number of clusters to be generated from the dataset, which affects the clustering results. Moreover, random initialization of cluster centers results in convergence to local minima. Automatic clustering is a recent approach to clustering in which the specification of the cluster number is not required. In automatic clustering, natural clusters existing in datasets are identified without any background information about the data objects. Nature-inspired metaheuristic optimization algorithms have been deployed in recent times to overcome the challenges of traditional clustering algorithms in handling automatic data clustering. Some nature-inspired metaheuristic algorithms have been hybridized with the traditional K-means algorithm to boost its performance and its capability to handle automatic data clustering problems. This study aims to identify, retrieve, summarize, and analyze recently proposed studies related to improvements of the K-means clustering algorithm with nature-inspired optimization techniques. A quest approach for article selection was adopted, which led to the identification and selection of 147 related studies from different reputable academic avenues and databases. Moreover, the analysis revealed that although the K-means algorithm has been well researched in the literature, its superiority over several well-established state-of-the-art clustering algorithms in terms of speed, accessibility, simplicity of use, and applicability to clustering problems with unlabeled and nonlinearly separable datasets has been clearly observed in the study.
The current study also evaluated and discussed some of the well-known weaknesses of the K-means clustering algorithm, for which the existing improvement methods were conceptualized. It is noteworthy to mention that the current systematic review and analysis of existing literature on K-means enhancement approaches presents possible perspectives in the clustering analysis research domain and serves as a comprehensive source of information regarding the K-means algorithm and its variants for the research community.
... Such exploration of neighborhoods using a local decision boundary was proposed by Ghose (2005, 2009). Finding node neighborhoods (or agent neighborhoods) using a local decision boundary has proved useful in applications such as clustering, hazard sensing in ubiquitous environments, wireless sensor networks, and robotic applications (Krishnanand and Ghose 2008; Aljarah and Ludwig 2013; Zeng et al. 2016). Section 2 presents the topological and attribute-based firefly algorithms and their frameworks, Sect. 3 presents the evaluation metrics (AUC, precision, and recall), and Sect. 4 presents the result analysis of the proposed method, including a comparative study with small and large-sized networks. A nature-inspired firefly improvement strategy is utilized to inspect node-to-node link prediction issues in social platforms. ...
Understanding how dynamic social networks evolve through the formation of new links is an emerging, prominent and challenging task gaining the attention of researchers from various domains. The formation of new links is referred to as link prediction, which has also gained significant interest in the last decade. In this article, we propose a new adaptive approach to solve the problem of link prediction among the nodes in social networks. The approach is inspired by the intelligent behavior of collective swarms called fireflies, combined with an adaptive neighborhood selection method around the fireflies. The paper presents an experimental study on real-world data sets to evaluate the proposed technique against the existing techniques in the literature. The outcomes obtained show better accuracy and robustness of the proposed algorithm. The results indicate that applying the intelligence of swarms to the link prediction problem gives promising results in improving accuracy and robustness.
... Therefore, heuristic algorithms have become more popular for solving complex optimization problems [7,8]. Heuristic planning methods mainly include ant colony optimization (ACO) [9,10], particle swarm optimization (PSO) [11-13], the firefly algorithm (FA) [14,15], differential evolution (DE) [16,17], and the genetic algorithm (GA) [18-20]. Heuristic algorithms can be used to solve NP-hard problems and obtain near-optimal solutions, and path planning problems are in fact a type of NP-hard problem. ...
The path planning of unmanned aerial vehicle (UAV) in three-dimensional (3D) environment is an important part of the entire UAV’s autonomous control system. In the constrained mission environment, planning optimal paths for multiple UAVs is a challenging problem. To solve this problem, the time stamp segmentation (TSS) model is adopted to simplify the handling of coordination cost of UAVs, and then a novel hybrid algorithm called HIPSO-MSOS is proposed by combining improved particle swarm optimization (IPSO) and modified symbiotic organisms search (MSOS). The exploration and exploitation abilities are combined efficiently, which brings good performance to the proposed algorithm. The cubic B-spline curve is used to smooth the generated path so that the planned path is flyable for UAV. To assess performance, the simulation is carried out in the virtual three-dimensional complex terrain environment. The experimental results show that the HIPSO-MSOS algorithm can successfully generate feasible and effective paths for each UAV, and its performance is superior to the other five algorithms, namely PSO, Firefly, DE, MSOS and HSGWO-MSOS algorithms in terms of accuracy, convergence speed, stability and robustness. Moreover, HIPSO-MSOS performs better than other tested methods in multi-objective optimization problems. Thus, the HIPSO-MSOS algorithm is a feasible and reliable alternative for some difficult and practical problems.
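The cubic B-spline smoothing step mentioned above can be sketched with SciPy; the waypoints below are invented for illustration, and the paper's constraint handling and coordination model are omitted:

```python
import numpy as np
from scipy import interpolate

# Hypothetical 3-D waypoints produced by a planner; smooth them with a
# cubic B-spline so the resulting path is flyable for a UAV.
waypoints = np.array([[0.0, 0.0, 1.0], [2.0, 3.0, 1.5], [5.0, 4.0, 2.0],
                      [8.0, 2.0, 2.5], [10.0, 0.0, 2.0]])

# k=3 -> cubic spline; s=0 -> interpolate the waypoints exactly.
tck, _ = interpolate.splprep([waypoints[:, 0], waypoints[:, 1], waypoints[:, 2]],
                             k=3, s=0)
u = np.linspace(0.0, 1.0, 100)
smooth_path = np.array(interpolate.splev(u, tck)).T  # 100 points along the curve
```

Setting s > 0 would trade waypoint fidelity for additional smoothness, which is a design choice depending on how strictly the planned waypoints must be followed.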
... Evolutionary clustering algorithms use a specific fitness function to assess convergence towards the optimal solution (Aljarah et al. 2013, 2020a). The selection of the fitness function for new evolutionary clustering algorithms in the literature is usually done without a proper justification. ...
Evolutionary algorithms have shown their powerful capabilities in different machine learning problems, including clustering, which is a growing area of research nowadays. In this paper, we propose an efficient clustering technique based on the evolution behavior of the genetic algorithm and an advanced variant of the nearest neighbor search technique based on assignment and election mechanisms. The goal of the proposed algorithm is to improve the quality of clustering results by finding a solution that maximizes the separation between different clusters and maximizes the cohesion between data points in the same cluster. Our proposed algorithm, which we refer to as “EvoNP”, is tested with 15 well-known data sets using 5 well-known external evaluation measures and is compared with 7 well-regarded clustering algorithms. The experiments are conducted in two phases: evaluation of the best fitness function for the algorithm and evaluation of the algorithm against other clustering algorithms. The results show that the proposed algorithm works well with the silhouette coefficient fitness function and outperforms the other algorithms for the majority of the data sets. The source code of EvoNP is available at http://evo-ml.com/evonp/.
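A sketch of the silhouette coefficient, the fitness function the study found to work best. This is the standard definition, not EvoNP's implementation; singleton clusters are simply assigned a = 0 here, which library implementations handle differently:

```python
import numpy as np

def silhouette(points, labels):
    # Mean silhouette coefficient: s(i) = (b_i - a_i) / max(a_i, b_i), where
    # a_i is the mean distance to the other points in i's own cluster and
    # b_i is the smallest mean distance to any other cluster.
    # Values near 1 mean compact, well-separated clusters.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    scores = []
    for i, li in enumerate(labels):
        own = [j for j, lj in enumerate(labels) if lj == li and j != i]
        a = d[i, own].mean() if own else 0.0  # singleton cluster -> a = 0
        b = min(d[i, [j for j, lj in enumerate(labels) if lj == c]].mean()
                for c in set(labels) if c != li)
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
print(silhouette(pts, [0, 0, 1, 1]))  # well-separated clusters -> close to 1
```

Maximizing this value directly rewards both of the stated goals: separation between clusters (large b) and cohesion within clusters (small a).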