
Handing Wang, PhD
Xidian University · Department of Artificial Intelligence
About
Publications: 111
Reads: 24,402
Citations: 3,692
Introduction
Handing Wang currently works at the School of Artificial Intelligence, Xidian University, China, and does research in Algorithms, Computing in Mathematics, Natural Science, Engineering and Medicine, and Artificial Intelligence. Their current project is 'Data-driven surrogate-assisted evolutionary fluid dynamic optimization'.
Additional affiliations
- September 2018 - present
- July 2015 - June 2018
- March 2012 - March 2013
Position: Visiting Student
Publications (111)
Trauma systems have been shown to reduce death and disability from injury but must be appropriately configured. A systematic approach to trauma system design can help maximize geospatial effectiveness and reassure stakeholders that the best configuration has been chosen.
This article describes the GEOS [Geospatial Evaluation of Systems of Trauma Ca...
There can be a complicated mapping relation between decision variables and objective functions in multi-objective optimization problems (MOPs). It is uncommon that decision variables influence objective functions equally. Decision variables act differently in different objective functions. Hence, often, the mapping relation is unbalanced,...
Thermal layout optimization problems are common in integrated circuit design, where a large number of electronic components are placed on the layout, and a low temperature (i.e., high efficiency) is achieved by optimizing the positions of the electronic components. The operating temperature value of the layout is obtained by measuring the temperatu...
Traditional engine cycle innovation is limited by human experiences, imagination, and currently available engine component performance expectations. Thus, the engine cycle innovation process has been quite slow for the past 90 years. In this work, we propose a mixed variable multi-objective evolutionary optimization method for automatic engine cycle desi...
Optimization problems whose evaluations of the objective and constraints involve costly numerical simulations or physical experiments are referred to as expensive constrained optimization (ECO) problems. Such problems can be solved by evolutionary algorithms (EAs) in conjunction with computationally cheap surrogates that separately approximate the...
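For reference, such an expensive constrained optimization (ECO) problem can be written in the standard generic form (a textbook formulation, not this paper's exact notation): \min_{x \in \Omega} f(x) \quad \text{subject to} \quad g_j(x) \le 0, \; j = 1, \dots, m, where both the objective f and the constraints g_j are black boxes evaluated only by costly simulations or experiments, so each is typically approximated by its own computationally cheap surrogate.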
With more and more deep neural networks being deployed as various daily services, their reliability is essential. Worryingly, deep neural networks are vulnerable and sensitive to adversarial attacks, the most common of which against such services is evasion-based. Recent works usually strengthen the robustness by adversarial training or le...
Learning to optimize (L2O) has emerged as a powerful framework for black-box optimization (BBO). L2O learns the optimization strategies from the target task automatically without human intervention. This paper focuses on obtaining better performance when handling high-dimensional and expensive BBO with little function evaluation cost, which is the...
In preference-based multi-objective optimization, knee solutions are regarded as implicitly preferred promising solutions, particularly when users have trouble articulating any sensible preferences. However, finding knee solutions by existing a posteriori knee identification methods is hard when the function evaluations are expensive, because the c...
Hypervolume-based evolutionary algorithms have been widely used to handle many-objective optimization problems. In such algorithms, hypervolume-based environmental selection (HVES), which aims at selecting a subpopulation with the maximal hypervolume (HV) from the current population, plays a crucial role in guiding evolution. However, the computati...
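For context, the hypervolume of a population P with respect to a reference point r (the standard definition, assuming minimization) is the Lebesgue measure of the region dominated by P and bounded by r: HV(P; r) = \lambda\big( \bigcup_{p \in P} \{ q \in \mathbb{R}^m : p \preceq q \preceq r \} \big). Computing this measure exactly becomes increasingly expensive as the number of objectives m grows, which is what makes hypervolume-based environmental selection costly in many-objective optimization.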
Network structure evolves with time in the real world, and the discovery of changing communities in dynamic networks is an important research topic that poses challenging tasks. Most existing methods assume that no significant change occurs; namely, the difference between adjacent snapshots is slight. However, great change exists in the real world...
In dealing with expensive constrained multi-objective optimization problems using surrogate-assisted evolutionary algorithms, it is a great challenge to reduce the negative impact caused by the approximate errors of surrogate models for constraints. To address this issue, we propose a Kriging-assisted evolutionary algorithm with two search modes to...
The automatic design of soft robots is characterized by jointly optimizing structure and control. As reinforcement learning is gradually used to optimize control, the time-consuming controller training makes soft robot design an expensive optimization problem. Although surrogate-assisted evolutionary algorithms have made a remarkable achievement in d...
Multi-objective feature selection aims to find a set of feature subsets that achieves a trade-off between two objectives, i.e., reducing the number of selected features and improving the classification performance. However, these two objectives might not be always conflicting during the optimization process and have varying difficulties in optimiza...
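Written out, the two objectives mentioned above take the generic bi-objective form (a standard statement of multi-objective feature selection, not necessarily this paper's notation): \min_{S \subseteq \{1, \dots, D\}} \big( |S|, \; \text{err}(S) \big), i.e., simultaneously minimize the number of selected features |S| and the classification error err(S) of a model trained on the subset S; the two objectives conflict only when removing features actually degrades accuracy.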
To accelerate the performance estimation in neural architecture search, recently proposed algorithms adopt surrogate models to predict the performance of neural architectures instead of training the network from scratch. However, it is time-consuming to collect sufficient labeled architectures for surrogate model training. To enhance the capability...
Network structure evolves with time in the real world, and the discovery of changing communities in dynamic networks is an important research topic that poses challenging tasks. Most existing methods assume that no significant change in the network occurs; namely, the difference between adjacent snapshots is slight. However, great change exists in...
Various evolutionary algorithms (EAs) have been proposed to address feature selection (FS) problems, in which a large number of fitness evaluations are needed. With the rapid growth of data scales, the fitness evaluation becomes time consuming, which makes FS problems expensive optimization problems. Surrogate-assisted EAs (SAEAs) have been widely...
Expensive dynamic multi-objective optimization problems (EXDMOPs) involve multiple objective functions changing over time steps. In this kind of problem, only a small number of function evaluations is allowed in each time step. The challenge of EXDMOPs is how to quickly and accurately track the changing optimal solutions with only a small numbe...
Recent studies show that deep neural networks are vulnerable to adversarial attacks in the form of subtle perturbations to the input image, which leads the model to output wrong predictions. Such an attack can easily succeed with the existing white-box attack methods, where the perturbation is calculated based on the gradient of the target network. Un...
The phenomenon that deep neural networks are vulnerable to adversarial examples has been found for several years. Under the black-box setting, transfer-based methods usually produce the adversarial examples on a white-box model, which serves as the surrogate model in the black-box attack, and hope that the same adversarial examples can also fool th...
In space engineering, the electronic component layout has a very important impact on the centroid stability and heat dissipation of devices. However, the expensive thermodynamic simulations in the component thermal layout optimization problems bring great challenges to the current optimization algorithms. To reduce the cost, a surrogate-assisted ev...
Although surrogate-assisted evolutionary algorithms (SAEAs) have been widely developed to address computationally expensive multi-objective optimization problems (MOPs), they still encounter difficulties in solving the expensive and noisy combinatorial MOPs. To this end, we propose a novel SAEA to handle this kind of problem. In the proposed algori...
Surrogate-assisted many-objective optimization is to locate Pareto optimal solutions using a limited number of function evaluations. Most existing surrogate-assisted evolutionary algorithms are designed to embed in a specific many-objective evolutionary algorithm. The Pareto-based bi-indicator infill sampling criterion has been proven to be effecti...
A number of real-world multiobjective optimization problems (MOPs) are driven by the data from experiments or computational simulations. In some cases, no new data can be sampled during the optimization process and only a certain amount of data can be sampled before optimization starts. Such problems are known as offline data-driven MOPs. Although...
To solve noisy and expensive multi-objective optimization problems, only a few function evaluations can be used due to the limitations of time and/or money. Because of the influence of noise, the evaluations are inaccurate, which is challenging for the existing surrogate-assisted evolutionary algorithms. Due to the influence of noise, the p...
Dynamic time-linkage optimization problems (DTPs) are special dynamic optimization problems (DOPs) with the time-linkage property. The environment of DTPs changes not only over time but also depends on the previously applied solutions. DTPs can hardly be solved by existing dynamic evolutionary algorithms because they ignore the time-linkage property. I...
Many real-world optimization tasks suffer from noise. So far, the research on noise-tolerant optimization algorithms is still restricted to low-dimensional problems with fewer than 100 decision variables. In reality, many problems are high-dimensional. Cooperative coevolutionary (CC) algorithms based on a divide-and-conquer strategy are promising in...
Real-world industrial engineering optimization problems often have a large number of decision variables. Most existing large-scale evolutionary algorithms need a large number of function evaluations to achieve high-quality solutions. However, the function evaluations can be computationally intensive for many of these problems, particularly, which m...
Evolutionary Algorithms (EAs) are nature-inspired population-based search methods which work on Darwinian principles of natural selection. Due to their strong search capability and simplicity of implementation, EAs have been successfully applied to solve many complex optimization problems, which cannot be easily solved by traditional mathematical p...
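As a concrete illustration of the Darwinian, population-based search loop described above, below is a minimal sketch of a real-coded evolutionary algorithm with tournament selection, uniform crossover, and Gaussian mutation on a toy objective. It is illustrative only; the function names, parameter values, and the sphere objective are assumptions, not code from any of the listed publications.

import random

def sphere(x):
    """Toy objective to minimize: the sum of squares."""
    return sum(v * v for v in x)

def evolve(obj, dim=10, pop_size=30, generations=100,
           crossover_rate=0.9, mutation_rate=0.1, bounds=(-5.0, 5.0)):
    """Minimal real-coded EA: tournament selection, uniform crossover, Gaussian mutation."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [obj(ind) for ind in pop]

        def select():
            # Binary tournament: the fitter of two random individuals becomes a parent.
            a, b = random.sample(range(pop_size), 2)
            return pop[a] if fitness[a] < fitness[b] else pop[b]

        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            # Uniform crossover: each gene is taken from either parent with equal probability.
            if random.random() < crossover_rate:
                child = [g1 if random.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]
            else:
                child = list(p1)
            # Gaussian mutation, clipped to the variable bounds.
            child = [min(hi, max(lo, g + random.gauss(0.0, 0.1)))
                     if random.random() < mutation_rate else g
                     for g in child]
            offspring.append(child)
        pop = offspring  # Generational replacement.
    return min(pop, key=obj)

best = evolve(sphere)
print("best objective value:", sphere(best))

A run such as evolve(sphere) simply alternates selection, crossover, and mutation for a fixed number of generations and returns the best individual found; every EA variant listed above refines some part of this loop.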
Supplementary material for the paper "A Survey of Normalization Methods in Multiobjective Evolutionary Algorithms".
This chapter introduces the basic evolutionary algorithms, including the canonical genetic algorithms, real-coded genetic algorithms, evolution strategies, genetic programming, ant colony optimization algorithms, particle swarm optimization, and differential evolution. In addition, memetic algorithms that combine evolutionary search with local sear...
This chapter introduces the typical machine learning problems, describes the widely used machine learning models, and presents the basic learning algorithms suited for solving various machine learning problems. Note that a machine learning model may be used for accomplishing different machine learning tasks, provided that a proper learning algorith...
This chapter briefly introduces the most widely used traditional optimization algorithms, including the gradient based method and its variants, basic methods for constrained optimization, pattern search for non-differentiable or black-box optimization problems, and deterministic global optimization methods.
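As one example of the gradient-based methods referred to here, the basic gradient descent iteration for minimizing a differentiable function f is x_{k+1} = x_k - \alpha_k \nabla f(x_k), where \alpha_k > 0 is a step size chosen by a fixed schedule or a line search; the variants covered in such chapters differ mainly in how the step direction and step size are computed.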
Offline data-driven optimization does not allow sampling new data during the optimization, making it hard to verify the solution and update the surrogates. One additional challenge is to select appropriate solutions for final implementation, in particular in multi- or many-objective optimization. Nevertheless, this does not necessarily mean that n...
Lack of training data is one major challenge in data-driven optimization, since data collection is either computationally expensive or costly in many data-driven optimization problems. To address this issue, this chapter presents three classes of knowledge transfer approaches in data-driven evolutionary optimization. The first approach is based on...
It becomes increasingly difficult to train a high-quality surrogate model as the dimension of a problem increases, especially for expensive optimization problems where only a limited number of samples can be afforded. This chapter focuses on addressing high-dimensional expensive problems that have over 30 and up to some 200 decision variables. The...
This chapter presents surrogate-assisted evolutionary algorithms for single-objective optimization that employ multiple surrogates. Multiple surrogates can not only improve the prediction performance and estimate the degree of prediction uncertainty, but also capture both global and local features of the fitness landscape. The multiple surrogates c...
With the recent booming development of deep neural networks, the demand for automated design of efficient deep neural architectures has been increasing. This chapter introduces the basics of automated neural architecture search and discusses the current remaining challenges, focusing on scalability and flexibility of network architecture representa...
Multi-objective evolutionary optimization has found increasing applications in the real world, many of which are expensive. This chapter starts with introducing three main categories of evolutionary algorithms for multi-objective optimization, namely decomposition based, Pareto dominance based and performance indicator based. This is followed by a...
Solving many-objective optimization problems is challenging due to the increase in the number of objectives. The challenges include the increased complexity in the structure of the Pareto front, the increased number of solutions needed to represent the Pareto front, and the selection of solutions. Many-objective optimization becomes even more chall...
This chapter introduces the fundamentals of optimization, including the mathematical formulation of an optimization problem, convexity and types of optimization problems, single- and multi-objective optimization, and other important aspects of optimization such as robust optimization and dynamic optimization. Robustness optimization over time, a re...
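The general form of the problems such a chapter introduces is the standard textbook formulation \min_{x \in \Omega} \big( f_1(x), \dots, f_m(x) \big) \quad \text{subject to} \quad g_j(x) \le 0, \; h_k(x) = 0, where m = 1 gives single-objective optimization and m \ge 2 gives multi-objective optimization; robust and dynamic optimization additionally account for uncertainty in, or time-variation of, the objectives and constraints.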
There are some practical optimization problems that can only be optimized using historical data, which are known as offline data-driven optimization problems. Since the real function evaluations are not available in the optimization process, surrogate models must replace the real fitness evaluations to guide the search. The key issue in offline data...
Recently, evolutionary algorithms have made great achievements in multi-objective optimization problems (MOPs), but there is little research on how to deal with noisy multi-objective optimization problems (NMOPs), which are quite common in real life. The work in this paper attempts to find the commonality of noises in images/signals and NMOPs and...
This chapter introduces the definition of and motivations behind data-driven optimization. Two basic data-driven optimization paradigms, offline and online data-driven optimization, are introduced. A variety of heuristic population and individual based surrogate management strategies for surrogate assisted evolutionary optimization are presented, a...
A real-world multiobjective optimization problem (MOP) usually has differently scaled objectives. Objective space normalization has been widely used in multiobjective optimization evolutionary algorithms (MOEAs). Without objective space normalization, most of the MOEAs may fail to obtain uniformly distributed and well-converged solutions on MOPs wi...
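A commonly used form of objective space normalization (a generic example, not necessarily the scheme analyzed in this paper) rescales each objective by the estimated ideal point z^{min} and nadir point z^{max}: \tilde{f}_i(x) = \frac{f_i(x) - z_i^{min}}{z_i^{max} - z_i^{min}}, \quad i = 1, \dots, m, so that all objectives lie approximately in [0, 1] regardless of their original scales.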
Only a small number of function evaluations can be afforded in many real-world multi-objective optimization problems where the function evaluations are economically/computationally expensive. Such problems pose great challenges to most existing multi-objective evolutionary algorithms which require a large number of function evaluations for optimiza...
Infill sampling criteria play a crucial role in saving expensive evaluations for surrogate-assisted multiobjective evolutionary algorithms. Promoting convergence and maintaining diversity in the population are the two main goals of designing a new infilling sampling criterion, which is naturally a bi-objective optimization problem. In this paper, a...
In offline data-driven evolutionary optimization, no real fitness evaluations are allowed during the optimization, making it extremely challenging to build high-quality surrogates on a limited amount of data. This is especially true for large-scale optimization problems where typically a large amount of data is needed for constructing reliable surroga...
Minimax optimization is a widely-used formulation for robust design in multiple operating or environmental scenarios, where the worst-case performance among multiple scenarios is the optimization objective requiring a large number of quality assessments. Consequently, minimax optimization using evolutionary algorithms becomes prohibitive when each...
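The minimax formulation referred to above is commonly written as \min_{x \in \Omega} \max_{s \in S} f(x, s), where S is the set of operating or environmental scenarios, so the objective of a candidate design x is its worst-case performance over all scenarios; each evaluation of this objective therefore requires assessing x under every scenario, which is what makes the problem expensive.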
A number of sparse multi-objective optimization problems (SMOPs) exist in the real world. Decision variables in their Pareto optimal solutions are not only large-scale but also very sparse; that is, most decision variables are zero, which poses difficulties for the optimization. Existing multi-objective evolutionary algorithms need many function evaluations...
Real-world optimization applications in complex systems always contain multiple factors to be optimized, which can be formulated as multi-objective optimization problems. These problems have been solved by many evolutionary algorithms like MOEA/D, NSGA-III, and KnEA. However, when the numbers of decision variables and objectives increase, the compu...
Many real-world combinatorial optimization problems have both expensive objective and constraint functions. Although surrogate models for the discrete decision variables can be trained to replace the expensive fitness evaluations in evolutionary algorithms, the approximation errors of the surrogate models for the constraint function easily misguide...
This book constitutes the refereed proceedings of the 11th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2021 held in Shenzhen, China, in March 2021.
The 47 full papers and 14 short papers were carefully reviewed and selected from 120 submissions. The papers are divided into the following topical sections: theory; algor...
Intended for researchers and practitioners alike, this book covers carefully selected yet broad topics in optimization, machine learning, and metaheuristics. Written by world-leading academic researchers who are extremely experienced in industrial applications, this self-contained book is the first of its kind that provides comprehensive background...
Optimization of many real-world optimization problems relies on numerical simulations for function evaluations. In some cases, both high- and low-fidelity simulations are available, where the high fidelity evaluation is accurate but time-consuming, whereas the low-fidelity evaluation is less accurate but computationally cheap. To find an acceptable...
This chapter presents some recent advances in surrogate-assisted evolutionary optimization of large problems. By large problems, we mean either the number of decision variables is large, or the number of objectives is large, or both. These problems pose challenges to evolutionary algorithms themselves, constructing surrogates and surrogate manageme...
Many real-world optimization applications have more than one objective, which are modeled as multiobjective optimization problems. Generally, those complex objective functions are approximated by expensive simulations rather than cheap analytic functions, which have been formulated as data-driven multiobjective optimization problems. The high compu...
This work presents a summary of the results obtained during the activities developed within the GARTEUR AD/AG-52 group. GARTEUR stands for “Group for Aeronautical Research and Technology in Europe” and is a multinational organization that performs high quality, collaborative, precompetitive research in the field of aeronautics to improve technologi...
Convolutional neural networks (CNNs) have shown remarkable performance in various real-world applications. Unfortunately, the promising performance of CNNs can be achieved only when their architectures are optimally constructed. The architectures of state-of-the-art CNNs are typically hand-crafted with extensive expertise in both CNNs and the inves...
The papers in this special section focus on the use of computational intelligence in data driven optimization applications. Most evolutionary algorithms and other meta-heuristic search methods typically assume that there are explicit objective functions available for fitness evaluations. In the real world, such explicit objective functions may not...
Many real-world optimization problems can be solved by using the data-driven approach only, simply because no analytic objective functions are available for evaluating candidate solutions. In this paper, we address a class of expensive data-driven constrained multiobjective combinatorial optimization problems, where the objectives and constraints c...
Most evolutionary optimization algorithms assume that the evaluation of the objective and constraint functions is straightforward. In solving many real-world optimization problems, however, such objective functions may not exist. Instead, computationally expensive numerical simulations or costly physical experiments must be performed for fitness ev...
Cooperative coevolutionary (CC) algorithms decompose a problem into several subcomponents and optimize them separately. Such a divide-and-conquer strategy makes CC algorithms potentially well suited for large-scale optimization. However, decomposition may be inaccurate, resulting in a wrong division of the interacting decision variables into differ...
In solving many real-world optimization problems, neither mathematical functions nor numerical simulations are available for evaluating the quality of candidate solutions. Instead, surrogate models must be built based on historical data to approximate the objective functions and no new data will be available during the optimization process. Such pr...
Surrogate-assisted evolutionary algorithms have been developed mainly for solving expensive optimization problems where only a small number of real fitness evaluations are allowed. Most existing surrogate-assisted evolutionary algorithms are designed for solving low-dimensional single or multi-objective optimization problems, which are not well sui...
Background: Trauma center designation in excess of need risks dilution of experience, reduction in research and training opportunities, and increased costs. The objective of this study was to evaluate the use of a novel data-driven approach (whole-system mathematical modelling of patient flow) to compare the configuration of an existing trauma sys...