About
69 Publications · 8,518 Reads · 808 Citations
Additional affiliations
September 2010 - January 2016
November 2014 - November 2015
Publications (69)
The loss function approach is effective for multi-response optimization. However, previous loss function approaches ignore the dispersion performance of squared error loss and model uncertainty. In this paper, a weighted loss function is proposed to simultaneously consider the location and dispersion performances of squared error loss to optimize corre...
The Nd:YLF laser beam machining (LBM) process has great potential to manufacture intricately shaped microproducts with its unique characteristics. Continuous improvement (CI) effort for the LBM process is usually realised by response surface methodology, which is an important tool in Design for Six Sigma. However, when determining the optimal machining par...
Response surface-based design optimization has been commonly used to seek the optimal input settings in process or product design problems. Yet most existing research assumes that there is no model parameter uncertainty in the modeling process and that the optimal settings can be implemented at precise values. These two assumptions are far from the...
Response-surface-based design optimization has been commonly used in robust process design (RPD) to seek optimal process settings for minimizing the output variability around the target value. Recently, the online RPD strategy has attracted increasing research interest as it is expected to provide better performance than offline RPD by utilizing onl...
Empirical models that relate multiple quality features to a set of design variables play a vital role in many industrial process optimization methods. Many current modeling methods employ a single-response normal model to analyze industrial processes without taking into consideration the high correlations and the non-normality among...
The process capability index (PCI), Cpk, one of the widely used tools for assessing the capability of a manufacturing process, expresses the deviation of the process mean from the midpoint of the specification limits. The Cpk is known to perform well under the general assumption that the experimental data are normally distributed without contami...
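For readers unfamiliar with the index, a minimal sketch of the textbook (non-robust) Cpk computation; the specification limits and data below are hypothetical, not taken from the paper:

```python
import numpy as np

def cpk(data, lsl, usl):
    """Classic process capability index: distance from the process mean
    to the nearer specification limit, in units of 3 standard deviations."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical example: a well-centered process inside [8, 12]
rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=0.5, size=100)
print(cpk(sample, lsl=8.0, usl=12.0))  # roughly 1.3 for this process
```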
The purpose of time-dependent reliability analysis is to evaluate the reliability of a system or component within a specified timeframe. Several approaches have been suggested for addressing time-dependent reliability challenges, including analytical methods and methods based on response surrogate models. As a representative of machine learning meth...
We consider the process capability index Cpmc when a tolerance cost function is introduced. It is well known that Cpmc performs well under the general assumption that the data is not contaminated. Under this assumption, the standard sample mean and sample variance are used to estimate Cpmc. However, it is also well known that this estimate is ex...
The traditional variable control charts, such as the X-bar chart, are widely used to monitor variation in a process. They have been shown to perform well for monitoring processes under the general assumptions that the observations are normally distributed without data contamination and that the sample sizes from the process are all equal. However,...
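The standard X-bar construction the abstract refers to can be sketched as below; this is the textbook 3-sigma version on equal-size subgroups, with made-up data and the small-sample c4 bias correction omitted for brevity:

```python
import numpy as np

def xbar_limits(subgroups):
    """Shewhart X-bar chart limits from equal-size phase-I subgroups,
    using the grand mean and the average within-subgroup variability."""
    subgroups = np.asarray(subgroups)   # shape: (m subgroups, n per subgroup)
    n = subgroups.shape[1]
    center = subgroups.mean()           # grand mean of all observations
    sigma_hat = subgroups.std(axis=1, ddof=1).mean()  # average subgroup std
    half_width = 3 * sigma_hat / np.sqrt(n)
    return center - half_width, center, center + half_width

rng = np.random.default_rng(1)
data = rng.normal(50, 2, size=(25, 5))  # 25 hypothetical subgroups of size 5
print(xbar_limits(data))                 # (LCL, center line, UCL)
```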
We consider the process capability index, a widely-used quality-related statistic used to assess the quality of products and performance of monitored processes in various industries. It is widely known that the conventional process capability indices perform well when the quality process being monitored has a normal distribution. Unfortunately, usi...
Hybrid reliability analysis with mixed random and interval uncertainties is a significant challenge in the reliability assessment of engineering structures. The situation will be more intractable when involving incomplete interval data. To obtain reliable estimates of the failure probability limits, an effective parameter estimation method, integra...
The Shewhart-type control chart based on the geometric distribution is widely used to monitor the number of normal events between two consecutive appearances of adverse events, but this control chart relies on the normality assumption and may thus result in unsatisfactory performance due to the high skewness of the geometric distribution. Thus, one...
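For context, a sketch of the conventional Shewhart-type g-chart limits whose weakness the abstract describes; with a highly skewed geometric distribution, these symmetric 3-sigma limits often force the lower limit to clip at zero. The counts are hypothetical:

```python
import numpy as np

def g_chart_limits(counts):
    """Shewhart-style 3-sigma limits for a g chart monitoring the number of
    normal cases between consecutive adverse events (geometric-based).
    For a geometric count with mean m, the variance is m * (m + 1)."""
    xbar = np.mean(counts)               # estimates (1 - p) / p
    sigma = np.sqrt(xbar * (xbar + 1))   # geometric std dev via the mean
    lcl = max(0.0, xbar - 3 * sigma)     # symmetric limits clip at zero
    return lcl, xbar, xbar + 3 * sigma

counts = [12, 30, 7, 55, 18, 3, 41, 22]  # hypothetical between-event counts
print(g_chart_limits(counts))
```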
Metamodels are widely used as fast surrogates to facilitate the optimization of simulation models. Stochastic kriging (SK) is an effective metamodeling tool for a mean response surface implied by stochastic simulation. In SK, it is usually assumed that the experimental data are normally distributed and uncontaminated. However, these assumptions can...
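Full stochastic kriging models intrinsic (simulation) noise separately from the spatial correlation structure; a loose stand-in is a heteroscedastic Gaussian process, sketched here with scikit-learn by passing estimated per-point noise variances through alpha. This is an illustrative approximation on made-up data, not the robust SK method the abstract proposes:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Simulated design points, each replicated to estimate intrinsic noise
rng = np.random.default_rng(2)
X = np.linspace(0, 1, 10)[:, None]
reps = 20
Y = np.sin(4 * X) + rng.normal(0, 0.3, size=(10, reps))  # noisy replications

ybar = Y.mean(axis=1)                       # sample mean response per point
var_of_mean = Y.var(axis=1, ddof=1) / reps  # variance of each sample mean

# Per-point noise variances enter through alpha (heteroscedastic GP)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=var_of_mean)
gp.fit(X, ybar)
print(gp.predict(np.array([[0.5]]), return_std=True))
```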
It is well known that the performance of g and h control charts depends heavily on how accurately the unknown process parameter is estimated. However, conventional methods, such as the method of moments and the maximum likelihood method, are easily influenced by data contamination. Thus, the performance of control charts with these estimators...
Active-learning Kriging models have gained more and more popularity for structural reliability analysis (SRA) in recent years. Improving the efficiency of simulations while maintaining high accuracy is essential for building Kriging-based SRA approaches. In this article, we propose a novel active-learning Kriging reliability analysis method based o...
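As background, the classic U learning function commonly used in active-learning Kriging reliability analysis (e.g., in AK-MCS); the abstract's novel criterion is not reproduced here, so this sketch only shows the baseline selection and stopping logic:

```python
import numpy as np

def u_function(mean, std):
    """Classic U learning function: small U means the sign of the limit
    state g(x) (failure vs. safe) is most uncertain at that point."""
    return np.abs(mean) / std

# Given GP predictions (mu, sigma) on a Monte Carlo population, the next
# training point is the most ambiguous one; learning commonly stops once
# min(U) >= 2, i.e. roughly 97.7% confidence in every sign classification.
mu = np.array([1.2, 0.1, -0.4, 2.5])     # hypothetical GP posterior means
sigma = np.array([0.5, 0.3, 0.2, 1.0])   # hypothetical posterior std devs
next_idx = np.argmin(u_function(mu, sigma))
print(next_idx)  # index of the candidate to evaluate next
```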
A growing area of focus is using multi-fidelity (MF) simulations to predict the behavior of complex physical systems. In order to adequately utilize the popular sequential designs to improve the effectiveness of the MF method, two challenges involving good projection properties in the presence of effect sparsity and the sample allocation between the...
As an important part of the Design for X tools, Design for Quality (DFQ) is used to reduce cost and improve quality of products while maintaining reliability in the preliminary design phase. As a powerful tool to reduce and eliminate possible failures, failure mode and effects analysis (FMEA) is broadly applied in the detail design phase. However, scholars...
Multi-objective stochastic simulation optimization plays an important role in designing complex engineering systems. To identify optimal solutions via simulations, Bayesian optimization, which utilizes metamodels and an acquisition function to determine the next design point, has been popular in machine learning. However, studies on Bayesian optimi...
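For the single-objective, deterministic special case, the standard expected-improvement acquisition looks like the sketch below; the multi-objective stochastic setting the abstract targets extends well beyond this baseline:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Expected improvement for minimization: how much a candidate is
    expected to improve on the best objective value observed so far."""
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

mu = np.array([0.9, 1.4, 0.7])      # hypothetical GP posterior means
sigma = np.array([0.2, 0.6, 0.05])  # hypothetical posterior std devs
print(expected_improvement(mu, sigma, f_best=1.0))
```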
As a significant analytical tool in reliability management, FMEA has been extensively used in various fields. Nevertheless, conventional FMEA has been criticized for some defects. To compensate for these defects, this article proposes an improved FMEA method under the environment of probabilistic linguistic terms. The multiformity and indeterminacy of...
Surrogate models have been proven to be powerful tools to alleviate the computational burden of structural reliability analysis. An appropriate surrogate model can guarantee prediction accuracy with limited samples. However, the traditional single modeling technique ignores the model‐form uncertainty due to insufficient knowledge of the physical sy...
In this paper, a novel sparse regression Kriging method termed SRK is proposed, putting an emphasis on efficiently identifying an adaptive overall trend. The main idea underlying SRK is that, by applying a Cholesky decomposition on the correlation matrix, a general Gaussian scale mixture prior-based sparse Bayesian learning scheme can be naturally...
Failure mode and effects analysis (FMEA) is an effective risk assessment tool for detecting and reducing possible risks during a manufacturing process. However, traditional FMEA has some shortcomings when used in the real world. In recent years, improved FMEA approaches have been proposed to eliminate the inherent shortcomings of FMEA, but the risk...
Contours have been commonly employed to gain insights into the influence of inputs in designing engineering systems. Estimating a contour from computer experiments via sequentially updating kriging [also called Gaussian process (GP) models] has received increasing attention for obtaining an accurate prediction within a limited simulation budget. In...
The g-type quality control charts based on the geometric distribution are commonly used to monitor the number of conforming cases between two consecutive appearances of nonconformities. The process parameter in these charts is generally estimated based on conventional methods such as the maximum likelihood and minimum variance unbiased estimato...
Airfoils play significant roles in aerodynamic engineering as they are important devices in designing aircraft and engines. For airfoils, geometric design variables often deviate from their nominal values, which deteriorates airfoil quality. Thus, the design optimization of airfoils under design-variable noise is important. However, the existing li...
Healthcare waste (HCW) management plays a vital role in the development of modern society. In HCW management, failure mode and effects analysis (FMEA) is a popular method to implement risk management for improving the quality of healthcare. However, the shortcomings of the traditional FMEA method have been widely discussed in the literature. This pape...
Integrated parameter design and tolerance design (IPTD) is an effective way to improve product quality and reduce manufacturing cost in micro-manufacturing processes. However, current modeling techniques rarely analyze the influence of model uncertainty on the optimal machining parameters. This may fail to obtain the robust optimal machining paramete...
Various creative multi-response optimisation approaches have been developed in the literature. Most of this research is based on the normality assumption of the response distribution. However, this assumption does not necessarily hold in some real cases, such as non-normal multiple responses. Also, the reproducibility of optimisation results do...
The process capability index, Cpk, is a useful tool for assessing the capability of a manufacturing process. There exist three well-known confidence intervals for the process capability index. These intervals are based on the standard bootstrap, the percentile bootstrap and the bias-corrected percentile bootstrap, respectively. We propose three var...
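A minimal sketch of the second of those baselines, the percentile bootstrap interval for Cpk; the paper's three proposed variants are truncated away here, so only the baseline is shown, on made-up data:

```python
import numpy as np

def cpk(x, lsl, usl):
    """Classic Cpk from a sample, given the specification limits."""
    mu, s = np.mean(x), np.std(x, ddof=1)
    return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

def percentile_bootstrap_ci(x, lsl, usl, B=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI: resample with replacement, recompute the
    index, and take empirical quantiles of the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    boot = [cpk(rng.choice(x, size=len(x), replace=True), lsl, usl)
            for _ in range(B)]
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(3)
sample = rng.normal(10, 0.5, 80)  # hypothetical process measurements
print(percentile_bootstrap_ci(sample, lsl=8.0, usl=12.0))
```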
Many industrial process optimisation methods rely on empirical models that relate output responses to a set of design variables. One of the most crucial problems in process optimisation is how to efficiently implement model selection and model estimation. This paper presents a Bayesian hierarchical modelling approach to process optimisation based o...
This paper proposes a new Bayesian modeling and optimization method for multi-response surfaces (MRS). The proposed approach not only measures the conformance probability (i.e., the probability of all responses simultaneously falling within their corresponding specification limits) through the posterior predictive function but also takes into accou...
Reliability-based design optimization (RBDO) for the design process of the flexure-based bridge-type amplification mechanisms (FBTAMs) relies on an accurate surrogate model. At present, ensemble modeling approaches have been widely used. However, existing ensemble modeling approaches for the RBDO have not considered the model form selection in the...
Noise factors and controllable factors both exert influence on the quality characteristic. This paper proposes a new robust optimization method based on the Kriging model and robust optimization principles. Firstly, based on the hypothesis that the noise factors and the controllable factors follow normal distributions, we present a robust optimization method re...
Integrated parameter and tolerance design is a cost-effective method for multiresponse quality improvement. However, previous methods usually ignore model parameter uncertainty, the dispersion effect, or correlation among responses. This may lead to obtained optimal solutions that are far from the true optimal values of parameters and tolerances. To address...
Failure mode and effect analysis (FMEA) is an effective quality tool to eliminate risks and enhance stability and safety in the manufacturing and service industries. Nevertheless, the conventional FMEA has been criticized for its drawbacks in the evaluation process of risk factors or the determination of the risk priority number (RPN),...
In robust design, it is usually assumed that the experimental data are normally distributed and uncontaminated. However, in many practical applications, these assumptions can be easily violated. It is well known that normal model departure or data contamination can result in biased estimation of the optimal operating conditions of the control facto...
This paper proposes an ensemble radial basis function neural network that selects important RBF subsets based on a Pareto chart using bootstrap samples. Then, the analysis of variance method is used to determine the choice of unequal/equal weights. The effectiveness of the proposed technique is illustrated with a micro-drilling process. The compa...
//This paper has been published in the Journal of Management Science and Engineering, sponsored by the National Natural Science Foundation of China (NSFC), in Dec. 2016.// More information about this paper can be found at https://www.sciencedirect.com/science/article/pii/S2096232019300769.
Abstract:
This paper proposes an ensemble radial basis function n...
Space-filling and projective properties are probably the two most important features in computer experiments. Existing research has tried to develop different kinds of sequential Latin hypercube design (LHD) to meet these two properties. However, most, if not all, of them cannot simultaneously ensure these two properties in their versions o...
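For contrast with the sequential designs discussed above, a basic one-shot LHD generator; each column places exactly one point in each stratum, which is the per-dimension space-filling property at issue:

```python
import numpy as np

def latin_hypercube(n, d, seed=0):
    """Basic (non-sequential) Latin hypercube design: each of the d columns
    places exactly one point in each of n equal-width strata of [0, 1]."""
    rng = np.random.default_rng(seed)
    # One random point per stratum, then an independent permutation per column
    samples = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        samples[:, j] = rng.permutation(samples[:, j])
    return samples

print(latin_hypercube(5, 2))  # 5 points in 2 dimensions
```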
A concurrent optimization strategy for multi-response parameters and tolerances, considering quality loss and manufacturing cost, is proposed to solve the problem of model parameter uncertainty. Firstly, a multi-response quality loss function is developed based on model parameter uncertainty and the tolerances of design variables. Secondly, tolerance cos...
To solve the design optimization problems of complex engineering systems with black-box constraints, a surrogate-based optimization algorithm was proposed based on Kriging models and bi-objective constraint-handling strategy. By taking the expected improvement criterion of improving the objective function and the feasibility probability criterion o...
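The abstract treats expected improvement and feasibility probability as two objectives; a related and more common single-criterion combination is their product, sketched here for one constraint g(x) <= 0 with illustrative values, not the paper's bi-objective strategy:

```python
from scipy.stats import norm

def constrained_acquisition(mu_f, sigma_f, f_best, mu_g, sigma_g):
    """Constrained Bayesian optimization heuristic: expected improvement of
    the objective times the probability that g(x) <= 0 is satisfied."""
    z = (f_best - mu_f) / sigma_f
    ei = (f_best - mu_f) * norm.cdf(z) + sigma_f * norm.pdf(z)
    prob_feasible = norm.cdf(-mu_g / sigma_g)  # P(g <= 0) under the GP
    return ei * prob_feasible

# Hypothetical GP predictions at one candidate point
print(constrained_acquisition(mu_f=0.8, sigma_f=0.3, f_best=1.0,
                              mu_g=0.1, sigma_g=0.2))
```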
Because the traditional point estimates of optimal settings may be poorly estimated due to a lack of sufficient experimental data under data contamination, a robust design method based on the bootstrap technique is proposed for obtaining confidence intervals of the optimum operating conditions. Firstly, the Hodges-Lehmann and Shamos me...
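A minimal sketch of the two robust estimators named above, under their usual definitions; the Shamos consistency factor (about 1.0483) assumes normally distributed data:

```python
import itertools
import numpy as np

def hodges_lehmann(x):
    """Robust location estimate: median of all pairwise (Walsh) averages."""
    return np.median([(a + b) / 2 for a, b in
                      itertools.combinations_with_replacement(x, 2)])

def shamos(x):
    """Robust scale estimate: median of pairwise absolute differences,
    rescaled (~1.0483) for consistency with the normal std deviation."""
    pairwise = [abs(a - b) for a, b in itertools.combinations(x, 2)]
    return 1.0483 * np.median(pairwise)

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(10, 1, 48), [25.0, 30.0]])  # two gross outliers
print(hodges_lehmann(x), shamos(x))  # stay near 10 and 1 despite contamination
```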
The purpose of this paper is to establish a new reliability model for systems subject to multiple dependent competing risks. For such a system, the total degradation consists of the natural degradation amount and sudden degradation increments (SDIs) caused by random shocks arriving at the system. Most research...
For micro-manufacturing processes, constructing accurate models plays an important role in continuous quality improvement. This paper proposes an ensemble modeling technique based on 0–1 programming, in which redundant models are eliminated from a set of candidates and the selected models are then combined to construct the final ensem...
Quality improvement in micro-manufacturing processes relies on empirical models. However, if an estimated model varies from the true model because of random errors in experiments, the resulting operating conditions may be located far from the true optimal operating conditions. Using the Pareto chart, which highlights the most important among a set...
Most simulation experiments, such as supply chain or other computer simulation experiments, involve many factors; sequential bifurcation (SB) is widely used in such settings due to its high screening efficiency. However, SB usually ignores factors that are significant for dispersion effects. According to the theory o...
To resolve the problem of reliability assessment for a single-unit system affected by natural degradation and random shocks, a reliability model based on two competing failure processes is developed. The model takes into account the impact of shocks on degradation, that is, the increase in degradation resulting from random shocks. The...
The basic underlying assumption in robust design is that the experimental data have a normal distribution. However, in many practical cases, the experimental data may actually have an underlying distribution that is not normal. The existence of model departure can have a significant effect on the optimal operating condition estimates of the control...
Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers’ preference information...
Aiming at risk-averse stochastic simulation optimization problems with multiple responses, a mean-conditional value at risk optimization strategy based on the Kriging model was proposed by combining the robust design approach with the conditional value at risk criterion. Kriging models for the mean response and the conditional value at risk response w...
A satisfactory parameter design approach based on confidence intervals is proposed to solve the model uncertainty problem of multivariate quality characteristics. The response models of the mean and standard deviation are built by using the dual response surface method, and then their confidence interval expressions are obtained. The satisfactory sets w...
In most engineering problems, model uncertainty is inevitably involved in robust parameter design. An ensemble of surrogates based on the encompassing test (ET-EOS) is proposed to account for model uncertainty in response surface modeling. Firstly, sub-surrogates are selected according to the practical problem and the characteristics of the models, then diff...
Multi-response surface (MRS) optimization in quality design often involves some problems such as correlation among multiple responses, robustness measurement of multivariate process, confliction among multiple goals, prediction performance of the process model and the reliability assessment for optimization results. In this paper, a new Bayesian ap...
In robust design, it is common to estimate empirical models that relate an output response variable to controllable input variables and uncontrollable noise variables from experimental data. However, when determining the optimal input settings that minimise output variability, parameter uncertainties in noise factors and response models are typical...
Because manufacturers want to achieve the lowest total costs, increase productivity, and improve product quality, the selection of the optimal process target is one of the key research problems. The traditional method for solving this problem involves using a quality loss function, for example regression analysis based on historical data concerning cus...
Multi-response optimization methods rely on empirical process models based on the estimates of model parameters that relate response variables to a set of design variables. However, in determining the optimal conditions for the design variables, model uncertainty is typically neglected, resulting in an unstable optimal solution. This paper proposes...
The desirability function approach is very popular for multiresponse optimization problems. However, the approach ignores the correlations among multiple responses and does not consider how to reasonably determine the relative weights of multiple responses. In this paper, an integrative desirability function approach is proposed to simultaneously consi...
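As background, the classic Derringer–Suich desirability construction that such approaches build on; note that it treats responses independently, which is exactly the correlation blindness the abstract criticizes. The targets and limits below are hypothetical:

```python
import numpy as np

def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Derringer-Suich two-sided desirability: 1 at the target, falling
    to 0 at the acceptability limits low and high."""
    if y < low or y > high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

def overall_desirability(ds):
    """Unweighted composite: geometric mean of individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

d1 = desirability_target(9.8, low=8, target=10, high=12)
d2 = desirability_target(49.0, low=45, target=50, high=55)
print(overall_desirability([d1, d2]))
```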
The loss function approach is effective for multiresponse optimization. However, previous loss function approaches ignore model uncertainty and decision-makers' regret effect when determining the optimal input settings. In this paper, an overall loss function is proposed to optimize correlated multiple responses via confidence intervals, and we refer to...
The quality level measurement of a given process is essential to six sigma process improvement. Indicators such as rolled throughput yield and six sigma level have been applied to estimate the efficiency of the total process in a certain organization. As an effective indicator, however, rolled throughput yield has not been thoroughly explored in the scarce stud...
This paper discusses how to consider the objective information of the responses of interest in the course of new product design and development when subjective information cannot be accurately expressed by decision-makers. A modified desirability function approach is proposed to achieve robustness and optimization for multi-response optimization (MRO) pro...
Aiming at the robust parameter design problem under model uncertainty, a Bayesian Model Averaging based on Effect Principle (BMA-EP) robust design methodology was proposed on the basis of the Bayesian model averaging method, taking the factor effect principle into account. Using prior information and Bayes' rule, the main effects' posterior probabilit...
When the demand is sensitive to retail price, revenue sharing contracts and two-part tariff contracts have been shown to be able to coordinate supply chains with risk-neutral agents. We extend the previous studies to consider a risk-averse retailer in a two-echelon fashion supply chain. Based on the classic mean-variance approach in finance, the issu...