Conference Paper

Multifidelity Robust Topology Optimization for Material Uncertainties with Digital Manufacturing

... 7), where robustness can be defined as a combination of the mean and the variance of a QoI [12][13][14]. We use the MFMC method to solve an uncertainty propagation problem for estimating the mean and the variance of the maximum temperature of the batteries and the terminal voltage. ...
Conference Paper
Full-text available
This study addresses safety concerns within the rapidly evolving Electric Vertical Takeoff and Landing (eVTOL) aircraft domain, focusing on efficient tools to quantify uncertainties in lithium-ion battery behavior, a critical aspect of eVTOL safety. One major issue in quantifying uncertainty is the prohibitive computational cost associated with the many queries of an expensive-to-evaluate computational model. This work employs three physics-based battery models of varying fidelity and cost to estimate the mean and the variance of selected quantities of interest through a multifidelity method that reduces the computational cost. By combining information from multiple cheaper, lower-fidelity models through the Multifidelity Monte Carlo (MFMC) method, we significantly reduce the number of high-fidelity samples required for a prescribed mean-squared error, consequently reducing computational costs to a tractable level. The proposed methodology is applied to estimate the mean and the variance of the battery temperature and voltage, accounting for uncertainties in flight conditions and materials. The first example focuses on a 580-second flight and is benchmarked against standard Monte Carlo sampling. Results indicate a notable fourfold speed-up of the Multifidelity Monte Carlo method over the standard Monte Carlo method at the same mean-squared error for the voltage estimate. To showcase the method's generality, the multifidelity method is then applied to a longer flight of 3580 seconds, estimating the mean and the variance and using these statistics to approximate the probability of flight completion. This demonstrates the adaptability of the methodology to various power profiles and uncertainties, with potential extensions to any battery chemistry. In conclusion, the presented multifidelity method offers a robust approach to enhancing eVTOL safety by efficiently estimating uncertainties in battery behavior.
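As a concrete illustration of the estimator this abstract describes, below is a minimal sketch of a two-model multifidelity Monte Carlo mean estimate in Python. The toy functions `f_hi` and `f_lo` and the sample counts are illustrative assumptions standing in for the battery models; only the control-variate structure reflects the method.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_hi(z):   # stand-in for the expensive high-fidelity battery model
    return np.sin(3 * z) + 0.05 * z**2

def f_lo(z):   # cheap, correlated low-fidelity surrogate
    return np.sin(3 * z)

n_hi, n_lo = 50, 2000            # n_lo >> n_hi because f_lo is cheap
z = rng.normal(size=n_lo)        # shared samples of the uncertain input

y_hi = f_hi(z[:n_hi])
y_lo = f_lo(z)

# control-variate coefficient from the pilot (shared) samples
rho = np.corrcoef(y_hi, y_lo[:n_hi])[0, 1]
alpha = rho * y_hi.std(ddof=1) / y_lo[:n_hi].std(ddof=1)

# MFMC mean estimate: HF mean corrected by the LF discrepancy
mean_mfmc = y_hi.mean() + alpha * (y_lo.mean() - y_lo[:n_hi].mean())
print(mean_mfmc)
```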
Article
Full-text available
Uncertainty quantification (UQ) in metal additive manufacturing (AM) has attracted tremendous interest as a route to dramatically improved product reliability. Model-based UQ, which relies on the validity of a computational model, has been widely explored as a potential substitute for UQ based solely on time-consuming and expensive experiments. However, its adoption in practical AM processes requires overcoming two main challenges: (1) inaccurate knowledge of the uncertainty sources and (2) the intrinsic uncertainty associated with the computational model. Here, we propose a data-driven framework that tackles both challenges by combining high-throughput physical/surrogate model simulations with the AM-Bench experimental data from the National Institute of Standards and Technology (NIST). We first construct a surrogate model, based on high-throughput physical simulations, for predicting the three-dimensional (3D) melt pool geometry and its uncertainty with respect to AM parameters and uncertainty sources. We then employ a sequential Bayesian calibration method to perform experimental parameter calibration and model correction, significantly improving the validity of the 3D melt pool surrogate model. Applying the calibrated melt pool model to UQ of the porosity level, an important quality factor of AM parts, demonstrates its potential use in AM quality control. The proposed UQ framework is generally applicable to different AM processes, representing a significant advance toward physics-based quality control of AM products.
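The Bayesian calibration step can be illustrated with a minimal random-walk Metropolis sketch. Everything here, the quadratic stand-in surrogate, the synthetic data, the noise level, and the Gaussian prior, is an illustrative assumption, not the paper's NIST melt-pool setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(theta, x):
    # stand-in for the melt-pool surrogate: predicts a response from input x
    return theta[0] * x + theta[1] * x**2

# synthetic "experimental" data with noise (illustrative only)
x_obs = np.linspace(0.1, 1.0, 10)
y_obs = surrogate([1.0, 0.5], x_obs) + rng.normal(0, 0.05, 10)

def log_post(theta, sigma=0.05):
    # Gaussian likelihood around the surrogate plus a standard-normal prior
    resid = y_obs - surrogate(theta, x_obs)
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(theta**2)

# random-walk Metropolis over the calibration parameters
theta = np.array([0.0, 0.0])
lp = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)

print(np.mean(chain[1000:], axis=0))  # posterior mean after burn-in
```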
Article
Full-text available
This paper addresses the dependency between design parameters and random parameters within robust design optimization. If the stochastic distributions of the random input parameters are design-dependent, this dependency must be included in the gradient when gradient-based optimization methods are used. The paper provides the basic theoretical principles and two approaches for incorporating design-dependent input distributions in robust design optimization: one based on Monte Carlo sampling and another based on Taylor series expansions. Neither approach requires additional structural analyses (e.g. finite element simulations). Describing the design dependency of input distributions can, however, be a challenging task. Numerical applications to several academic examples demonstrate the potential of the proposed approaches and several implications that may emerge in the process.
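A sketch of why the design dependency enters the gradient, in generic notation of my choosing rather than the paper's: if the input distribution p(x; s) depends on the design variable s, differentiating the mean produces a distribution-sensitivity term, and the Taylor-series approach makes the dependence explicit through the design-dependent mean and covariance.

```latex
% Differentiating a design-dependent expectation (X ~ p(x; s)):
\frac{\mathrm{d}}{\mathrm{d}s}\,\mathbb{E}_{X \sim p(\cdot;s)}\!\big[f(X)\big]
  = \frac{\mathrm{d}}{\mathrm{d}s}\int f(x)\,p(x;s)\,\mathrm{d}x
  = \int f(x)\,\frac{\partial p(x;s)}{\partial s}\,\mathrm{d}x .
% First-order Taylor expansion about the design-dependent mean \mu(s),
% with design-dependent covariance \Sigma(s):
\mathbb{E}[f] \approx f\big(\mu(s)\big), \qquad
\mathrm{Var}[f] \approx \nabla f\big(\mu(s)\big)^{\!\top}\,\Sigma(s)\,
                        \nabla f\big(\mu(s)\big),
% so d\mu/ds and d\Sigma/ds enter the robust-design gradient directly.
```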
Article
Full-text available
The use of surrogate models (response surface models, curve fits) of various types (radial basis functions, Gaussian process models, neural networks, support vector machines, etc.) is now an accepted way for speeding up design search and optimization in many fields of engineering that require the use of expensive computer simulations, including problems with multiple goals and multiple domains. Surrogates are also widely used in dealing with uncertainty quantification of expensive black-box codes where there are strict limits on the number of function evaluations that can be afforded in estimating the statistical properties of derived performance quantities. Here, we tackle the problem of robust design optimization from the direction of Gaussian process models (Kriging). We contrast two previously studied models, co-Kriging and combined Kriging (sometimes called level 1 Kriging), and propose a new combined approach called combined co-Kriging that attempts to make best use of the key ideas present in these methods.
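For orientation, a minimal single-fidelity Kriging (Gaussian process) predictor is sketched below; co-Kriging and the combined variants discussed in the abstract build on this same structure with a second, correlated data source. The RBF kernel, fixed length-scale, and toy training data are assumptions for illustration.

```python
import numpy as np

def rbf(a, b, ell=0.3):
    # squared-exponential correlation between two 1-D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

x_train = np.linspace(0, 1, 8)
y_train = np.sin(2 * np.pi * x_train)        # toy "expensive simulation" data
K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))  # jitter for stability

x_new = np.linspace(0, 1, 100)
k_star = rbf(x_new, x_train)
weights = np.linalg.solve(K, y_train)
y_pred = k_star @ weights                    # Kriging mean prediction

# predictive variance (unit prior variance at zero distance)
var = 1.0 - np.einsum('ij,ji->i', k_star, np.linalg.solve(K, k_star.T))
print(y_pred[:5], var[:5])
```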
Article
Full-text available
Robust topology optimization has long been considered computationally intractable, as it combines two highly expensive computational strategies. This paper considers simultaneous minimization of the expected value and variance of compliance in the presence of uncertainties in loading magnitude, using exact formulations and analytically derived sensitivities. It shows that only a few additional load cases are required, with cost that scales polynomially in the number of uncertain load cases. The approach is implemented using the level set topology optimization method. Shape sensitivities are derived using the adjoint method. Several examples are used to investigate the effect of including variance in robust compliance optimization.
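A generic form of the mean-variance robust compliance objective discussed here, in notation of my choosing rather than the paper's exact formulation:

```latex
% Weighted mean-variance robust compliance objective over designs \Omega,
% with \boldsymbol{\xi} collecting the uncertain load magnitudes:
\min_{\Omega}\; J(\Omega) \;=\; w\,\mathbb{E}\big[c(\Omega,\boldsymbol{\xi})\big]
  \;+\; (1-w)\,\mathrm{Var}\big[c(\Omega,\boldsymbol{\xi})\big],
\qquad 0 \le w \le 1 .
% Because compliance is quadratic in the load, E[c] and Var[c] reduce to a
% small set of deterministic load cases, which is why only a few extra
% solves are needed.
```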
Article
We present an efficient multilevel Monte Carlo (MLMC) method for the topology optimization of flexoelectric structures. A flexoelectric composite consisting of flexoelectric and purely elastic building blocks is investigated. The governing equations are solved by Non-Uniform Rational B-spline (NURBS)-based isogeometric analysis (IGA), exploiting its higher-order continuity. Genetic-algorithm (GA) based integer-valued optimization is used to obtain the optimal topological design. Uncertainties in the material properties and the volume fraction of the constituents are considered in order to quantify the uncertainty in the electromechanical coupling effect. A multilevel hierarchy of computational meshes is then obtained by uniform refinement according to a geometric sequence. We estimate the growth rate of the simulation cost, in addition to the rates of decay in the expectation and the variance of the differences between approximations over the hierarchy. Finally, we determine the minimum number of simulations required on each level to achieve the desired accuracy at different prescribed error tolerances. The results show that the proposed method reduces the computational cost of the numerical experiments without loss of accuracy; the overall computational saving factor was in the range 2.0–3.5.
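For reference, the standard MLMC telescoping identity and the usual sample allocation that balances the per-level correction variances V_l against the per-level costs C_l; the abstract's decay and growth rates play the roles of V_l and C_l here. Notation is generic, not the paper's.

```latex
% MLMC telescoping estimator over levels l = 0, ..., L:
\mathbb{E}[Q_L] \;=\; \mathbb{E}[Q_0]
  \;+\; \sum_{l=1}^{L}\mathbb{E}\big[Q_l - Q_{l-1}\big].
% Per-level sample counts so the total sampling variance is at most
% \varepsilon^2, given correction variances V_l and costs C_l:
N_l \;=\; \Big\lceil \varepsilon^{-2}\,\sqrt{V_l / C_l}\;
          \sum_{k=0}^{L} \sqrt{V_k C_k} \Big\rceil .
```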
Article
Metamaterials are synthetic materials designed to have unusual properties such as a negative Poisson's ratio (NPR). NPR metamaterials, also known as auxetics, offer significant value in applications that require high energy absorption, e.g., packing materials, medical knee pads, and footwear. However, material uncertainty arising from manufacturing tolerances, inhomogeneity of material properties, and other factors can lead to significant variations in the response of these metamaterials. A SIMP-based robust topology optimization (RTO) design for NPR metamaterials under material uncertainty is therefore investigated. The weighted mean and variance of the deterministic objective function are combined to form a robust objective function. With deterministic topology optimization, the variation in effective Poisson's ratio with respect to the lower bound ranges from 15.40% to 105%. In contrast, RTO produces more stable designs, with a variation of only 1.72% to 2.54%. Several parametric studies demonstrate the feasibility of the proposed RTO methodology.
Article
This paper focuses on robust topology optimization for fiber-reinforced composite structures under loading uncertainty. An effective method is presented for simultaneous optimization of fiber angles and structural topology. Specifically, a new parameterization scheme is developed to obtain a continuous spatial variation of fiber angles. The solid isotropic material with penalization method is employed to obtain the material distribution. The Monte Carlo simulation method is adopted to handle uncertainty in loading magnitude and direction with probabilistic distributions. Subject to a volume fraction constraint, the problem of minimizing a weighted sum of the mean and standard deviation of structural compliance is investigated, and a sensitivity analysis is conducted. To reduce the computational cost, a Kriging metamodel is constructed to supply objective values and sensitivity information. Numerical examples demonstrate the effectiveness of the proposed method.
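A minimal sketch of the sampling step this abstract describes: a weighted mean-plus-standard-deviation objective estimated by Monte Carlo over an uncertain load direction. The closed-form `compliance` stand-in and the weight are illustrative assumptions; in the actual method each sample is a finite element solve (or a Kriging prediction of one).

```python
import numpy as np

rng = np.random.default_rng(2)

def compliance(theta):
    # toy stand-in for FE compliance of a fixed design under a unit load
    # applied at angle theta (the real evaluation is an FE analysis)
    return 1.0 + 0.4 * np.cos(theta)**2

w = 0.7                                      # weight between mean and std
theta_samples = rng.normal(0.0, 0.2, 1000)   # uncertain load direction (rad)
c = compliance(theta_samples)

# weighted-sum robust objective: w * mean + (1 - w) * standard deviation
J_robust = w * c.mean() + (1 - w) * c.std(ddof=1)
print(J_robust)
```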
Article
Uncertainty quantification (UQ) includes the characterization, integration, and propagation of uncertainties that result from stochastic variations and from a lack of knowledge or data about the natural world. The Monte Carlo (MC) method is a sampling-based approach that has been widely used for the quantification and propagation of uncertainties. However, the standard MC method is often time-consuming when the simulation-based model is computationally intensive. This article gives an overview of modern MC methods that address the existing challenges of standard MC in the context of UQ. Specifically, multilevel Monte Carlo (MLMC), extending the concept of control variates, achieves a significant reduction of the computational cost by performing most evaluations with low accuracy and correspondingly low cost, and relatively few evaluations at high accuracy and correspondingly high cost. Multifidelity Monte Carlo (MFMC) accelerates the convergence of standard Monte Carlo by generalizing the control variates to models of varying fidelity and varying computational cost. The multimodel Monte Carlo method (MMMC), which operates in a different setting from MLMC and MFMC, addresses UQ and propagation when the data available for characterizing probability distributions are limited; multimodel inference combined with importance sampling is proposed for quantifying and efficiently propagating the uncertainties resulting from small data sets. All three modern MC methods achieve a significant improvement in computational efficiency for probabilistic UQ, particularly uncertainty propagation. An algorithm summary and a corresponding code implementation are provided for each method, and their extensions and applications are discussed in detail. This article is categorized under: Statistical and Graphical Methods of Data Analysis > Monte Carlo Methods; Statistical and Graphical Methods of Data Analysis > Sampling.
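A minimal MLMC sketch in Python, consistent with the description above: most samples are drawn on the cheap coarse level, and each finer level contributes only a low-variance correction evaluated on shared samples. The toy `model` and the level/sample choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(z, h):
    # stand-in for a discretized simulation with mesh size h:
    # smaller h -> more accurate (and, in practice, more expensive)
    return np.sin(z) + h * np.cos(5 * z)

levels = [0.5, 0.25, 0.125, 0.0625]   # mesh sizes h_0 > h_1 > ...
n_per_level = [4000, 1000, 250, 60]   # fewer samples on costlier levels

estimate = 0.0
for l, (h, n) in enumerate(zip(levels, n_per_level)):
    z = rng.normal(size=n)
    if l == 0:
        estimate += model(z, h).mean()          # coarse-level mean
    else:
        # telescoping correction between consecutive levels, evaluated on
        # the SAME samples so the difference has low variance
        estimate += (model(z, h) - model(z, levels[l - 1])).mean()

print(estimate)
```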
Article
Robust design optimization (RDO) is a field of optimization in which a certain measure of robustness is sought in the presence of uncertainty. Unlike conventional optimization, RDO requires significantly more function evaluations, which often renders it time-consuming and computationally cumbersome. This paper presents two new methods for solving RDO problems. The proposed methods couple a differential evolution algorithm (DEA) with polynomial correlated function expansion (PCFE): DEA is used to solve the optimization problem, while PCFE is used to calculate the statistical moments. Three examples illustrate the performance of the proposed approaches. The results indicate that the proposed approaches provide accurate and computationally efficient estimates for RDO problems, and that they outperform popular RDO techniques such as tensor product quadrature, Taylor series, and Kriging. Finally, the proposed approaches are applied to robust hydroelectric flow optimization, demonstrating their capability for solving large-scale problems.
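A hedged sketch of the overall loop: a differential evolution outer optimizer wrapped around a sampled robust objective. For simplicity this swaps PCFE for plain Monte Carlo moments with common random numbers; the toy `performance` function, bounds, and robustness weight are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
xi = rng.normal(1.0, 0.1, 500)   # fixed uncertain-parameter samples (common random numbers)

def performance(x, xi):
    # toy stand-in for the system response under uncertainty xi
    return (x[0] - 2.0)**2 * xi + x[1]**2

def robust_objective(x, k=3.0):
    # mean + k * std robustness measure, estimated by Monte Carlo
    f = performance(x, xi)
    return f.mean() + k * f.std(ddof=1)

result = differential_evolution(robust_objective,
                                bounds=[(-5, 5), (-5, 5)], seed=0)
print(result.x)   # robust optimum of the toy problem
```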
Article
Based on our experience gained from uncertainty quantification (UQ) of traditional manufacturing, this paper discusses UQ for additive manufacturing (AM) with a focus on the prediction of material properties. Applications of UQ methods in traditional manufacturing are briefly summarized first. On that basis, we investigate how state-of-the-art UQ techniques can be applied to the AM process to quantify the uncertainty in material properties due to various sources of uncertainty. The UQ of the ultimate tensile strength of a structure obtained from laser sintering of nanoparticles is used as an example to illustrate the proposed UQ framework.
Article
It is important to design robust and reliable systems by accounting for uncertainty and variability in the design process. However, performing optimization in this setting can be computationally expensive, requiring many evaluations of the numerical model to compute statistics of the system performance at every optimization iteration. This paper proposes a multifidelity approach to optimization under uncertainty that makes use of inexpensive, low-fidelity models to provide approximate information about the expensive, high-fidelity model. The multifidelity estimator is developed based on the control variate method to reduce the computational cost of achieving a specified mean square error in the statistic estimate. The method optimally allocates the computational load between the two models based on their relative evaluation cost and the strength of the correlation between them. This paper also develops an information reuse estimator that exploits the autocorrelation structure of the high-fidelity model in the design space to reduce the cost of repeatedly estimating statistics during the course of optimization. Finally, a combined estimator incorporates the features of both the multifidelity estimator and the information reuse estimator. The methods demonstrate 90% computational savings in an acoustic horn robust optimization example and practical design turnaround time in a robust wing optimization problem.
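The control-variate structure the abstract describes can be summarized in standard two-model form (notation mine, not the paper's):

```latex
% Two-model control-variate mean estimator: n high-fidelity samples,
% m >> n low-fidelity samples (the n samples are shared):
\hat{s} \;=\; \bar{y}_{\mathrm{hi},n}
  \;+\; \alpha\big(\bar{y}_{\mathrm{lo},m} - \bar{y}_{\mathrm{lo},n}\big),
\qquad
\alpha^{\star} \;=\; \rho\,\frac{\sigma_{\mathrm{hi}}}{\sigma_{\mathrm{lo}}} .
% The optimal low-/high-fidelity sample ratio balances the per-sample
% costs w and the model correlation \rho:
\frac{m}{n} \;=\; \sqrt{\frac{w_{\mathrm{hi}}}{w_{\mathrm{lo}}}\,
                        \frac{\rho^{2}}{1-\rho^{2}}} .
```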
Article
Shannon-type expected information gain can be used to evaluate the relevance of a proposed experiment subject to uncertainty. Estimating this gain, however, relies on a double-loop integration whose numerical evaluation in multi-dimensional cases, e.g., with Monte Carlo sampling methods, is computationally too expensive for realistic physical models, especially those involving the solution of partial differential equations. In this work, we present a new methodology, based on the Laplace approximation for the integration of the posterior probability density function (pdf), to accelerate the estimation of the expected information gains in the model parameters and predictive quantities of interest. We obtain a closed-form approximation of the inner integral and the corresponding dominant error term for the case where the parameters are determined by the experiment, so that only a single-loop integration is needed to estimate the expected information gain. To deal with the issue of dimensionality in complex problems, we use a sparse quadrature for the integration over the prior pdf. We demonstrate the accuracy, efficiency, and robustness of the proposed method on several nonlinear numerical examples, including the design of the scalar parameter in a one-dimensional cubic polynomial function, the design of the same scalar in a modified function with two indistinguishable parameters, the resolution width and measurement time for a blurred single-peak spectrum, and the boundary source locations for impedance tomography in a square domain.
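The double-loop structure in question, in generic notation (mine, not the paper's): the inner integral is the evidence p(y), and the Laplace approximation replaces the posterior with a Gaussian so that this inner integral becomes closed-form.

```latex
% Shannon expected information gain (EIG) as a double-loop integral:
\mathrm{EIG} \;=\; \int_{\Theta} p(\theta) \int_{\mathcal{Y}}
   p(y \mid \theta)\,
   \ln\!\frac{p(y \mid \theta)}{p(y)}\,\mathrm{d}y\,\mathrm{d}\theta,
\qquad
p(y) \;=\; \int_{\Theta} p(y \mid \theta')\,p(\theta')\,\mathrm{d}\theta' .
% Laplace approximation: the posterior is replaced by a Gaussian
% N(\hat{\theta}(y), \Sigma(\hat{\theta}(y))), so the inner integral has a
% closed form and only a single (outer) loop remains.
```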
Article
In this study, a robust topology optimization method is proposed for compliant mechanisms that considers the effect of variation in the input load direction on the output displacement. Variations are evaluated through a sensitivity-based robust optimization approach, with the variance evaluated using first-order derivatives. The robust objective function combines maximizing the output deformation under the mean input load with minimizing the variation in the output deformation as the input load varies; the variance due to load changes is obtained through mutual compliance and a pseudo load. For the topology optimization, a modified homogenization design method using the continuous approximation assumption of material distribution is adopted. The validity of the proposed method is confirmed with two compliant mechanism design problems, and the effect of varying the input load direction on the obtained configurations is investigated by comparison with deterministic optimum topology designs.
Article
A computational strategy is proposed for robust structural topology optimization in the presence of uncertainties with known second-order statistics. The strategy combines deterministic topology optimization techniques with a perturbation method for quantifying uncertainties associated with structural stiffness, such as uncertain material properties and/or structure geometry. The use of perturbation transforms the problem of topology optimization under uncertainty into an augmented deterministic topology optimization problem. This leads to significant computational savings compared with Monte Carlo-based optimization algorithms, which involve multiple formations and inversions of the global stiffness matrix. Examples from truss structures show the importance of controlling variability in the final design. Results obtained from the proposed method are also shown to be in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
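A first-order perturbation sketch consistent with the abstract, in generic notation (mine): with uncertain parameters of mean and covariance known up to second order,

```latex
% First-order perturbation of a response c(\theta) about the mean
% parameter vector \bar{\theta} with covariance \Sigma:
\mathbb{E}[c] \;\approx\; c(\bar{\theta}),
\qquad
\mathrm{Var}[c] \;\approx\; \nabla c(\bar{\theta})^{\top}\,\Sigma\,
                            \nabla c(\bar{\theta}) .
% The stochastic problem thus becomes an augmented deterministic one that
% needs only gradients of c, not repeated stiffness-matrix formations
% and inversions as in Monte Carlo-based optimization.
```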