Figure 2 - uploaded by Takuya Iwanaga
The six major themes of 'challenges and outlook' in the theory, methods and application of SA.

Source publication
Article
Full-text available
Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet to be fully realized, both for advancing mechanistic and data-driven modeling of human and natural systems, and in support of decision making. In this perspective paper, a multidisciplinary group of...

Contexts in source publication

Context 1
... the significant progress and popularity of sensitivity analysis (SA) in recent years, it is timely to revisit the fundamentals of this relatively young research area, identify its grand challenges and research gaps, and probe into the ways forward. To this end, the multidisciplinary authorship team has identified six major themes of 'challenges and outlook', as outlined in Figure 2. In the following, we discuss our perspective on the past, present and future of SA under each theme in a dedicated section. ...
Context 2
... Much work is needed to realize the tremendous untapped potential of SA for mathematical modeling of socio-environmental and other societal problems which are confounded by uncertainty (Section 3.2). SA can help with the management of uncertainty by (1) characterizing how models and the underlying real-world systems work, (2) identifying the adequate level of model complexity for the problem of interest, and (3) pointing to possible model deficiencies and non-identifiability issues, as well as where to invest to reduce critical uncertainties. ...

Similar publications

Article
Our study is keyed to the development of a methodological approach to assess the workflow and performance associated with the operation of a crude-oil desalting/demulsification system. Our analysis is data-driven and relies on the combined use of (a) Global Sensitivity Analysis (GSA), (b) machine learning, and (c) rigorous model discrimination/iden...

Citations

... Single-model global sensitivity analysis (SM-GSA) makes it possible to diagnose model behavior by assessing the relationships and interplay between model components and by quantifying the relative contribution of parametric uncertainties to model-based results, within a theoretical framework based on the use of a single conceptual/mathematical model. In this context, SM-GSA can further guide the implementation of various levels of potential simplification of a given conceptual model upon identification of parameters that might be uninfluential (or only minimally influential) to a target model output (Bastidas et al., 1999; Borgonovo and Plischke, 2016; Gamboa et al., 2014; Lamboni et al., 2011; Liu et al., 2004; Razavi et al., 2021; Saltelli et al., 2007; Song et al., 2015). ...
... This paper is concerned with analyzing the results of experiments or computer simulations in a design matrix of M ≥ 1 input axes (columns) and L ≥ 1 output axes (columns) over N samples (rows). Global Sensitivity Analysis (GSA) [33] examines the relevance of the various inputs to the various outputs. When pursued via ANOVA decomposition of a single output, this leads naturally to the well known Sobol' indices, which have by now been applied across most fields of science and engineering [36,20]. ...
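The route from ANOVA decomposition to Sobol' indices sketched in this excerpt can be illustrated numerically. The following is a minimal Monte Carlo sketch, assuming a toy three-input function and a Saltelli (2010)-style pick-freeze estimator; the function and sample size are illustrative, not taken from the cited papers.

```python
import numpy as np

# Minimal Monte Carlo sketch of first-order Sobol' indices for a toy
# function; the model and sample size are illustrative assumptions.
rng = np.random.default_rng(0)

def model(x):
    # additive terms plus one interaction
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(-1.0, 1.0, (n, d))
B = rng.uniform(-1.0, 1.0, (n, d))
fA, fB = model(A), model(B)
total_var = np.var(np.concatenate([fA, fB]))

first_order = []
for i in range(d):
    AB = A.copy()
    AB[:, i] = B[:, i]              # swap column i between the two samples
    # Saltelli (2010)-style estimator of V_i = Var(E[Y | X_i])
    Vi = np.mean(fB * (model(AB) - fA))
    first_order.append(Vi / total_var)
```

For this function the exact first-order indices are 3/16, 12/16 and 0, so the estimates recover the expected ranking of the inputs.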
Preprint
Full-text available
Variance-based global sensitivity analysis measures the relevance of inputs to a single output using Sobol' indices. This paper extends the definition in a natural way to multiple outputs, directly measuring the relevance of inputs to the linkages between outputs in a correlation-like matrix of indices. The usual Sobol' indices constitute the diagonal of this matrix. Existence, uniqueness and uncertainty quantification are established by developing the indices from a putative multi-output model with quantified uncertainty. Sobol' matrices and their standard errors are related to the moments of the multi-output model, to enable calculation. These are benchmarked numerically against test functions (with added noise) whose Sobol' matrices are calculated analytically.
... In particular, variance-based GSA methods like Sobol' indices aim to rank the significance of input variables based on specific rules, using variance decomposition as their foundation [26]. An in-depth review of the literature on the subject can be found in [27]. Conversely, the active subspace method enables the identification and visualization of low-dimensional latent spaces within the input-output relationship. ...
Conference Paper
Full-text available
Surrogate models are of high interest for many engineering applications, serving as cheap-to-evaluate, time-efficient approximations of black-box functions to help engineers and practitioners make decisions and understand complex systems. As such, the need for explainability methods is rising, and many studies have been performed to facilitate knowledge discovery from surrogate models. In response to these enquiries, this paper introduces SMT-EX, an enhancement of the open-source Python Surrogate Modeling Toolbox (SMT) that integrates explainability techniques into a state-of-the-art surrogate modelling framework. More precisely, SMT-EX includes three key explainability methods: Shapley Additive Explanations, Partial Dependence Plots, and Individual Conditional Expectations. A dedicated explainability dependency for SMT has been developed for this purpose; it can be easily activated once the surrogate model is built, offering a user-friendly and efficient tool for swift insight extraction. The effectiveness of SMT-EX is showcased through two test cases. The first case is a 10-variable wing weight problem with purely continuous variables, and the second is a 3-variable mixed-categorical cantilever beam bending problem. Relying on SMT-EX analyses for these problems, we demonstrate its versatility in addressing a diverse range of problem characteristics. SMT-Explainability is freely available on GitHub.
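For readers unfamiliar with partial dependence, the idea behind one of the three methods listed above can be hand-rolled in a few lines. This sketch uses an analytic stand-in for a fitted surrogate rather than SMT-EX's actual API, so all names and values are illustrative assumptions:

```python
import numpy as np

# Hand-rolled partial dependence sketch. PDP averages the model
# prediction over the data while sweeping one feature across a grid.
# The "surrogate" below is a stand-in function, not a fitted model.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (300, 3))

def surrogate(X):
    # stand-in for a fitted surrogate model
    return X[:, 0] ** 2 + 0.5 * X[:, 1]

grid = np.linspace(-1.0, 1.0, 21)
pdp = []
for v in grid:
    Xv = X.copy()
    Xv[:, 0] = v                   # fix feature 0 at the grid value v
    pdp.append(surrogate(Xv).mean())
pdp = np.array(pdp)
```

Because feature 0 enters the stand-in model quadratically, the resulting PDP curve is U-shaped with its minimum at the center of the grid, exactly the kind of marginal-effect insight these plots are meant to expose.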
... Global sensitivity analysis (GSA) can provide information on the dependence of the model output on each of its input parameters [44,45,51]. An advantage of GSA methods is that they employ screening or variance decomposition to overcome the limitations of local analysis. ...
Article
Full-text available
Obesity and diabetes are diseases whose prevalence is increasing every year worldwide, and their control is an important problem faced by health systems. In this work, we present an optimal control problem based on a model for overweight and obesity and their impact on the diagnosis of diabetes, using fractional-order derivatives in the Caputo sense. The controls are defined with the objective of preventing the evolution of an individual from normal weight to overweight and from overweight to chronic obesity. We show the existence of an optimal control using Pontryagin's maximum principle. We perform a global sensitivity study of the model using Sobol' indices of first, second and total order, computed via polynomial chaos expansion (PCE) with two techniques for finding the polynomial coefficients, ordinary least squares (OLS) and least angle regression (LAR), and two sampling methods, Monte Carlo and Sobol'. From the obtained results, we find that the parameters with the greatest influence include those used in the definition of the control system. The best results are achieved when all three controls are activated. However, activating only two controls shows better results in preventing a person of normal weight from becoming overweight, by controlling weight gain due to social pressure and the evolution from overweight to obesity. All strategies significantly reduce the number of cases diagnosed with diabetes over time.
... While uncertainty quantification is a broad concern across various fields, it is particularly critical in engineering complex systems with multiple uncertainty sources [48,49] affecting simulation results. This study employs a local sensitivity analysis [50], investigating the impact of ±5% variations in J-C plasticity and damage parameters on the predicted structural response. A local sensitivity analysis evaluates the influence of deterministic model parameters on the responses of interest [46]. ...
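The ±5% one-at-a-time scheme described here can be sketched as follows; the toy response function and parameter names are hypothetical, not the Johnson-Cook model from the paper:

```python
# Minimal sketch of a one-at-a-time local sensitivity analysis:
# each parameter is perturbed by +/-5% around a nominal point and
# a central-difference relative sensitivity is recorded. The toy
# response and parameter names are illustrative assumptions.
def response(params):
    A, B, n = params["A"], params["B"], params["n"]
    strain = 0.1
    return A + B * strain ** n     # toy flow-stress-like expression

nominal = {"A": 1000.0, "B": 800.0, "n": 0.4}
base = response(nominal)

sensitivity = {}
for name in nominal:
    hi = dict(nominal); hi[name] = nominal[name] * 1.05
    lo = dict(nominal); lo[name] = nominal[name] * 0.95
    # relative sensitivity: (d output / output) per (d param / param)
    sensitivity[name] = (response(hi) - response(lo)) / (2 * 0.05 * base)
```

Ranking the resulting values identifies which parameters dominate the local response, which is the purpose of the ±5% study described in the excerpt.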
Article
Full-text available
Titanium alloys, such as Ti-6Al-4V, are crucial for aeroengine structural integrity, especially during high-energy events like turbine blade-out scenarios. However, accurately predicting their behavior under such conditions requires the precise calibration of constitutive models. This study presents a comprehensive sensitivity analysis of the Johnson-Cook plasticity and progressive damage model parameters for Ti-6Al-4V in blade containment simulations. Using finite element models, key plasticity parameters (yield strength (A), strain-hardening constant (B), strain-rate sensitivity (C), thermal softening coefficient (m), and strain-hardening exponent (n)) and damage-related parameters (d1, d2, d3, d4, and d5) were systematically varied by ±5% to assess their influence on stress distribution, plastic deformation, and damage indices. The results indicate that the thermal softening coefficient (m) and the strain rate hardening coefficient (C) exhibit the most significant influence on the predicted casing damage, highlighting the importance of accurately characterizing these parameters. Variations in yield strength (A) and strain hardening exponent (n) also notably affect stress distribution and plastic deformation. While the damage evolution parameters (d1–d5) influence the overall damage progression, their individual sensitivities vary, with d1 and d4 showing more pronounced effects compared to others. These findings provide crucial guidance for calibrating the Johnson-Cook model to enhance aeroengine structural integrity assessments.
... Correlation analysis refers to the analysis of two or more correlated variables so as to measure the degree of correlation between them, but it does not involve establishing causation. Sensitivity analysis, by contrast, focuses on the direct influence of input variables on model output variables and does involve causal attribution (Razavi et al. 2021). In research related to building performance analysis, sensitivity analysis is often used to quantitatively evaluate the influence of design parameters on building performance, so as to identify the parameters with greater influence and improve the building design (Pang et al. 2020). ...
Article
Full-text available
Under the emerging trend of new power systems, enhancing the energy flexibility of air conditioning loads to promote electricity demand response is crucial for regulating the real-time balance. As typical temperature-controlled loads, air conditioning loads can generate a rebound effect when participating in demand response, resulting in sudden load increases and posing risks to grid security. However, existing research mainly focuses on energy flexibility, which leads to an imperfect demand response mechanism and thus affects the optimal scheduling strategy. Therefore, this study proposes a comprehensive quantification method for the demand response performance of air conditioning loads that accounts for the rebound effect, using probability distributions, Latin hypercube sampling, Monte Carlo simulation, and scenario analysis. The demand response event was divided into a response phase and a recovery phase, and by considering energy flexibility during the response phase and the rebound effect during the recovery phase, three dimensionless evaluation indexes for comprehensive demand response performance were constructed. Using this quantification method, the impact patterns of three types of random variables were compared: meteorological, design, and control variables. Additionally, considering differences in building type (office and hotel buildings) and building capacity (small, medium, and large), the effectiveness of air conditioning load participation in demand response measures in different building application scenarios was explored. The results show that the influence of the design variables on response performance is smaller than that of the control variables, but still significant, reaching 45% of that of the control variables.
Moreover, the influence varies with building type, capacity and climate zone, and building demand response design has more potential in the following scenarios: the cold climate, the hot summer and cold winter climate, the medium building and the hotel building.
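The Latin hypercube sampling step mentioned above can be sketched with SciPy's `qmc` module; the three inputs and their ranges are invented for illustration, not taken from the study:

```python
import numpy as np
from scipy.stats import qmc

# Sketch of the sampling step: Latin hypercube samples over three
# hypothetical uncertain inputs (names and ranges are assumptions),
# scaled from the unit cube to physical ranges.
sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=200)                 # 200 points in [0, 1)^3

# bounds: outdoor temperature [degC], setpoint [degC], COP [-]
lower = [26.0, 22.0, 2.5]
upper = [38.0, 28.0, 4.5]
scenarios = qmc.scale(unit, lower, upper)    # shape (200, 3)
```

Each of the 200 equal-width strata along every dimension receives exactly one sample, which is what gives LHS better coverage than plain Monte Carlo at the same budget.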
... The variety of global sensitivity analysis methods originates from the fact that the influence of a parameter on the quantity of interest can be expressed by various mathematical measures, such as the proportion of variance, correlation, regression coefficients, conditional probability density functions, etc. Consequently, over the last couple of decades, sensitivity analysis has evolved into a distinct discipline (Razavi et al. 2021) encompassing a vast array of methods and approaches. There are also several variants of the sensitivity analysis problem setting (Saltelli and Tarantola 2002; Borgonovo and Plischke 2016; Pianosi et al. 2016; Swiler et al. 2021). ...
... This sample is usually constructed in a specific way, using the Sobol' low-discrepancy quasi-random sequence (Saltelli et al. 2010). The Sobol' sequence performs significantly better than random sampling at evenly exploring the parametric space and also allows for the sequential addition of new points, which is beneficial when the sufficient sample size is not known in advance (Razavi et al. 2021; Kucherenko et al. 2015). ...
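The two claimed advantages of the Sobol' sequence, even space filling and sequential extension, can both be seen in a short SciPy sketch (dimensions, sizes and seeds are arbitrary):

```python
import numpy as np
from scipy.stats import qmc

# Sobol' quasi-random points vs pseudo-random points in the unit
# square; sizes and seeds are arbitrary illustrative choices.
sobol = qmc.Sobol(d=2, scramble=True, seed=7)
first = sobol.random_base2(m=8)         # 256 points
more = sobol.random(256)                # sequential extension to 512 total

rng = np.random.default_rng(7)
random_pts = rng.random((256, 2))

# Centered L2 discrepancy: lower values mean more even space filling.
d_sobol = qmc.discrepancy(first)
d_rand = qmc.discrepancy(random_pts)
```

The discrepancy of the Sobol' sample comes out lower than that of the pseudo-random sample of the same size, and the generator state lets new points be appended without discarding the old ones.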
Article
Full-text available
Sensitivity analysis is a crucial step in the development of computational models for any complex system, as it allows for comparison of the relative influence of model parameters on the simulation results. Its role is even more critical in the context of numerical safety assessment for future geological repositories of radioactive waste, since it provides insights into the processes relevant to safety, establishes grounds for prioritizing additional research, and enhances confidence in the safety assessment results. Unfortunately, groundwater flow and radionuclide transport models for radioactive waste repositories often become quite detailed and computationally expensive during the iterative process of the safety assessment. Moreover, any meaningful statistical analysis of the simulation results requires hundreds, if not thousands, of model realizations at different parameter combinations. This necessity has led to the increasing popularity of sensitivity analysis methods coupled with metamodeling approaches, where a portion of the points in the parametric space (model realizations) is obtained through numerical simulation of processes, and another portion is approximated from available realizations using less computationally demanding data-driven algorithms. Modern machine learning methods also frequently address the problem of predicting a response function at new points using data from known points. Feature importance measures for machine learning models serve a role akin to sensitivity analysis: they assist in quantifying the effect of model inputs (features) on outputs (predictions). In this paper, we compare the modern first-choice variant of sensitivity analysis, variance-based Sobol' indices obtained using a polynomial chaos expansion metamodel, with well-known importance measures from the machine learning world.
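One way to see the parallel drawn here between sensitivity indices and feature importance is a permutation-importance sketch on a toy surrogate; the data, model choice, and settings below are illustrative assumptions, not the repository transport model:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Sketch of the ML-side analogue discussed above: permutation
# importance of a surrogate fitted to a toy function. The data,
# model choice and settings are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (500, 3))
y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0.0, 0.05, 500)  # X[:, 2] is inert

surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
result = permutation_importance(surrogate, X, y, n_repeats=10, random_state=0)
ranking = np.argsort(result.importances_mean)[::-1]  # most important first
```

Like a Sobol' index, the importance of each feature reflects how much of the output it drives, with the inert third input falling to the bottom of the ranking.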
... Advanced modeling techniques [44], therefore, become indispensable to fully characterize and optimize these materials for industrial use. In this context, the development of comprehensive predictive models can bridge the gap between experimental data and real-world applications, facilitating a deeper understanding of how agro-marine reinforcements impact the performance of aluminum composites in operational settings [45,46]. ...
... By isolating the key factors that drive performance, sensitivity analysis streamlines the design process, making it easier to customize the composite material to meet specific industry needs. This approach reduces unnecessary complexity in the model, while also enhancing accuracy, ultimately aiding industries in adopting agro-marine composites in a targeted and efficient manner [46,53]. ...
Article
Full-text available
Aluminum-based composites reinforced with agro-marine waste materials present an eco-friendly, cost-effective solution for industries needing lightweight and durable materials. This study develops and evaluates the mechanical properties of aluminum (AA6061) composites reinforced with plantain stem ash, eucalyptus wood ash, and periwinkle shell powder for potential aerospace, automotive, and marine applications. Composite samples were prepared according to ASTM standards for tensile, compressive, flexural, wear, and fatigue tests. Objective Test Functions (OTFs) were derived using physical modeling and sensitivity analysis to filter impactful variables. Testing was conducted using universal testing machines, and wear/friction tests followed ASTM G99 guidelines. The physical models showed strong correlations (adjusted R²: 0.9033–1.0000). Measured properties included tensile strength (1907.46–2000.05 Pa), compressive strength (30.33–215.14 Pa), flexural strength (412.72–556.42 N-m²), wear rate (55.01–63.27 m³/m), coefficient of friction (0.102), and fatigue cycle (13.81–27.63 cycles). Except for the coefficient of friction, all results were consistent with the developed OTFs, confirming the material’s structural integrity. The composites displayed favourable properties for use in aerospace, automotive, and marine industries, offering strength, lightweight characteristics, and corrosion resistance. The OTFs validated the material’s performance, making it a viable alternative to conventional materials. Consequently, aluminum-agro-marine waste composites demonstrate excellent mechanical properties, with high potential for industrial applications.
... Parametric methods like ANOVA are powerful but demand strict normality and homoscedasticity assumptions that are often violated in CI and DM experiments (Niankara, 2024; Sanchis-Segura & Wilcox, 2024; Yu et al., 2022). This has resulted in widespread misinterpretation or underestimation of algorithm performance, particularly in complex, real-world applications (Berger et al., 2024; Razavi et al., 2021). Given these limits, parametric methods are increasingly regarded as the less robust option, and nonparametric methods that do not make such onerous assumptions have been gaining traction. ...
Article
Full-text available
Objective: With the aim of improving the reliability and interpretability of statistical tests in CI and DM experiments, we evaluate the performance of cutting-edge nonparametric tests and post hoc procedures. Methods: The Friedman Aligned Ranks test, the Quade test, and multiple post hoc corrections (Bonferroni-Dunn and Holm) were used for comparative analysis. These approaches were applied to algorithm performance metrics on varied datasets to evaluate their capability to detect meaningful differences and control Type I errors. Results: Advanced nonparametric methods consistently outperformed traditional parametric tests, offering robust results on heterogeneous datasets. The Quade test was the most powerful and stable, and the post hoc procedures greatly increased the power of the pairwise comparisons. Novelty: We evaluate advanced nonparametric methods in CI and DM experiments: the Friedman Aligned Ranks test, the Quade test, and post hoc procedures (Bonferroni-Dunn and Holm). These methods represent a departure from traditional parametric tests that depend on assumptions of normality and homogeneity of variance, allowing for more flexible and robust analyses of complex, heterogeneous datasets. By comparing the strength and efficacy of these methods, the research also delivers common guidelines for their use, as well as demonstrating their utility in realistic situations characterized by non-standard and dispersed data. Implications for Research: The findings have far-reaching theoretical and pragmatic implications for scholars in CI and DM. On a theoretical level, this work challenges the common bias towards parametric techniques, providing a more robust framework for comparative analysis in experimental research.
This work improves understanding of how statistical tests can be adapted to the complexities of real-world data by highlighting the advantages of advanced nonparametric methods, specifically the Quade test and post hoc corrections. Practical implications: The results provide actionable recommendations that will assist researchers in selecting statistical methods tuned to the nature of their datasets, resulting in improved reliability and interpretability of future algorithm evaluations. Thus, this endeavor will promote more powerful and statistically appropriate methods in CI and DM studies, leading to more confident and valid claims about algorithmic performance.
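As a concrete illustration of the nonparametric workflow discussed above, SciPy ships the plain Friedman test (the Aligned Ranks and Quade variants are not in SciPy and are omitted here); the algorithm scores below are synthetic:

```python
import numpy as np
from scipy import stats

# Friedman test on synthetic per-dataset scores of three hypothetical
# algorithms; all data here are invented for illustration.
rng = np.random.default_rng(1)
n_datasets = 30
base = rng.normal(0.80, 0.05, n_datasets)             # shared dataset difficulty
alg_a = base + rng.normal(0.00, 0.01, n_datasets)
alg_b = base + rng.normal(0.03, 0.01, n_datasets)     # consistently better
alg_c = base + rng.normal(0.00, 0.01, n_datasets)

stat, p = stats.friedmanchisquare(alg_a, alg_b, alg_c)
# A small p-value justifies post hoc pairwise comparisons with a
# Bonferroni-Dunn or Holm correction.
```

Because the test ranks algorithms within each dataset, the shared difficulty term cancels out, which is exactly why rank-based tests cope with heterogeneous datasets better than raw-score ANOVA.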
... It allows for a quantitative or qualitative description of how uncertainty in system parameters influences the output results [29,30]. Morris sensitivity analysis [31], a global sensitivity analysis method, is commonly used to identify the primary influencing factors among the input parameters of the model. This technique involves discretizing the values of each parameter under study and normalizing them within the range [0, 1]. ...
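The Morris procedure described here (discretize, normalize to [0, 1], compute elementary effects) can be sketched in a simplified one-at-a-time form; the toy model, step size, and repetition count are illustrative assumptions, not the valve model:

```python
import numpy as np

# Simplified Morris-style elementary effects on a toy model whose
# inputs are normalized to [0, 1]; the function, step size and
# repetition count are illustrative assumptions.
rng = np.random.default_rng(3)

def model(x):
    return 4.0 * x[0] + 0.5 * x[1] + 3.0 * x[0] * x[2]

d, r, delta = 3, 50, 0.25
effects = np.zeros((r, d))
for t in range(r):
    x = rng.uniform(0.0, 1.0 - delta, size=d)   # base point; step stays in [0, 1]
    y0 = model(x)
    for i in range(d):
        xp = x.copy()
        xp[i] += delta                          # one-at-a-time step
        effects[t, i] = (model(xp) - y0) / delta
mu_star = np.abs(effects).mean(axis=0)          # mean absolute elementary effect
```

Ranking inputs by `mu_star` identifies the primary influencing factors; here the first input dominates through both its linear and interaction terms, the third matters through the interaction only, and the second is weakest.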
Article
Full-text available
The deflector jet pressure servo valve (DJPSV), a critical component of the aircraft brake servo system, requires a precise foundational model for performance analysis, optimization, and enhancement. However, the complexity of the jet process within the V-groove of the deflector plate presents challenges for accurate mathematical modeling. To address this issue, this paper takes the DJPSV as the research object, carries out detailed mathematical modeling of its components, analyzes the factors influencing the performance of the key component (the pre-stage), and optimizes the design of the key factors. First, integrating FLUENT velocity field analysis, this study proposes a novel perspective to rationally simplify and parametrically model the injection process in 3D space. Subsequently, a systematic derivation of the mathematical model for the DJPSV is undertaken. Employing the AMESim platform and the secondary development module AMESet, a comprehensive simulation model is constructed, facilitating the study of static and dynamic valve characteristics. Additionally, sensitivity analysis and structural optimization of the critical component, the pre-stage, are carried out utilizing Morris theory and an intelligent algorithm. The results reveal that the width of the receiving diverter wedge (M), the width of the V-groove outlet (b1), and the distance from the V-groove outlet to the receiving diverter wedge (h) exert the most significant influence on the differential pressure of the pre-stage; these are the key parameters affecting its output differential pressure. Experiments verify the accuracy of the simulation model, offering a vital theoretical foundation for valve development and related areas.