## About

- Publications: 171
- Reads: 39,680
- Citations: 6,691

## Publications

We examine the construction of variable importance measures for multivariate responses using the theory of optimal transport. We start with the classical optimal transport formulation. We show that the resulting sensitivity indices are well-defined under input dependence, are equal to zero under statistical independence, and are maximal under fully...

We consider a model for the resilience analysis of interconnected critical infrastructures (ICIs) that describes the dependencies among the subsystems within the ICIs and their time-varying behavior. The model response is a function of uncertain inputs comprising ICIs design parameters and failure magnitudes of vulnerable elements in the system, et...

Recent studies have emphasized the connection between machine learning feature importance measures and total order sensitivity indices (total effects, henceforth). Feature correlations and the need to avoid unrestricted permutations make the estimation of these indices challenging. Additionally, there is no established theory or approach for non-Ca...

Decision makers often leverage causal and predictive scientific models, enabling them to better estimate the behavior of a physical or social system and explore several possible outcomes of a policy before actually enacting it. However, these models are only useful insofar as decision makers can effectively interpret the model behaviors, their outp...

Climate change and COVID-19 have brought mathematical models into the forefront of politics and decision-making, where they are now being used to justify momentous and often controversial decisions. Such models are technically very complex, and sources of political authority. Yet disagreement among experts fuels a growing uneasiness about the quali...

Information value has been proposed and used as a probabilistic sensitivity measure, the idea being that uncertain parameters having higher information value are precisely those to which an optimal decision is more sensitive. In this paper, we study the notion of information density as a graphical complement to information value analysis, one that...

Generative models for protein sequences are important for protein design, mutational effect prediction and structure prediction. In all of these tasks, the introduction of models which include interactions between pairs of positions has had a major impact over the last decade. More recently, many methods going beyond pairwise models have...

Understanding the response of a catchment is a crucial problem in hydrology, with a variety of practical and theoretical implications. Dissecting the role of sub-basins is helpful both for advancing current knowledge of physical processes and for improving the implementation of simulation or forecast models. In this context, recent advancements in...

The COVID-19 pandemic has prompted international scientific efforts to address important aspects of the pandemic. Data science and scientific modeling are extensively used to provide assessments and predictions for policy-making purposes. However, the communication of results needs to be supported by proper uncertainty quantification to assess variability...

Owen and Hoyt recently showed that the effective dimension offers key structural information about the input-output mapping underlying an artificial neural network. Along this line of research, this work proposes an estimation procedure that allows the calculation of the mean dimension from a given dataset, without resampling from external distribu...

In risky public-private partnership (PPP) projects, governments and public institutions tend to offer private investors certain guarantees for their participation. However, when public sector capacity and institutions are weak, these guarantees can generate moral hazard in the bidding process and lead to contractual renegotiations, resulting in a l...

Popular sensitivity analysis techniques such as Tornado Diagrams or the Morris method are based on one-at-a-time input variations (local main effects, henceforth). We evidence a link between local main effects of screening methods and global sensitivity measures. The link allows analysts to obtain insights on interactions on the local and global sc...

In this contribution, we present an innovative data-driven model to reconstruct a reliable temporal pattern for time-lagged statistical monetary figures. Our research cuts across several domains regarding the production of robust economic inferences and the bridging of top-down aggregated information from central databases with disaggregated inform...

Understanding the response of a catchment is a crucial problem in hydrology, with a variety of practical and theoretical implications. Dissecting the role of sub-basins is helpful both for advancing current knowledge on physical processes and for improving the implementation of simulation or forecast models. In this context, recent advancements in...

Abstract submission is now open for the 10th International Conference on Sensitivity Analysis of Model Output (SAMO). The conference will be held on March 14–16, 2022, at the Florida State Conference Center, Florida State University, Tallahassee, Florida.
See: https://samo2022.math.fsu.edu/

Agent-based models (ABMs) are increasingly used in the management sciences. Though useful, ABMs are often critiqued: it is hard to discern why they produce the results they do and whether other assumptions would yield similar results. To help researchers address such critiques, we propose a systematic approach to conducting sensitivity analyses of...

Operations researchers worldwide rely extensively on quantitative simulations to model alternative aspects of the COVID-19 pandemic. Proper uncertainty quantification and sensitivity analysis are fundamental to enrich the modeling process and communicate correctly informed insights to decision-makers. We develop a methodology to obtain insights on...

Identifying interactions and understanding the underlying generating mechanism is essential for interpreting the response of black-box models. We offer a systematic analysis of interaction types and corresponding sources, merging results of the broad statistical literature with findings developed within the computer experiment literature. Piecewise...

The attention paid to the role of money as a store of privacy is increasing. In a monetary transaction, full privacy protection coincides with anonymity. In such situations, an empirical question arises: Is anonymity relevant in shaping the demand for money? We attempt to answer this question through laboratory experiments. The results show that an...

We propose two types of importance measures for component selection in preventive maintenance. The first type evaluates the relative importance of a component in terms of how much the component restoration increases the remaining useful lifetime of the system. The second type of importance measures evaluates the contribution of a maintenance action...

A sustainable management of global freshwater resources requires reliable estimates of the water demanded by irrigated agriculture. This has been attempted by the Food and Agriculture Organization (FAO) through country surveys and censuses, or through Global Models, which compute irrigation water withdrawals with sub-models on crop types and calend...

This work investigates aspects of the global sensitivity analysis of computer codes when alternative plausible distributions for the model inputs are available to the analyst. Analysts may decide to explore results under each distribution or to aggregate the distributions, assigning, for instance, a mixture. In the first case, we lose uniqueness of...

Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet to be fully realized, both for advancing mechanistic and data-driven modeling of human and natural systems, and in support of decision making. In this perspective paper, a multidisciplinary group of...

Demographic and financial factors are key risk-drivers for insurance companies and pension funds. This paper proposes a systematic investigation to deepen our understanding of how these risk drivers affect the annuity cost. We employ local and global sensitivity methods. For local sensitivity, we derive closed form expressions for the differential...

Quantitative models support investigators in several risk analysis applications. The calculation of sensitivity measures is an integral part of this analysis. However, it becomes a computationally challenging task, especially when the number of model inputs is large and the model output is spread over orders of magnitude. We introduce and test a ne...

In this contribution, we study the time lag between the local expenditures of European funds by beneficiaries in Italian regions and the corresponding payments reported in the European Commission (EC) database. We propose a model to reconstruct the timing of these local expenditures by back-dating the observed EC reimbursements. Our estimates are v...

Decision makers increasingly rely on forecasts or predictions generated by quantitative models. Best practices recommend that a forecast report be accompanied by a sensitivity analysis. A wide variety of probabilistic sensitivity measures have been suggested; however, model inputs may be ranked differently by different sensitivity measures. Is ther...

Italy has been one of the first countries to be strongly impacted by the COVID-19 pandemic. The adoption of social distancing and heavy lockdown measures is posing a heavy burden on the population and the economy. The timing of the measures has crucial policy-making implications. Using publicly available data for the pandemic progression in Ital...

Computer experiments are becoming increasingly important in scientific investigations. In the presence of uncertainty, analysts employ probabilistic sensitivity methods to identify the key-drivers of change in the quantities of interest. Simulation complexity, large dimensionality and long running times may force analysts to make statistical infere...

Shapley effects are attracting increasing attention as sensitivity measures. When the value function is the conditional variance, they account for the individual and higher order effects of a model input. They are also well defined under model input dependence. However, one of the issues associated with their use is computational cost. We present a...
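
As a sketch of how Shapley effects aggregate marginal contributions over input orderings, the snippet below uses a hypothetical additive model with independent inputs (not an example from the abstract), for which the conditional-variance value function \(v(S) = \mathrm{Var}(E[Y \mid X_S])\) has the closed form \(\sum_{i \in S} a_i^2 \mathrm{Var}(X_i)\):

```python
import math
from itertools import permutations
import numpy as np

# Hypothetical additive model Y = a1*X1 + a2*X2 + a3*X3 with independent inputs,
# so the value function v(S) = Var(E[Y | X_S]) = sum_{i in S} a_i^2 Var(X_i).
a = np.array([1.0, 2.0, 0.5])
var = np.array([1.0, 1.0, 4.0])

def v(subset):
    return sum(a[i] ** 2 * var[i] for i in subset)

n = len(a)
shapley = np.zeros(n)
for perm in permutations(range(n)):
    for k, i in enumerate(perm):
        # marginal contribution of input i when it joins the coalition perm[:k]
        shapley[i] += v(perm[: k + 1]) - v(perm[:k])
shapley /= math.factorial(n)

print(shapley)  # → [1. 4. 1.]; the effects sum to Var(Y) = 6
```

Because the value function is additive here, each Shapley effect reduces to the input's own variance contribution; the permutation loop is where the computational cost alluded to above arises for general models.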

How can one make a large and complex model fast and “small”? The simulation literature has extensively addressed this problem, and the kriging method has proven to be one of the most successful methods to deal with complex simulators. In “Facing High-Dimensional Simulators: Faster Kriging?,” Xuefei Lu, Alessandro Rudi, Emanuele Borgonovo, and Loren...

De Graaf et al. (2019) suggest that groundwater pumping will bring 42–79% of worldwide watersheds close to environmental exhaustion by 2050. We are skeptical of these figures due to several non-unique assumptions behind the calculation of irrigation water demands and the perfunctory exploration of the model's uncertainty space. Their sensitivity...

Copula theory is concerned with defining dependence structures given appropriate marginal distributions. Probabilistic sensitivity analysis is concerned with quantifying the strength of the dependence among the output of a simulator and the uncertain simulator inputs. In this work, we investigate the connection between these two families of methods...

In probabilistic risk assessment, attention is often focused on the expected value of a risk metric. The sensitivity of this expectation to changes in the parameters of the distribution characterizing uncertainty in the inputs becomes of interest. Approaches based on differentiation encounter limitations when (i) distributional parameters are expre...

Risk analysts are often concerned with identifying key safety drivers, that is, the systems, structures, and components (SSCs) that matter the most to safety. The importance of SSCs is assessed both in the design phase (i.e., before a system is built) and in the implementation phase (i.e., when the system has been built) using the same importance measures...

The functional ANOVA expansion of a multivariate mapping plays a fundamental role in statistics. The expansion is unique once a unique distribution is assigned to the covariates. Recent investigations in the environmental and climate sciences show that analysts may not be in a position to assign a unique distribution in realistic applications. We o...

This paper takes a fresh look at sensitivity analysis in linear programming. We propose a merged approach that brings together the insights of Wendell's tolerance and Wagner's global sensitivity approaches. The modeler/analyst is then capable of answering questions concerning stability, trend, model structure, and data prioritization simultaneously...

In this work we investigate methods for gaining greater insight from hydrological model runs conducted for uncertainty quantification and model differentiation. We frame the sensitivity analysis questions in terms of the main purposes of sensitivity analysis: parameter prioritization, trend identification and interaction quantification. For paramet...

The risk-triplet approach pioneered by Kaplan and Garrick is the keystone of operational risk analysis. We perform a sharp embedding of the elements of this framework into the one of formal decision theory, which is mainly concerned with the methodological and modelling issues of decision making. The aim of this exercise is twofold: on the one hand...

This chapter discusses the class of moment-independent importance measures. This class comprises density-based, cumulative distribution function-based, and value of information-based sensitivity measures. The chapter illustrates the definition and properties of these importance measures as they have been proposed in the literature, reviewing a comm...

All of the sensitivity measures that have been defined thus far are measures of value sensitivity. In other words, they measure the change in value (or distribution) of the model output \(Y\) as we obtain information about \(X_{i}\).

In this section, we present an analysis of the interaction properties of multilinear functions. Our aim is to show that, for a multilinear function, the integral (functional ANOVA) and Taylor expansions coincide.
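
A quick numerical illustration of this coincidence (a sketch with assumed inputs, not the section's own example): for the multilinear function \(f(x_1, x_2) = x_1 x_2\) with independent uniform inputs, the first-order ANOVA effect \(E[f(t, X_2)] - E[f]\) equals the first-order Taylor term around the mean, \(\mu_2 (t - \mu_1)\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# independent inputs; f(x1, x2) = x1 * x2 is multilinear
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(2.0, 4.0, n)
mu1, mu2 = x1.mean(), x2.mean()

t = 0.7  # arbitrary point at which to compare the two expansions
# first-order ANOVA effect of x1 at t: E[f(t, X2)] - E[f(X1, X2)]
anova_1 = (t * x2).mean() - (x1 * x2).mean()
# first-order Taylor term around the mean: (df/dx1 at the mean) * (t - mu1)
taylor_1 = mu2 * (t - mu1)

print(abs(anova_1 - taylor_1))  # close to zero, up to Monte Carlo error
```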

Transformation invariance is a particularly interesting property when analyzing the sensitivity of model output.

Given our discussion of the various methods above, a natural question is: What is the (best) method to use? We put “best” in parentheses because we do not believe that there is an absolutely best sensitivity method. In fact, even the answer to the simpler question of which method should be used is multifaceted.

This chapter proposes a unified view of three additional deterministic methods: scenario analysis, functional ANOVA decomposition and finite change sensitivity indices. We start with scenario analysis.

Sensitivity measures that consider the output’s entire distribution function are called moment-independent measures. In recent years, the amount of attention paid to moment-independent sensitivity measures has grown.
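
As one illustration (a sketch with a toy model, not the chapter's own example), the density-based delta measure \(\delta_i = \tfrac{1}{2} E_{X_i}\!\left[\int |f_Y(y) - f_{Y|X_i}(y)|\, dy\right]\) can be estimated by slicing \(X_i\) into equal-probability bins and comparing histogram densities:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# toy model: x1 dominates the output, x2 barely matters
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = x1 + 0.1 * x2

edges = np.linspace(y.min(), y.max(), 60)
width = edges[1] - edges[0]
p_y, _ = np.histogram(y, bins=edges, density=True)

def delta(x, n_slices=20):
    """Histogram estimate of the delta measure: average L1 distance between
    the unconditional density of Y and the density of Y given X in a slice."""
    quantiles = np.quantile(x, np.linspace(0.0, 1.0, n_slices + 1))
    total = 0.0
    for lo, hi in zip(quantiles[:-1], quantiles[1:]):
        mask = (x >= lo) & (x <= hi)
        p_cond, _ = np.histogram(y[mask], bins=edges, density=True)
        total += 0.5 * np.sum(np.abs(p_cond - p_y)) * width
    return total / n_slices

print(delta(x1), delta(x2))  # the dominant input has a much larger delta
```

Note that the histogram estimate of the weak input's delta is biased upward by sampling noise; more refined density estimators are used in practice.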

Tornado diagrams provide indications about the sensitivity of the model output to one-at-a-time model-input variations at their extreme ranges.
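
A minimal sketch of assembling the data behind a tornado diagram, using a hypothetical profit model and illustrative input ranges (both invented for the example):

```python
# Vary each input one at a time between its low and high value, holding the
# others at their base case, then rank inputs by the swing in the output.
def profit(price, volume, unit_cost):
    return (price - unit_cost) * volume

base = {"price": 10.0, "volume": 1000.0, "unit_cost": 6.0}
ranges = {"price": (8.0, 12.0), "volume": (800.0, 1200.0), "unit_cost": (5.0, 7.0)}

swings = {}
for name, (lo, hi) in ranges.items():
    low = profit(**{**base, name: lo})
    high = profit(**{**base, name: hi})
    swings[name] = (low, high)

# sort by absolute swing: the widest bar sits at the top of the tornado
for name, (low, high) in sorted(swings.items(),
                                key=lambda kv: -abs(kv[1][1] - kv[1][0])):
    print(f"{name:10s} {low:8.0f} -> {high:8.0f}")
```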

Before we come to the application section, we address some additional properties of the functional ANOVA expansion.

We start with the application of local sensitivity analysis to capital budgeting.

The complete dissection of a finite change requires \(2^{n}-1\) model evaluations, which is the number of finite change sensitivity indices of all orders.
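For a hypothetical two-input model, the full dissection therefore takes \(2^{2}-1 = 3\) evaluations beyond the base case: two first-order indices and one interaction term, which together recover the total finite change exactly. A minimal sketch:

```python
# hypothetical two-input model and a finite change of both inputs
def f(a, b):
    return a * b + a

x_base = (1.0, 2.0)  # base point
x_new = (3.0, 5.0)   # changed point

def eval_subset(subset):
    """Evaluate f with only the inputs in `subset` moved to their new values."""
    point = [x_new[i] if i in subset else x_base[i] for i in range(2)]
    return f(*point)

y0 = f(*x_base)
# first-order finite change indices: move one input at a time (2 runs)
phi = {(0,): eval_subset({0}) - y0, (1,): eval_subset({1}) - y0}
# second-order index: joint change minus base minus first-order terms (1 run)
phi[(0, 1)] = eval_subset({0, 1}) - y0 - phi[(0,)] - phi[(1,)]

total = f(*x_new) - y0
print(phi, total)  # the indices sum exactly to the total finite change
```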

As a way of introducing variance-based methods, we could ask the following question: Is it possible to introduce sensitivity indicators that explain the variance of the model output rather than only the fraction of the variance associated with the linear surrogate model? The answer is found in variance-based sensitivity measures, which we describe...
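
As a sketch (a toy linear model, not the chapter's example), first-order variance-based indices can be estimated with the classical pick-freeze scheme; for \(Y = X_1 + 2X_2\) with independent uniform inputs, the exact values are \(S_1 = 0.2\) and \(S_2 = 0.8\):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
# toy linear model with independent uniform inputs: Y = X1 + 2*X2
f = lambda x: x[:, 0] + 2.0 * x[:, 1]

A = rng.uniform(size=(n, 2))
B = rng.uniform(size=(n, 2))
yA, yB = f(A), f(B)

S = []
for i in range(2):
    C = B.copy()
    C[:, i] = A[:, i]  # "freeze": C shares column i with A, the rest with B
    yC = f(C)
    # pick-freeze estimator of the first-order index S_i
    S.append((np.mean(yA * yC) - np.mean(yA) * np.mean(yB)) / np.var(yA))

print(S)  # approximately [0.2, 0.8]
```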

A sensitivity analysis method is deterministic if it does not require the analyst to specify a distribution for the model inputs.

The solution of several problems in operations research and the managerial sciences leads to classic optimization models.

This chapter, which is our last on deterministic methods, addresses the removal of a typical assumption in sensitivity analysis.

Baucells and Borgonovo (2013) introduce and analyze global sensitivity measures based on cumulative distribution functions (CDFs).

This section is devoted to the most important step in sensitivity analysis, the formulation of the sensitivity question. Scientists have developed myriads of models in different disciplines, and there are myriads of sensitivity analysis methods waiting to be used to explore the content of those models.

The importance of properly displaying the analyst/decision maker’s degree of belief about the problem at hand is recognized by several agencies and international institutions.

The purpose of this section is to introduce a series of analytical test cases in which the sensitivity measures illustrated thus far can be obtained analytically.

Methods based on differentiation represent an important class of probabilistic sensitivity methods.

After completing an uncertainty analysis, the next step is a global sensitivity analysis.

We are living in a new era on the verge of a data-driven economy.

This section investigates the concept of expected value of perfect information. The definition we present here is the definition used in classical decision-analysis courses for decision making under risk.
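
A minimal worked example with hypothetical payoffs: the expected value of perfect information (EVPI) is the difference between the expected payoff with perfect foresight of the state and the expected payoff of the best action chosen under the prior.

```python
import numpy as np

# two actions, two equally likely states of the world (hypothetical payoffs)
payoff = np.array([[100.0, -20.0],   # action 0: good in state 0, bad in state 1
                   [ 30.0,  30.0]])  # action 1: a safe constant payoff
p = np.array([0.5, 0.5])

# decide now: pick the action with the best prior expected payoff
value_without_info = (payoff @ p).max()           # max(40, 30) = 40
# with perfect information: learn the state first, then pick the best action
value_with_info = (payoff.max(axis=0) * p).sum()  # 0.5*100 + 0.5*30 = 65
evpi = value_with_info - value_without_info
print(evpi)  # → 25.0
```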

The European Journal of Operational Research (EJOR) published its first issue in 1977. This paper presents a general overview of the journal over its lifetime by using bibliometric indicators. We discuss its performance compared to other journals in the field and identify key contributing countries/institutions/authors as well as trends in resear...

Risk-informed decision making is often accompanied by the specification of an acceptable level of risk. Such target level is compared against the value of a risk metric, usually computed through a probabilistic safety assessment model, to decide about the acceptability of a given design, the launch of a space mission, etc. Importance measures compl...

Scenarios showing future greenhouse gas emissions are needed to estimate climate impacts and the mitigation efforts required for climate stabilization. Recently, the Shared Socioeconomic Pathways (SSPs) have been introduced to describe alternative social, economic and technical narratives, spanning a wide range of plausible futures in terms of chal...

This book is an expository introduction to the methodology of sensitivity analysis of model output. It is primarily intended for investigators, students and researchers that are familiar with mathematical models but are less familiar with the techniques for performing their sensitivity analysis. A variety of sensitivity methods have been developed...

The Birnbaum importance measure plays a central role in reliability analysis. It has initially been introduced for coherent systems, where several of its properties hold and where its computation is straightforward. This work introduces a Boolean expression for the notion of criticality that allows the seamless extension of the Birnbaum importance...

Modern digital systems pose new challenges to reliability analysts. Systems may exhibit a non-coherent behavior and time becomes an important element of the analysis due to aging effects. Measuring the importance of system components in a computationally efficient way becomes essential in system design. Herein, we propose a new importance measure f...

Measures of sensitivity and uncertainty have become an integral part of risk analysis. Many such measures have a conditional probabilistic structure, for which a straightforward Monte Carlo estimation procedure has a double-loop form. Recently, a more efficient single-loop procedure has been introduced, and consistency of this procedure has been de...

Sensitivity Analysis (SA) is performed to gain fundamental insights on a system behavior that is usually reproduced by a model and to identify the most relevant input variables whose variations affect the system model functional response. For the reliability analysis of passive safety systems of Nuclear Power Plants (NPPs), models are Best Estimate...

The life cycle assessment (LCA) framework has established itself as the leading tool for the assessment of the environmental impact of products. Several works have established the need of integrating the LCA and risk analysis methodologies, due to the several common aspects. One of the ways to reach such integration is through guaranteeing that unc...

The solution of several operations research problems requires the creation of a quantitative model. Sensitivity analysis is a crucial step in the model building and result communication process. Through sensitivity analysis we gain essential insights on model behavior, on its structure and on its response to changes in the model inputs. Several int...

This work addresses the early phases of the elicitation of multiattribute value functions proposing a practical method for assessing interactions and monotonicity. We exploit the link between multiattribute value functions and the theory of high dimensional model representations. The resulting elicitation method does not state any a-priori assumpti...

In selecting the preferred course of action, decision makers are often uncertain about one or more probabilities of interest. The experimental literature has ascertained that this uncertainty (ambiguity) might affect decision makers’ preferences. Then, the decision maker might wish to incorporate ambiguity aversion in the analysis. We investigate t...

In the present paper, we take the output of multiple expert elicitation surveys on the future cost of key low-carbon technologies and use it as input to three Integrated Assessment models: GCAM, MARKAL_US and WITCH. By means of a large set of simulations, we aim to assess the implications of these subjective distributions of technological costs over k...

Decision makers benefit from the utilization of decision-support models in several applications. Obtaining managerial insights is essential to better inform the decision-process. This work offers an in-depth investigation into the structural properties of decision-support models. We show that the input–output mapping in influence diagrams, decision...

Finite element models used in industrial studies for seismic structural reliability analysis are in general very complex and computationally intensive. This is due to the important number of degrees of freedom as well as due to advanced damage models and failure modes that have to be simulated. In consequence, reliability studies are feasible only...

We reconcile Kaplan and Garrick's seminal definition of risk with classical subjective expected utility, filling in the relevant gaps and providing a framework that is ready-to-use in applications. We show that Kaplan and Garrick's "frequency" format can be set in one-to-one correspondence with [26]'s utility theory. Kaplan and Garrick's "probabili...

Monotonic transformations are widely employed in statistics and data analysis. In computer experiments they are often used to gain accuracy in the estimation of global sensitivity statistics. However, one faces the question of interpreting results that are obtained on the transformed data back on the original data. The situation is even more comple...