Conditional Monte Carlo Estimation of Quantile Sensitivities.

Management Science, 01/2009; 55:2019-2027. DOI: 10.1287/mnsc.1090.1090

ABSTRACT: Estimating quantile sensitivities is important in many optimization applications, from hedging in financial engineering to service-level constraints in inventory control to more general chance constraints in stochastic programming. Recently, Hong (Hong, L. J. 2009. Estimating quantile sensitivities. Oper. Res. 57 118-130) derived a batched infinitesimal perturbation analysis estimator for quantile sensitivities, and Liu and Hong (Liu, G., L. J. Hong. 2009. Kernel estimation of quantile sensitivities. Naval Res. Logist. 56 511-525) derived a kernel estimator. Both of these estimators are consistent, with convergence rates bounded by n^{-1/3} and n^{-2/5}, respectively. In this paper, we use conditional Monte Carlo to derive a consistent quantile sensitivity estimator that improves upon these convergence rates and requires no batching or binning. We illustrate the new estimator using a simple but realistic portfolio credit risk example, for which the previous work is inapplicable.
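As a rough illustration of the quantity these estimators target, the quantile sensitivity identity dq_alpha(theta)/dtheta = E[dY/dtheta | Y = q_alpha(theta)] can be checked on a scale family where everything is known in closed form. The sketch below uses a kernel-style band average as a stand-in for the conditional expectation (mirroring the kernel-estimator idea rather than the paper's conditional Monte Carlo construction); the model Y = theta * exp(Z) and the bandwidth rule are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
alpha, theta, n = 0.95, 2.0, 200_000

# Output Y(theta) = theta * exp(Z): the pathwise derivative is dY/dtheta = Y/theta,
# and the alpha-quantile is q(theta) = theta * exp(z_alpha), so dq/dtheta = exp(z_alpha).
z = rng.standard_normal(n)
y = theta * np.exp(z)
q_hat = np.quantile(y, alpha)

# Stand-in for E[dY/dtheta | Y = q]: average the pathwise derivative over
# samples in a narrow band around the estimated quantile (kernel-style).
h = np.std(y) * n ** (-1 / 5)        # rule-of-thumb bandwidth (an assumption)
band = np.abs(y - q_hat) < h
sens_hat = np.mean(y[band] / theta)

exact = np.exp(NormalDist().inv_cdf(alpha))  # analytic dq/dtheta for this family
```

The band average converges to the conditional expectation as the bandwidth shrinks, which is exactly the bias/variance trade-off behind the n^{-2/5} kernel rate the abstract mentions.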

  • ABSTRACT: We develop methods to construct asymptotically valid confidence intervals for quantiles and value-at-risk when applying importance sampling (IS). We first employ IS to estimate the cumulative distribution function (CDF), which we then invert to obtain a point estimate of the quantile. To construct confidence intervals, we show that the IS quantile estimator satisfies a Bahadur-Ghosh representation, which implies a central limit theorem (CLT) for the quantile estimator and can be used to obtain consistent estimators of the variance constant in the CLT.
    Proceedings of the 2010 Winter Simulation Conference (WSC); 01/2011
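The estimate-the-CDF-then-invert step described in that abstract can be sketched for a toy case. Everything here is an illustrative assumption, not the paper's setup: the target is a standard normal, the proposal is a mean-shifted normal N(mu, 1), and we estimate the 0.99 quantile:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
n, alpha, mu = 100_000, 0.99, 2.0   # mu shifts sampling toward the upper tail

x = mu + rng.standard_normal(n)                 # draw from the N(mu, 1) proposal
w = np.exp(-mu * x + 0.5 * mu ** 2)             # likelihood ratio phi(x) / phi(x - mu)

# IS estimate of the tail: P(X > t) ~= (1/n) * sum_i w_i * 1{x_i > t}.
# Sorting the draws and accumulating weights from the right inverts the CDF.
order = np.argsort(x)
xs, ws = x[order], w[order]
tail = np.cumsum(ws[::-1])[::-1] / n            # tail[i] ~= P(X >= xs[i])
i = np.searchsorted(-tail, -(1 - alpha))        # first index with tail <= 1 - alpha
q_hat = xs[i]

exact = NormalDist().inv_cdf(alpha)             # analytic 0.99 quantile of N(0, 1)
```

Because the shifted proposal puts many samples in the tail, the weighted CDF estimate (and hence the inverted quantile) has far lower variance there than plain Monte Carlo with the same n.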
  • Source
    ABSTRACT: The researchers made significant progress in all of the proposed research areas. The first major task in the proposal involved simulation-based and sampling methods for global optimization. In support of this task, we discovered two new approaches to simulation-based global optimization: the first draws connections between stochastic approximation and our model reference approach to global optimization, while the second connects particle filtering and simulation-based approaches to global optimization. We also made progress on population-based global optimal search methods, applications of these algorithms to problems in statistics and clinical trials, and efficient allocation of simulations. In support of the second task, we made progress incorporating simulation-based and sampling methods into Markov decision processes (MDPs), including new sampling methods for MDPs, simulation-based approaches to partially observable Markov decision processes (POMDPs), and applications of these algorithms.
  • Source
    ABSTRACT: In practice, managers often wish to ascertain that a particular engineering design of a production system meets their requirements. The future environment of this design is likely to differ from the environment assumed during the design. Therefore it is crucial to find out which variations in that environment may make this design unacceptable (unfeasible). This article proposes a methodology for estimating which uncertain environmental parameters are important (so managers can become pro-active) and which combinations of parameter values (scenarios) make the design unacceptable. The proposed methodology combines simulation, bootstrapping, design of experiments, and linear regression metamodeling. This methodology is illustrated through a simulated manufacturing system, including fourteen uncertain parameters of the input distributions for the various arrival and service times. These parameters are investigated through the simulation of sixteen scenarios, selected through a two-level fractional-factorial statistical design. The resulting simulation Input/Output (I/O) data are analyzed through a first-order polynomial metamodel and bootstrapping. A second experiment with other scenarios gives some outputs that turn out to be unacceptable. In general, polynomials fitted to the simulation's I/O data can estimate the border line (frontier) between acceptable and unacceptable environments.
    European Journal of Operational Research 01/2011; 209(2):176-183.
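The design/metamodel/bootstrap loop that abstract describes can be sketched on a toy scale. The three-parameter linear response below stands in for the manufacturing simulation, and the design is a full 2^3 factorial rather than the article's two-level fractional factorial in fourteen parameters; all names and numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# 2^3 factorial design in coded units (+/-1) for three hypothetical
# environmental parameters; a larger study would use a fractional design.
design = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

# Stand-in for the simulation model: noisy linear response with known
# coefficients, 20 replicates per scenario.
true_beta = np.array([3.0, -1.5, 0.5])
reps = np.array([10.0 + row @ true_beta + rng.standard_normal(20) for row in design])

y = reps.mean(axis=1)                                  # scenario means (the I/O data)
X = np.column_stack([np.ones(len(design)), design])    # first-order polynomial terms
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]        # fitted metamodel coefficients

# Distribution-free bootstrap: resample replicates within each scenario,
# refit the metamodel, and collect percentile intervals per coefficient.
B = 500
boot = np.empty((B, X.shape[1]))
for b in range(B):
    yb = np.array([rng.choice(r, size=r.size).mean() for r in reps])
    boot[b] = np.linalg.lstsq(X, yb, rcond=None)[0]
ci = np.percentile(boot, [2.5, 97.5], axis=0)          # 95% bootstrap CIs
```

Coefficients whose bootstrap intervals exclude zero flag the environmental parameters that matter, which is the "which parameters are important" question the methodology targets.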

