Content uploaded by John Mulvey

Author content

All content in this area was uploaded by John Mulvey on May 17, 2016

Content may be subject to copyright.

Available via license: CC BY 3.0


We develop a methodology for evaluating a decision strategy
generated by a stochastic optimization model. The methodology is
based on a pilot study in which we estimate the distribution of
performance associated with the strategy, and define an appropriate
stratified sampling plan. An algorithm we call filtered search
allows us to implement this plan efficiently. We demonstrate the
approach's advantages with a problem in asset/liability management
for an insurance company.
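The evaluation scheme in the abstract can be sketched in a few lines. This is a minimal, hypothetical illustration: the strata, probabilities, and performance function below are invented placeholders, not those of the paper.

```python
import random
import statistics

def performance(scenario):
    # stand-in for simulating the fixed decision strategy on one scenario
    return scenario ** 0.5

def stratified_estimate(strata, n_per_stratum, rng):
    # strata: list of (low, high, prob) scenario ranges, with probabilities
    # summing to 1; sample each stratum separately and weight the stratum means
    total = 0.0
    for low, high, prob in strata:
        draws = [performance(rng.uniform(low, high)) for _ in range(n_per_stratum)]
        total += prob * statistics.mean(draws)
    return total

rng = random.Random(42)
strata = [(0.0, 1.0, 0.5), (1.0, 4.0, 0.3), (4.0, 9.0, 0.2)]
print(stratified_estimate(strata, n_per_stratum=200, rng=rng))
```

Allocating the per-stratum sample sizes to the high-variance strata, rather than equally as here, is where a pilot study of the performance distribution pays off.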


... Given this tree structure, we can immediately observe that the size of the deterministic equivalent grows exponentially with the number of stages. Some authors have dealt with this curse of dimensionality by applying large-scale optimization techniques (Pereira and Pinto 1991, Rockafellar and Wets 1991, Shapiro et al. 2013), while others approximate the original multistage problem by reducing the number of decision variables through the adoption of a single policy rule (Rush et al. 2000). ...
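The exponential growth is easy to quantify; a small sketch, with branching factors chosen arbitrarily for illustration:

```python
def tree_nodes(branching, stages):
    # nodes in a scenario tree with constant branching factor b over T stages:
    # 1 + b + b^2 + ... + b^T
    return sum(branching ** t for t in range(stages + 1))

print(tree_nodes(5, 3))   # a short horizon stays manageable
print(tree_nodes(5, 10))  # a long horizon explodes into millions of nodes
```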

... A policy rule is a function of the uncertainty realization that generates a unique sequence of feasible decisions at each time of the planning horizon. This framework fits into the independent scenario structure stated in Rush et al. (2000); however, it usually leads to a suboptimal solution when compared to the original multistage one. Indeed, one could define a set of policy rules, which generally leads to a non-convex optimization problem. ...
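As a concrete, hypothetical example of a policy rule, a fixed-mix investment rule maps any scenario's return path to a unique feasible decision sequence. The rule and numbers below are illustrative, not taken from Rush et al. (2000).

```python
def fixed_mix_policy(returns_path, weights, wealth0=1.0):
    # returns_path: one scenario's per-period asset returns, e.g.
    # [[0.05, 0.01], [0.02, 0.03]]; weights: fixed target mix summing to 1
    wealth = wealth0
    decisions = []
    for returns in returns_path:
        holdings = [wealth * w for w in weights]  # rebalance to the fixed mix
        decisions.append(holdings)
        wealth = sum(h * (1 + r) for h, r in zip(holdings, returns))
    return decisions, wealth
```

Because the rule itself is the decision object, each scenario can be evaluated independently of the tree; optimizing over the rule's parameters (here, `weights`) is what generally makes the problem non-convex.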

Large corporations fund their capital and operational expenses by issuing bonds with a variety of indexations, denominations, maturities and amortization schedules. We propose a multistage linear stochastic programming model that optimizes bond issuance by minimizing the mean funding cost while keeping leverage under control and insolvency risk at an acceptable level. The funding requirements are determined by a fixed investment schedule with uncertain cash flows. Candidate bonds are described in a detailed and realistic manner. A specific scenario tree structure guarantees computational tractability even for long horizon problems. Based on a simplified example, we present a sensitivity analysis of the first stage solution and the stochastic efficient frontier of the mean-risk trade-off. A realistic exercise stresses the importance of controlling leverage. Based on the proposed model, a financial planning tool has been implemented and deployed for Brazilian oil company Petrobras.

... Each scenario depicts a single plausible path for all of the uncertain parameters over the planning period. Employing variance reduction methods in concert with the stochastic optimization model can reduce the number of scenarios (see Campbell et al. 1997, and Mulvey and Rush 1997). ...

Total enterprise risk management involves a systematic approach for evaluating/controlling risks within a large firm such as a property-casualty insurance company. The basic idea is to coordinate planning throughout the organization, from traders and underwriters to the CFO, in order to maximize the company's economic surplus at the desired level of enterprise risk. At present, it is difficult to link strategic systems, such as asset allocation, to tactical systems for pricing securities and selecting new products. We propose two solutions. First, we develop a "price of risk" for significant decisions possessing correlated factors. Second, we create a set of dynamic investment categories, called hybrid assets, for use in an asset and liability management framework. We illustrate the concepts via an insurance planning problem, whereby the goal is to optimize the company's surplus.

Enhancing the resilience of power distribution systems to extreme weather events is of critical concern. Upgrading the distribution system infrastructure by system hardening and investing in smart grid technologies effectively enhances grid resilience. Existing distribution system planning methods primarily consider the persistent cost of expected events (such as faults and outages likely to occur) and aim at improving system reliability. Resilience to extreme weather events requires reducing the impacts of high impact low probability (HILP) events, which are characterized by the tail of the event impact distribution. Thus, resilience-oriented system upgrade solutions need to be driven by the risks that extreme weather events impose on the power grid infrastructure rather than by persistent costs. This paper aims to develop a risk-based approach for the long-term resilience planning of active power distribution systems against extreme weather events. The proposed approach explicitly models (1) the impacts of HILP events using a two-stage risk-averse stochastic optimization framework, thus explicitly incorporating the risks of HILP events in long-term planning decisions, and (2) advanced distribution grid operations in the aftermath of the event, such as intentional islanding, within the infrastructure planning problem. The inclusion of risk in the planning objective provides additional flexibility to grid planners to analyze the trade-off between risk-neutral and risk-averse planning strategies.
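A minimal sketch of the risk-neutral versus risk-averse trade-off described above, using an empirical CVaR of scenario costs; the blending weight and tail level are illustrative assumptions, not the paper's formulation.

```python
def cvar(costs, alpha):
    # empirical CVaR: average of the worst (1 - alpha) fraction of costs
    worst = sorted(costs, reverse=True)
    k = max(1, round((1 - alpha) * len(costs)))
    return sum(worst[:k]) / k

def planning_objective(costs, lam, alpha=0.95):
    # lam = 0 recovers the risk-neutral objective; lam = 1 is fully risk-averse
    mean_cost = sum(costs) / len(costs)
    return (1 - lam) * mean_cost + lam * cvar(costs, alpha)
```

Sweeping `lam` from 0 to 1 traces out the trade-off curve the grid planner would inspect.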

Although various sampling methods exist, stratified random sampling is the most frequently used in practice, especially when the population structure is heterogeneous. One of the most important considerations in stratified random sampling is how many sample units should be selected from each stratum. Determining the optimum sample size to select from each stratum allows the sample to represent the population properly and increases the precision of the resulting estimates. This study uses the Kuhn-Tucker method, which is accepted as a basic method for determining the sample sizes to select from strata in stratified random sampling, and the goal programming method, which can take the researcher's multiple objectives into account. We attempt to minimize the variance of the sample mean statistic under a non-linear cost constraint using these methods, and discuss their relative advantages in light of the results of a simulation study.
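For the classical linear-cost special case, the Kuhn-Tucker conditions give a closed-form allocation, n_h proportional to N_h * S_h / sqrt(c_h); a sketch with made-up stratum data (the non-linear cost constraint studied above requires a numerical solver instead):

```python
import math

def optimum_allocation(sizes, stds, unit_costs, budget):
    # n_h proportional to N_h * S_h / sqrt(c_h), scaled so that total cost
    # sum(c_h * n_h) exactly exhausts the budget
    weights = [n * s / math.sqrt(c) for n, s, c in zip(sizes, stds, unit_costs)]
    spend_per_unit_weight = sum(c * w for c, w in zip(unit_costs, weights))
    scale = budget / spend_per_unit_weight
    return [scale * w for w in weights]

# two equal-size strata, the second four times as variable: it gets 4x the sample
print(optimum_allocation([100, 100], [1.0, 4.0], [1.0, 1.0], budget=100))
```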

Global insurance/reinsurance companies can gain significant advantages by implementing an enterprise risk management system. The major goals are to increase long-term profitability, reduce enterprise risks, and identify the firm's optimal capital structure. Profitability depends upon evaluating uncertainties as a function of a set of common factors across the enterprise. The system addresses all major decisions: identifying optimal liability-driven asset strategies, expanding/contracting insurance lines and pricing policies, constructing sound reinsurance treaties, and setting the firm's capital structure. To implement risk management in global companies, this chapter presents a decentralized system. It describes a series of case studies in which the dynamic financial analysis (DFA) system can improve decision making. Prominent examples include asset allocation strategies, selection of business activities to increase or decrease, the structure of reinsurance treaties, and the company's degree of leverage (gearing). Since many of these decisions interact with each other, the chapter suggests that the full system be run in an integrated fashion. It shows that the DFA system can be implemented in large, decentralized companies by extending existing capital allocation policies. The scenario approach lends itself well to the decentralized management common in most multinational financial organizations.

Stochastic programming models provide a powerful paradigm for decision making under uncertainty. In these models the uncertainties are captured by scenario generation, which is therefore crucial to the quality of the solutions obtained. Few literature reviews of scenario generation exist at present; this paper provides such a survey. We introduce the main concepts behind scenario generation, which are not just concerned with discretising methods. We review the main classes of scenario generation methods and analyse their advantages and disadvantages. We also review new and less commonly known scenario generation methods, such as 'hybrid' methods.


An introduction to the theory and practice of financial simulation and optimization. In recent years, there has been a notable increase in the use of simulation and optimization methods in the financial industry. Applications include portfolio allocation, risk management, pricing, and capital budgeting under uncertainty. This accessible guide provides an introduction to the simulation and optimization techniques most widely used in finance, while at the same time offering background on the financial concepts in these applications. In addition, it clarifies difficult concepts in traditional models of uncertainty in finance, and teaches you how to build models with software. It does this by reviewing current simulation and optimization methodology, along with available software, and proceeds with portfolio risk management, modeling of random processes, pricing of financial derivatives, and real options applications. Contains a unique combination of finance theory and rigorous mathematical modeling, emphasizing a hands-on approach through implementation with software. Highlights not only classical applications, but also more recent developments, such as pricing of mortgage-backed securities. Includes models and code in both spreadsheet-based software (@RISK, Solver, Evolver, VBA) and mathematical modeling software (MATLAB). Filled with in-depth insights and practical advice, Simulation and Optimization Modeling in Finance offers essential guidance on some of the most important topics in financial management.

The development and application of control variables for variance reduction in the simulation of a wide class of closed queueing networks is discussed. These networks allow multiple types of customers, priorities and blocking. Alternative methods of generating confidence intervals from independent replications of a simulation are investigated. A result is given which quantifies the loss in variance reduction caused by the estimation of the optimum control coefficients. This loss is an increasing function of the number of control variables. Good variance reduction is obtained providing that the number of control variables remains small.

We review the method of using conditional expectation to reduce variance in discrete event simulation and present a new application to the simulation of queueing networks. This method can be particularly useful in reducing variance when estimating quantities associated with rare events.
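A minimal, hypothetical illustration of the idea for a rare-event tail probability (the distributions are chosen for tractability, not taken from the paper's queueing examples): estimate P(X + Y > t) by averaging the known conditional probability P(Y > t - x), instead of an indicator that is almost always zero.

```python
import math
import random

def conditional_mc(t, n, rng):
    # X ~ Uniform(0, 1), Y ~ Exp(1): replace the indicator 1{X + Y > t}
    # with its conditional expectation E[1{X + Y > t} | X] = exp(-(t - X))
    total = 0.0
    for _ in range(n):
        x = rng.random()
        total += 1.0 if x >= t else math.exp(-(t - x))
    return total / n

print(conditional_mc(5.0, 5000, random.Random(7)))  # near exp(-5) * (e - 1)
```

The conditional estimator is bounded between exp(-t) and exp(-(t - 1)), so its variance is tiny compared with the raw indicator, whose hits are rare.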

The method of antithetic variates is a well-known technique for reducing the variability of estimators in computer simulation experiments. However, the usually suggested way of using the method, if incorrectly applied, can lead to an increase in the variance of estimators of certain quantities, such as percentiles. A procedure, based on two non-standard methods of generating samples from the normal distribution, is suggested which does not suffer this weakness. Numerical examples are given showing the ease of implementation and the effectiveness of the procedure.
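The basic pairing is simple to sketch for mean estimation, where it is safe for monotone responses; the response function below is an arbitrary placeholder. Percentile estimation is exactly the situation the abstract warns about, and needs the more careful generation schemes it proposes.

```python
import random
import statistics

def antithetic_mean(f, n_pairs, rng):
    # each uniform U is paired with 1 - U; for monotone f the two evaluations
    # are negatively correlated, shrinking the variance of the overall mean
    pair_means = []
    for _ in range(n_pairs):
        u = rng.random()
        pair_means.append(0.5 * (f(u) + f(1.0 - u)))
    return statistics.mean(pair_means)

print(antithetic_mean(lambda u: u * u, 2000, random.Random(3)))  # near E[U^2] = 1/3
```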

An important, recurring problem in statistics involves the determination of strata boundaries for use in stratified sampling. This paper describes a practical method for stratifying a population of observations based on optimal cluster analysis. The goal of stratification is constructing a partition such that observations within a stratum are homogeneous as defined by within-cluster variances for attributes that are deemed important, while observations between strata are heterogeneous. The problem is defined as a deterministic optimization model with integer variables and is solved by means of a subgradient method. Computational tests with several examples show that the within-strata variances and thus the accompanying standard errors can be substantially reduced. Since the proposed model strives to minimize standard error, it is applicable to situations where a precise sample is essential, for example, microeconomic simulation studies.
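As a toy stand-in for the paper's integer-programming formulation (which is solved by a subgradient method), a one-dimensional Lloyd-style iteration conveys the within-stratum-variance objective; the data and cluster count below are invented.

```python
import statistics

def stratify_1d(values, k, iters=20):
    values = sorted(values)
    step = len(values) // k
    # initialize stratum centers from equal-count slices of the sorted data
    centers = [statistics.mean(values[i * step:(i + 1) * step]) for i in range(k)]
    for _ in range(iters):
        strata = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: (v - centers[i]) ** 2)
            strata[nearest].append(v)
        # recenter each stratum; keep the old center if a stratum empties
        centers = [statistics.mean(s) if s else c for s, c in zip(strata, centers)]
    return strata

print(stratify_1d([1, 2, 3, 10, 11, 12], k=2))  # → [[1, 2, 3], [10, 11, 12]]
```

Each iteration weakly decreases the total within-stratum sum of squares, which is the quantity the paper's standard-error reduction targets.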

The paper discusses a stochastic model for investment variables, involving four series: the Retail Prices Index, an index of share dividend yields, an index of share yields, and the yield on ‘consols’. Section 2 describes the model and explains its derivation on the basis of historical data. Section 3 shows how the model can be used for forecasting the distributions of the variables. Section 4 discusses possible applications, and describes two in detail, relating to the expense charges of unit trusts and to guarantees incorporated in index linked life annuities.