We develop a modular and tractable framework for solving an adaptive distributionally robust linear optimization problem, where we minimize the worst-case expected cost over an ambiguity set of probability distributions. The adaptive distributionally robust optimization framework caters for dynamic decision making, where decisions adapt to the uncertain outcomes as they unfold in stages. For tractability, we focus on a class of second-order conic (SOC) representable ambiguity sets, though our results extend readily to more general conic representations. We show that the adaptive distributionally robust linear optimization problem can be formulated as a classical robust optimization problem. To obtain a tractable formulation, we approximate the adaptive distributionally robust optimization problem using linear decision rule (LDR) techniques. More interestingly, by incorporating the primary and auxiliary random variables of the lifted ambiguity set in the LDR approximation, we can significantly improve the solutions, and for a class of adaptive distributionally robust optimization problems, exact solutions can be obtained. Using the new LDR approximation, we can transform the adaptive distributionally robust optimization problem into a classical robust optimization problem with an SOC representable uncertainty set. Finally, to demonstrate the potential for solving management decision problems, we develop an algebraic modeling package and illustrate how it can be used to facilitate modeling and obtain high-quality solutions for medical appointment scheduling and inventory management problems.
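As a hedged illustration (our notation, not the paper's), an LDR restricts a recourse decision to be affine in the primary random vector, while the enhanced rule described above also responds to the auxiliary random vector introduced by the lifted ambiguity set:
\[ y(\tilde z) = y^0 + \sum_k y^k \tilde z_k \qquad \text{versus} \qquad y(\tilde z, \tilde u) = y^0 + \sum_k y^k \tilde z_k + \sum_j \bar y^{\,j} \tilde u_j. \]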
The electronic companion is available at https://doi.org/10.1287/mnsc.2017.2952.
This paper was accepted by Noah Gans, optimization.
We study stochastic programs where the decision maker cannot observe the distribution of the exogenous uncertainties but has access to a finite set of independent samples from this distribution. In this setting, the goal is to find a procedure that transforms the data to an estimate of the expected cost function under the unknown data-generating distribution, that is, a predictor, and an optimizer of the estimated cost function that serves as a near-optimal candidate decision, that is, a prescriptor. As functions of the data, predictors and prescriptors constitute statistical estimators. We propose a meta-optimization problem to find the least conservative predictors and prescriptors subject to constraints on their out-of-sample disappointment. The out-of-sample disappointment quantifies the probability that the actual expected cost of the candidate decision under the unknown true distribution exceeds its predicted cost. Leveraging tools from large deviations theory, we prove that this meta-optimization problem admits a unique solution: The best predictor-prescriptor-pair is obtained by solving a distributionally robust optimization problem over all distributions within a given relative entropy distance from the empirical distribution of the data.
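In symbols (a hedged paraphrase in our notation), with \(\hat P_N\) the empirical distribution of the N samples and \(r > 0\) a radius calibrated by the admissible out-of-sample disappointment, the optimal predictor evaluates
\[ \hat c(x) \;=\; \sup_{Q:\, D(Q \,\|\, \hat P_N) \le r} \mathbb{E}_Q\big[\ell(x,\xi)\big], \]
where \(D(\cdot\|\cdot)\) is the relative entropy (Kullback-Leibler divergence), and the prescriptor is any minimizer of \(\hat c\).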
This paper was accepted by Chung Piaw Teo, optimization.
Stochastic programming is a powerful approach for decision-making under uncertainty. Unfortunately, the solution may be misleading if the underlying distribution of the involved random parameters is not known exactly. In this paper, we study distributionally robust stochastic programming (DRSP) in which the decision hedges against the worst possible distribution that belongs to an ambiguity set, which comprises all distributions that are close to some reference distribution in terms of the Wasserstein distance. We derive a tractable reformulation of the DRSP problem by constructing the worst-case distribution explicitly via the first-order optimality condition of the dual problem. Using the precise structure of the worst-case distribution, we show that the DRSP can be approximated by robust programs to arbitrary accuracy. We then apply our results to a variety of stochastic optimization problems, including the newsvendor problem, two-stage linear programs, worst-case value-at-risk analysis, point process control, and distributionally robust transportation problems.
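For reference, the (type-1) Wasserstein distance used in this line of work is the standard optimal transport cost; in our notation,
\[ W(P, Q) \;=\; \inf_{\pi \in \Pi(P, Q)} \int \lVert \xi - \xi' \rVert \, \pi(\mathrm{d}\xi, \mathrm{d}\xi'), \]
where \(\Pi(P, Q)\) is the set of joint distributions with marginals P and Q.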
We consider stochastic programs where the distribution of the uncertain parameters is only observable through a finite training dataset. Using the Wasserstein metric, we construct a ball in the space of (multivariate and non-discrete) probability distributions centered at the uniform distribution on the training samples, and we seek decisions that perform best in view of the worst-case distribution within this Wasserstein ball. The state-of-the-art methods for solving the resulting distributionally robust optimization problems rely on global optimization techniques, which quickly become computationally excruciating. In this paper we demonstrate that, under mild assumptions, the distributionally robust optimization problems over Wasserstein balls can in fact be reformulated as finite convex programs, in many interesting cases even as tractable linear programs. Leveraging recent measure concentration results, we also show that their solutions enjoy powerful finite-sample performance guarantees. Our theoretical results are exemplified in mean-risk portfolio optimization as well as uncertainty quantification.
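Schematically (our notation), the problem studied is
\[ \inf_{x \in X} \; \sup_{Q \in \mathbb{B}_\varepsilon(\hat P_N)} \mathbb{E}_Q\big[h(x, \xi)\big], \]
where \(\mathbb{B}_\varepsilon(\hat P_N)\) is the ball of radius \(\varepsilon\), in the Wasserstein metric, around the empirical distribution \(\hat P_N\) of the N training samples.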
The paper proposes a novel probabilistic model with chance constraints for locating and sizing emergency medical service stations. In this model, the chance constraints are approximated as second-order cone constraints to overcome computational difficulties in practical applications. The proposed approximations, which correspond to different levels of estimation accuracy for the underlying uncertainty, are meaningful in practical uncertainty environments. The model is then transformed into a conic quadratic mixed-integer program by employing a conic transformation, and the resulting model can be efficiently solved by a commercial optimization package. A special case is also considered, and a class of valid inequalities is introduced to improve computational efficiency. Lastly, computational experience on real and randomly generated data is reported to illustrate the validity of the approach.
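As a hedged illustration of the generic mechanism (not necessarily the paper's exact constraint), an individual chance constraint \(\mathbb{P}(\tilde a^\top x \le b) \ge 1-\epsilon\) with \(\tilde a \sim \mathcal{N}(\mu, \Sigma)\) is equivalent to the second-order cone constraint
\[ \mu^\top x + \Phi^{-1}(1-\epsilon)\, \lVert \Sigma^{1/2} x \rVert_2 \le b, \]
where \(\Phi^{-1}\) is the standard normal quantile; distribution-free variants replace \(\Phi^{-1}(1-\epsilon)\) with more conservative coefficients such as \(\sqrt{(1-\epsilon)/\epsilon}\).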
This paper discusses the use of polyhedral approximations in solving p-order cone programming (pOCP) problems, or linear problems with p-order cone constraints, and their mixed-integer extensions. In particular, it is shown that the cutting-plane technique proposed in Krokhmal and Soberanis [Risk optimization with p-order conic constraints: A linear programming approach, Eur. J. Oper. Res. 201 (2010), pp. 653–671, http://dx.doi.org/10.1016/j.ejor.2009.03.053] for a special type of polyhedral approximations of pOCP problems, which allows for generation of cuts in constant time not dependent on the accuracy of approximation, is applicable to a larger family of polyhedral approximations. We also show that it can further be extended to form an exact solution method for pOCP problems with O(1/ϵ) iteration complexity. Moreover, it is demonstrated that an analogous constant-time cut-generating algorithm exists for recursively constructed lifted polyhedral approximations of second-order cones due to Ben-Tal and Nemirovski [On polyhedral approximations of the second-order cone, Math. Oper. Res. 26 (2001), pp. 193–205. Available at http://dx.doi.org/10.1287/moor.26.2.193.10561]. It is also shown that the developed polyhedral approximations and the corresponding cutting-plane solution methods can be efficiently used for obtaining exact solutions of mixed-integer pOCP problems.
Equity is an important consideration in public services such as Emergency Medical Service (EMS) systems. In such systems, not only equitability but also performance depends on the spatial distribution of facilities and resources. This paper proposes the minimum p-envy facility location model, which aims to find optimal locations for facilities in order to balance customers' perceptions of equity in receiving service. The model is developed and evaluated through the lens of EMS systems, where ambulances are located at facilities (stations) with the objective of minimizing the sum of "envy" among all demand zones (customer points) with respect to an ordered set of p operating stations, weighted by the proportion of demand in each zone. The problem is formulated as an integer program, with priority weights assigned according to the probability that an ambulance is available, which is estimated using the hypercube model. Because of the computational effort required to obtain solutions using commercially available software, a tabu search is developed to solve the problem. A case study using real-world data is presented. The performance of the proposed model is tested and compared to other location models such as the p-center and maximal-covering-location problems (MCLP).
J. F. Benders devised a clever approach for exploiting the structure of mathematical programming problems with complicating variables (variables which, when temporarily fixed, render the remaining optimization problem considerably more tractable). For the class of problems specifically considered by Benders, fixing the values of the complicating variables reduces the given problem to an ordinary linear program, parameterized, of course, by the value of the complicating variables vector. The algorithm he proposed for finding the optimal value of this vector employs a cutting-plane approach for building up adequate representations of (i) the extremal value of the linear program as a function of the parameterizing vector and (ii) the set of values of the parameterizing vector for which the linear program is feasible. Linear programming duality theory was employed to derive the natural families of cuts characterizing these representations, and the parameterized linear program itself is used to generate what are usually deepest cuts for building up the representations. In this paper, Benders' approach is generalized to a broader class of programs in which the parametrized subproblem need no longer be a linear program. Nonlinear convex duality theory is employed to derive the natural families of cuts corresponding to those in Benders' case. The conditions under which such a generalization is possible and appropriate are examined in detail. An illustrative specialization is made to the variable factor programming problem introduced by R. Wilson, where it offers an especially attractive approach. Preliminary computational experience is given.
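In the linear case, the mechanics can be sketched as follows (standard notation, ours). For \(\min_x \{ c^\top x + \min_y \{ q^\top y : Wy \ge h - Tx,\ y \ge 0 \} \}\), solving the dual of the inner problem at a trial point \(\bar x\) yields either an extreme point \(\pi\) or an extreme ray \(\sigma\) of \(\{ \pi \ge 0 : W^\top \pi \le q \}\), producing the optimality and feasibility cuts
\[ \theta \ge \pi^\top (h - Tx), \qquad \sigma^\top (h - Tx) \le 0, \]
which are appended to the master problem \(\min \{ c^\top x + \theta \}\) until convergence; the generalization described here replaces LP duality with nonlinear convex duality in deriving these cuts.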
Dynamic traffic simulation models are frequently used to support decisions when planning an evacuation. This contribution reviews the different (mathematical) model formulations underlying the traffic simulation models used in evacuation studies and the behavioural assumptions that are made. The appropriateness of these behavioural assumptions is elaborated on in light of the current consensus on evacuation travel behaviour, based on the view from the social sciences as well as empirical studies on evacuation behaviour. The focus lies on how travellers' decisions are predicted through simulation regarding the choice to evacuate, departure time choice, destination choice, and route choice. For evacuation participation and departure time choice, we argue in favour of the simultaneous approach to dynamic evacuation demand prediction using the repeated binary logit model. For destination choice, we show how further research is needed to generalize the current preliminary findings on location-type-specific destination choice models. For evacuation route choice, we argue in favour of hybrid route choice models that enable both following instructed routes and en-route switches. Within each of these discussions, we point out current limitations and make corresponding suggestions on promising future research directions.
Keywords: Evacuation; Travel behaviour; Departure time choice; Destination choice; Route choice; Dynamic traffic simulation
A survey of current continuous nonlinear multi-objective optimization (MOO) concepts and methods is presented. It consolidates and relates seemingly different terminology and methods. The methods are divided into three major categories: methods with a priori articulation of preferences, methods with a posteriori articulation of preferences, and methods with no articulation of preferences. Genetic algorithms are surveyed as well. Commentary is provided on three fronts, concerning the advantages and pitfalls of individual methods, the different classes of methods, and the field of MOO as a whole. The characteristics of the most significant methods are summarized. Conclusions are drawn that reflect often-neglected ideas and applicability to engineering problems. It is found that no single approach is superior; rather, the selection of a specific method depends on the type of information that is provided in the problem, the user's preferences, the solution requirements, and the availability of software.
We demonstrate that a conic quadratic problem (CQP)

min { cᵀx : Ax ≥ b, ‖A_l x − b_l‖₂ ≤ c_lᵀx − d_l, l = 1, …, m }

is "polynomially reducible" to linear programming. We demonstrate this by constructing, for every ε ∈ (0, 1/2], an LP program (LP), explicitly given in terms of ε and the data of (CQP) and involving auxiliary variables u, with the following properties: (i) the number dim x + dim u of variables and the number dim p of constraints in (LP) do not exceed a fixed constant times the size of the data of (CQP) times ln(1/ε); (ii) every feasible solution x to (CQP) can be extended to a feasible solution (x, u) to (LP); (iii) if (x, u) is feasible for (LP), then x satisfies the "ε-relaxed" constraints of (CQP), namely

Ax ≥ b, ‖A_l x − b_l‖₂ ≤ (1 + ε)[c_lᵀx − d_l], l = 1, …, m.
Stochastic programs can effectively describe decision-making problems in an uncertain environment. Unfortunately, such programs are often computationally demanding to solve. In addition, their solutions can be misleading when there is ambiguity in the choice of a distribution for the random parameters. In this paper, we propose a model describing one's uncertainty in both the distribution's form (discrete, Gaussian, exponential, etc.) and moments (mean and covariance). We demonstrate that for a wide range of cost functions the associated distributionally robust stochastic program can be solved efficiently. Furthermore, by deriving new confidence regions for the mean and covariance of a random vector, we provide probabilistic arguments for using our model in problems that rely heavily on historical data. This is confirmed in a practical example of portfolio selection, where our framework leads to better-performing policies on the "true" distribution underlying the daily returns of assets.
Classical formulations of the portfolio optimization problem, such as mean-variance or value-at-risk (VaR) approaches, can result in a portfolio extremely sensitive to errors in the data, such as the mean and covariance matrix of the returns. In this paper we propose a way to alleviate this problem in a tractable manner. We assume that the distribution of returns is partially known, in the sense that only bounds on the mean and covariance matrix are available. We define the worst-case value-at-risk as the largest VaR attainable, given the partial information on the returns' distribution. We consider the problem of computing and optimizing the worst-case VaR, and we show that these problems can be cast as semidefinite programs. We extend our approach to various other forms of partial information on the distribution, including uncertainty in factor models, support constraints, and relative entropy information.
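The central closed form in this line of work (restated here in our notation, as we recall it) is that when only the mean \(\hat x\) and covariance \(\Gamma\) of the returns are known, the worst-case VaR at level \(\epsilon\) of the portfolio loss \(-w^\top r\) admits the expression
\[ \mathrm{WVaR}_\epsilon(w) = -\hat x^\top w + \kappa(\epsilon) \sqrt{w^\top \Gamma w}, \qquad \kappa(\epsilon) = \sqrt{\tfrac{1-\epsilon}{\epsilon}}, \]
so that minimizing it over portfolio weights is a second-order cone problem, while the more general information structures treated in the paper lead to semidefinite programs.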
Emergencies, especially those considered routine (i.e., occurring on a daily basis), pose great threats to health, life, and property. Immediate response and treatment can greatly mitigate these threats. This research optimizes the locations of ambulance stations, the deployment of ambulances, and the dispatch of vehicles under demand and traffic uncertainty, the main factors that influence emergency response time. The problem is formulated as a dynamic scenario-based two-stage stochastic programming model, aiming to minimize the total cost while responding to as much demand as possible. Sample average approximation is used to approximate the original problem with a limited number of scenarios, and a two-phase Benders decomposition scheme is proposed to accelerate computation, especially for large instances. Numerical experiments using real-world emergency data validate the performance of the solution method, and the results demonstrate the effectiveness and efficiency of the proposed algorithm. We additionally conduct a sensitivity analysis to evaluate the influence of crucial parameters, including the response time standard, facility capacity, service capacity, and facility heterogeneity. The managerial insights derived from the sensitivity analysis provide valuable guidance for the design of emergency response systems in practice.
Every patient should have equal access to fast and qualified assistance by emergency medical services (EMS). Most location planning models for EMS only consider efficiency criteria such as maximum coverage. Especially in metropolitan regions, the heterogeneous distribution of demand leads to unequal service availability: city centres receive higher coverage than suburban areas. To address this issue, we extend existing location planning models by two fairness criteria: the Rawlsian criterion, which maximizes the coverage of the least-covered demand area, and the Gini coefficient, which minimizes differences in coverage between demand areas. We consider both fairness criteria using the ϵ-constraint method and apply both criteria to the case of a German EMS provider. By generating the Pareto front, we analyze the trade-offs between efficiency and fairness. We find that managing the conflict between fairness and efficiency especially in a metropolitan region offers more beneficial solutions with respect to both criteria than generally assumed. We show that it is possible to greatly improve fairness by giving up only a small amount of coverage.
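For concreteness (standard definitions in our notation), with \(c_i\) the coverage of demand area i and n areas, the two criteria can be written
\[ \text{maximize } \min_i c_i \qquad \text{and} \qquad \text{minimize } G = \frac{\sum_{i=1}^n \sum_{j=1}^n |c_i - c_j|}{2 n \sum_{j=1}^n c_j}, \]
where G is the Gini coefficient, equal to zero when all areas are covered equally.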
In the era of modern business analytics, data-driven optimization has emerged as a popular modeling paradigm to transform data into decisions. By constructing an ambiguity set of the potential data-generating distributions and subsequently hedging against all member distributions within this ambiguity set, data-driven optimization effectively combats the ambiguity with which real-life data sets are plagued. Chen et al. (2022) study data-driven, chance-constrained programs in which a decision has to be feasible with high probability under every distribution within a Wasserstein ball centered at the empirical distribution. The authors show that the problem admits an exact deterministic reformulation as a mixed-integer conic program and demonstrate (in numerical experiments) that the reformulation compares favorably to several state-of-the-art data-driven optimization schemes.
Medical facilities are essential public service facilities. Their spatial accessibility is an important indicator of the convenience of access to medical services and a significant factor affecting urban development and residents' living standards. With ongoing urbanization, urban agglomerations are forming and developing rapidly, and high-tier healthcare facilities in an urban agglomeration no longer serve a single city but multiple connected cities. How to characterize healthcare accessibility and the supply-demand relationship between medical resources and residents in the context of an urban agglomeration merits study but has not been fully addressed in existing research. To fill this gap, based on the conventional enhanced two-step floating catchment area method (E2SFCA) and considering regional development discrepancies and residents' medical preferences in the urban agglomeration, this paper proposes a new hierarchical two-step floating catchment area method (H2SFCA) to analyze accessibility to medical facilities. The applicability of the newly proposed method is demonstrated on the case of the Changsha-Zhuzhou-Xiangtan urban agglomeration. The research findings suggest that: 1) a new method to evaluate accessibility to high-level medical facilities in urban agglomerations is needed; 2) because intense competition among potential demands is ignored, healthcare accessibility in the core city is overestimated; and 3) residents' medical preferences have varying impacts on healthcare accessibility in different regions.
Planning public services needs to promote equal access across geographic areas and between demographic groups. However, most location-allocation models emphasize efficiency, such as minimal travel burden or maximal demand coverage, while omitting the equality issue. This case study optimizes the emergency medical service (EMS) in Shanghai from a trade-off perspective by comparing two models. One is the 2-step optimization (2SO) model, which uses the maximum covering location problem (MCLP) to site new facilities and then a quadratic programming (QP) method to optimize capacities; the other performs location selection and capacity optimization simultaneously through greedy optimization (GO). There are several findings from various simulation scenarios. First, the GO model is more effective in optimizing equality, but the 2SO model offers a more balanced approach by covering more people within the mandatory response time while improving access equality. Second, the solutions of both models change as demands and travel costs vary over time and call for dynamic adjustment of resource allocation. Third, it is important to coordinate EMS with other agencies to ensure reasonable road connectivity and to make contingency plans for events such as floods, earthquakes, and other natural disasters.
In the preparedness phase of humanitarian logistics, uncertainties on both the supply and demand sides may dramatically increase morbidity and mortality. We consider a distributionally robust facility location model with chance constraints in which the nodes and edges of the network are vulnerable to random failure. Efficiency, effectiveness, and equity metrics, which can be explicitly expressed as operational costs, service quality, and the coverage rate, are incorporated to quantitatively measure system performance under disaster situations. As the chance constraints are intractable, we propose corresponding conic and linear approximations. The reformulated model is solved within the outer approximation framework, where three acceleration techniques, i.e., the branch-and-cut algorithm, the in-out algorithm, and Benders decomposition, are embedded to increase computational efficiency. Through extensive numerical results and a case study, our proposed model is found to be superior to traditional scenario-based approaches.
This paper considers a multiperiod emergency medical services (EMS) location problem and introduces two two-stage stochastic programming formulations that account for uncertainty about emergency demand. Whereas the first model considers both a constraint on the probability of covering the realized emergency demand and minimizing the expected cost of doing so, the second one employs probabilistic envelope constraints that allow us to control the degradation of coverage under the more severe scenarios. These models give rise to large mixed-integer programs, which can be tackled directly or by using a conservative approximation scheme. For the former, we implement the branch-and-Benders-cut method, which significantly improves the solution time when compared with both a recently proposed state-of-the-art branch-and-bound algorithm and the CPLEX solver. Finally, a practical study is conducted using historical data from the Northern Ireland Ambulance Service and sheds some light on the optimal EMS location configuration for this region and on the necessary trade-offs between emergency demand coverage and expected cost. These insights are confirmed through an out-of-sample performance analysis.
The pre-positioning problem is an important part of emergency supply. Pre-positioning decisions must be made before disasters occur under high uncertainty and only limited distribution information. This study proposes a distributionally robust optimization model for the multi-period dynamic pre-positioning of emergency supplies with a static pre-disaster phase and a dynamic post-disaster phase. In the post-disaster phase, the uncertain demands are time varying and have partial distribution information that belongs to a given family of distributions. The family of distributions is described by a given perturbation set based on historical information. Therefore, the proposed model forms a semi-infinite programming problem with ambiguous chance constraints, which typically would be computationally intractable. We refine the bounded perturbation sets (box, box-ball and box-polyhedral) and develop computationally tractable safe approximations of the chance constraints. Finally, a realistic application to the Circum-Bohai Sea Region of China is presented to illustrate the effectiveness of the robust optimization model.
This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality; in particular, we provide examples involving path-dependent expectations of stochastic processes. Our approach consists of computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms of a flexible class of distances from a suitable baseline model. These distances, based on optimal transportation between probability measures, include Wasserstein’s distances as particular cases. The proposed methodology is well suited for risk analysis and distributionally robust optimization, as we demonstrate with applications. We also discuss how to estimate the tolerance region nonparametrically using Skorokhod-type embeddings in some of these applications.
An effective Emergency Medical Service (EMS) system can provide medical relief supplies for common emergencies (fire, accident, etc.) or large-scale disasters (earthquake, tsunami, bioterrorism attack, explosion, etc.) and decrease morbidity and mortality dramatically. This paper proposes a distributionally robust model for optimizing the location, number of ambulances and demand assignment in an EMS system by minimizing the expected total cost. The model guarantees that the probability of satisfying the maximum concurrent demand in the whole system is larger than a predetermined reliability level by introducing joint chance constraints and characterizes the expected total cost by moment uncertainty based on a data-driven approach. The model is approximated as a parametric second-order conic representable program. Furthermore, a special case of the model is considered and converted into a standard second-order cone program, which can be efficiently solved with a proposed outer approximation algorithm. Extensive numerical experiments are conducted to illustrate the benefit of the proposed approach. Moreover, a dataset from a real application is also used to demonstrate the application of the data-driven approach.
Over the past 10 years, a considerable amount of research has been devoted to the development of models to support decision making in the particular yet important context of Emergency Medical Services (EMS). More specifically, the need for advanced strategies to take into account the uncertainty and dynamism inherent to EMS, as well as the pertinence of socially oriented objectives, such as equity, and patient medical outcomes, have brought new and exciting challenges to the field. In this context, this paper summarizes and discusses modern modelling approaches to address problems related to ambulance fleet management, particularly those related to vehicle location and relocation, as well as dispatching decisions. Although it reviews early works on static ambulance location problems, this review concentrates on recent approaches to address tactical and operational decisions, and the interaction between these two types of decisions. Finally, it concludes on the current state of the art and identifies promising research avenues in the field.
We consider a stochastic pre-disaster relief network design problem, which mainly determines the capacities and locations of the response facilities and their inventory levels of the relief supplies in the presence of uncertainty in post-disaster demands and transportation network conditions. In contrast to the traditional humanitarian logistics literature, we develop a chance-constrained two-stage mean-risk stochastic programming model. This risk-averse model features a mean-risk objective, where the conditional value-at-risk (CVaR) is specified as the risk measure, and enforces a joint probabilistic constraint on the feasibility of the second-stage problem concerned with distributing the relief supplies to the affected areas in case of a disaster. To solve this computationally challenging stochastic optimization model, we employ an exact Benders decomposition-based branch-and-cut algorithm. We develop three variants of the proposed algorithm by using alternative representations of CVaR. We illustrate the application of our model and solution methods on a case study concerning the threat of hurricanes in the Southeastern part of the United States. An extensive computational study provides practical insights about the proposed modeling approach and demonstrates the computational effectiveness of the solution framework.
The problem is to find "generating capacities" and arc capacities such that a random demand appearing regularly at the nodes is feasible in a large percentage of cases and, subject to this and further deterministic conditions, the total cost is minimized. The cost function has two parts: the cost of the capacities and the long-term average "outage" cost. As an illustration of the model, the problem of planning in interconnected power systems is used. The model is a special case of one of the models introduced in [9]; it is a combination of chance-constrained programming and two-stage programming under uncertainty.
We introduce a risk-averse stochastic modeling approach for a pre-disaster relief network design problem under uncertain demand and transportation capacities. We determine the sizes and locations of the response facilities and the inventory levels of relief supplies at each facility while guaranteeing a certain level of network reliability. We introduce a probabilistic constraint on the existence of a feasible flow to ensure that the demand for relief supplies across the network is satisfied with a specified high probability. Responsiveness is also accounted for by defining multiple regions in the network, and introducing local probabilistic constraints on satisfying demand within each region. These local constraints ensure that each region is self-sufficient in terms of providing for its own needs with a large probability. In particular, we use the Gale-Hoffman inequalities to represent the conditions on the existence of a feasible network flow. The solution method rests on two pillars. We first use a preprocessing algorithm to eliminate redundant Gale-Hoffman inequalities, and then, formulate the proposed models as computationally efficient mixed-integer linear programs by utilizing a method based on combinatorial patterns. Computational results for a case study and randomly generated problem instances demonstrate the effectiveness of the models and the solution method.
In location problems for the public sector, such as emergency medical service (EMS) systems, equity is an important factor in facility design, and several measures have been proposed to minimize the inequity of a system. This paper considers an extension of the minimum p-envy location model in which the objective is evaluated with a survival function instead of a distance function, since survival probability is directly related to patient outcomes, together with a constraint on the minimum survival rate. The model was tested on a real-world data set from the EMS system of Hanover County, VA, and compared to other location models. The results indicate that the enhanced p-envy model not only reduces inequity but also saves more lives through the survival-based objective. A sensitivity analysis on different quality-of-service measures (survival probability and traveled distance) and different choices of priority assigned to the serving facility is discussed.
We develop tractable semidefinite programming based approximations for distributionally robust individual and joint chance constraints, assuming that only the first- and second-order moments as well as the support of the uncertain parameters are given. It is known that robust chance constraints can be conservatively approximated by Worst-Case Conditional Value-at-Risk (CVaR) constraints. We first prove that this approximation is exact for robust individual chance constraints with concave or (not necessarily concave) quadratic constraint functions, and we demonstrate that the Worst-Case CVaR can be computed efficiently for these classes of constraint functions. Next, we study the Worst-Case CVaR approximation for joint chance constraints. This approximation affords intuitive dual interpretations and is provably tighter than two popular benchmark approximations. The tightness depends on a set of scaling parameters, which can be tuned via a sequential convex optimization algorithm. We show that the approximation becomes essentially exact when the scaling parameters are chosen optimally and that the Worst-Case CVaR can be evaluated efficiently if the scaling parameters are kept constant. We evaluate our joint chance constraint approximation in the context of a dynamic water reservoir control problem and numerically demonstrate its superiority over the two benchmark approximations.
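As background, this approximation rests on the Rockafellar-Uryasev representation of CVaR: for a loss L(x, ξ),
\[ \mathrm{CVaR}_\epsilon(L) = \inf_{\beta \in \mathbb{R}} \Big\{ \beta + \tfrac{1}{\epsilon}\, \mathbb{E}\big[(L - \beta)^+\big] \Big\}, \]
and requiring the worst case of this quantity over the ambiguity set to be nonpositive conservatively enforces the chance constraint \(\mathbb{P}(L \le 0) \ge 1 - \epsilon\).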
We consider a discrete facility location problem with a new form of equity criterion. The model discussed in the paper analyzes the case where demand points only have a strict preference order on the sites where the plants can be located. The goal is to find the location of the facilities minimizing the total envy felt by the entire set of demand points. We define this new total envy criterion and provide several integer linear programming formulations that reflect and model this approach. The formulations are illustrated by examples. Extensive computational tests are reported, showing the potential and limits of each formulation on several types of instances. Finally, some improvements for all the formulations previously presented are developed, obtaining in some cases much better resolution times.
This paper views the location of emergency facilities as a set covering problem with equal costs in the objective. The sets are composed of the potential facility points within a specified time or distance of each demand point. One constraint is written for each demand point requiring “cover,” and linear programming is applied to solve the covering problem, a single-cut constraint being added as necessary to resolve fractional solutions.
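A minimal sketch of this covering model, stated directly as an integer program rather than through the LP-plus-cuts procedure of the paper; the data and the use of the open-source PuLP library are our illustrative choices, not the paper's:

import pulp

# Hypothetical data: covers[i] = sites within the time/distance standard of demand point i.
demand_points = [0, 1, 2, 3]
sites = [0, 1, 2]
covers = {0: {0}, 1: {0, 1}, 2: {1, 2}, 3: {2}}

prob = pulp.LpProblem("emergency_set_covering", pulp.LpMinimize)
x = {j: pulp.LpVariable(f"x_{j}", cat="Binary") for j in sites}  # 1 if a facility opens at site j
prob += pulp.lpSum(x[j] for j in sites)                 # equal costs: minimize facilities opened
for i in demand_points:
    prob += pulp.lpSum(x[j] for j in covers[i]) >= 1    # every demand point must be covered
prob.solve()
print({j: int(x[j].value()) for j in sites})            # here sites 0 and 2 suffice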
The maximal covering location problem is based on locating p facilities in such a manner that coverage is maximized within set distance or time standards. This problem has been extended in a number of different ways where the main theme still involves locating a fixed number of facilities. In many applications site costs are not equal; this should cast doubt on the use of a constraint that fixes the number of facilities at a given number as compared to an approach that minimizes site costs and maximizes coverage. This paper addresses the use of site costs in a maximal covering location problem and presents several approaches to solutions, along with computational results. Theoretical linkages to other location models are also presented.
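In its basic form (standard statement, prior to the site-cost extension discussed here), the MCLP reads
\[ \max \sum_i a_i y_i \quad \text{s.t.} \quad y_i \le \sum_{j \in N_i} x_j \;\; \forall i, \qquad \sum_j x_j = p, \qquad x_j, y_i \in \{0, 1\}, \]
where \(a_i\) is the demand at point i, \(N_i\) the set of sites covering i, \(x_j = 1\) if a facility opens at site j, and \(y_i = 1\) if demand i is covered; the extension replaces the cardinality constraint with site costs traded off against coverage in the objective.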
Scheduling heating oil production is an important management problem. It is also a complex one. Weather and demand uncertainties, allocation of production between different refineries, joint- and by-product relations, storage limitations, maintenance of minimal supplies and many other factors need to be considered.
This paper is concerned with one of an integrated series of operations research studies directed toward improvement in such scheduling methods. Emphasis is on essentials of the mathematical model. Institutional features and other phases of the OR studies are brought in only as required.
The author of the work “Mathematical Methods of Organizing and Planning Production”, Professor L. V. Kantorovich, is an eminent authority in the field of mathematics. This work is interesting from a purely mathematical point of view since it presents an original method, going beyond the limits of classical mathematical analysis, for solving extremal problems. On the other hand, this work also provides an application of mathematical methods to questions of organizing production which merits the serious attention of workers in different branches of industry.
This is the English translation of the famous 1939 article by L. V. Kantorovich, originally published in Russian.
Government efforts designed to help improve healthcare access rely on accurate measures of accessibility so that resources can be allocated to truly needy areas. In order to capture the interaction between physicians and populations, various access measures have been utilized, including the popular two-step floating catchment area (2SFCA) method. However, despite the many advantages of 2SFCA, the problems associated with using fixed catchment sizes have not been satisfactorily addressed. We propose a new method to dynamically determine physician and population catchment sizes by incrementally increasing the catchment until a base population and a physician-to-population ratio are met. Preliminary application to the ten-county region in northern Illinois has demonstrated that the new method is effective in determining the appropriate catchment sizes across the urban to suburban/rural continuum and has revealed greater detail in spatial variation of accessibility compared to results using fixed catchment sizes.
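The two steps can be summarized as follows (standard 2SFCA equations, our notation). Step 1 computes a supply-to-demand ratio \(R_j\) for each physician location j; step 2 sums these ratios over all locations reachable from population site i:
\[ R_j = \frac{S_j}{\sum_{k:\, d_{kj} \le d_0} P_k}, \qquad A_i = \sum_{j:\, d_{ij} \le d_0} R_j, \]
where \(S_j\) is physician capacity, \(P_k\) population, \(d_{ij}\) travel cost, and \(d_0\) the catchment size; the proposed method grows \(d_0\) incrementally per location until a base population and a physician-to-population ratio are met, instead of fixing it.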
This paper considers a double coverage ambulance location problem. A model is proposed and a tabu search heuristic is developed for its solution. Computational results using both randomly generated data and real data confirm the efficiency of the proposed approach.
This paper is concerned with the formulation and solution of a probabilistic model for determining the optimal location of facilities in congested emergency systems. The inherent uncertainty that characterizes the decision process is handled by a new stochastic programming paradigm which embeds probabilistic constraints within the traditional two-stage framework. The resulting model drops simplifying assumptions on server independence while also handling the spatial dependence of demand calls. An exact solution method and different tailored heuristics are presented to efficiently solve the problem. Computational experience is reported with application to various networks.
This paper addresses the problem of designing robust emergency medical services. In this respect, the main issue to consider is the inherent uncertainty that characterizes real-life situations. Several approaches can be used to design robust mathematical models that are able to hedge against uncertain conditions. We use the stochastic programming framework and, in particular, the probabilistic paradigm. More specifically, we develop a stochastic programming model with probabilistic constraints aimed at solving both the location and the dimensioning problems, i.e., where service sites must be located and how many emergency vehicles must be assigned to each site, in order to achieve a reliable level of service and minimize overall costs. In doing so, we consider the randomness of the system with respect to the demand for emergency service. The numerical results, collected on a large set of test problems, demonstrate the validity of the proposed model, particularly in dealing with the trade-off between quality of service and cost management.
Chen (Jan 2010). From CVaR to uncertainty set: Implications in joint chance-constrained optimization. p. 470.
Dantzig (Jan 1955). Linear programming under uncertainty. p. 197.
The distribution free newsboy problem: review and extensions.